by Open Paws
## Who's it for

This subworkflow is designed for developers, AI engineers, and automation builders who generate dynamic HTML content in their workflows (e.g. reports, dashboards, emails) and want a simple way to host and share it via a clean URL, without spinning up infrastructure or uploading to a CMS. It's especially useful when combined with AI agents that generate HTML content as part of a larger automated pipeline.

## What it does

This subworkflow:
- Accepts raw HTML content as input.
- Creates a new GitHub Gist with that content.
- Returns the shareable Gist URL, which can then be sent via Slack, Telegram, email, etc.

The result is a lightweight, fast, and free way to publish AI-generated HTML (such as reports, articles, or formatted data outputs) to the web.

## How to set it up

1. Add this subworkflow to any parent workflow where HTML is generated.
2. Pass in a string of valid HTML via the `html` input parameter.
3. Configure the GitHub credentials in the HTTP node using an access token with `gist` scope (see the request sketch at the end of this entry).

## Requirements

- GitHub account and personal access token with `gist` permissions

## How to customize

- Change the filename (`report.html`) if your use case needs a different format or extension.
- Add metadata to the Gist (e.g., description, tags).
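For reference, gist creation is a single authenticated call to the GitHub REST API. A minimal sketch of the equivalent request, assuming a public gist and the default `report.html` filename (`$GITHUB_TOKEN` is a placeholder for your personal access token):

```bash
# Create a gist containing the generated HTML and print its shareable URL.
# Assumes jq is installed and $GITHUB_TOKEN holds a token with the gist scope.
curl -s -X POST https://api.github.com/gists \
  -H "Authorization: Bearer $GITHUB_TOKEN" \
  -H "Accept: application/vnd.github+json" \
  -d '{
    "description": "AI-generated report",
    "public": true,
    "files": { "report.html": { "content": "<html><body><h1>Hello</h1></body></html>" } }
  }' | jq -r '.html_url'
```

The `html_url` field in the response is the shareable link the subworkflow returns.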
by kote2
## Overview

This workflow takes an image plus instruction text from an n8n Form, edits only the hairstyle while keeping the face unchanged, uploads the result to Cloudinary, and pushes the image to a designated LINE group. It's designed for salons or creative teams that need quick, consistent "hair-only" edits and one-click sharing to a staff group.

## Key Features

- Accepts an image and free-form instructions via n8n Forms
- Enforces identity preservation: the face is not altered; only the hairstyle is modified
- Uses OpenAI Image Edit to generate the edited image
- Uploads the output to Cloudinary and returns a public URL
- Sends the final image to a LINE group via the Push API (no `replyToken` required)
- Optional safety: store the `groupId` once (via webhook) in the n8n Data Store and reuse it, avoiding mis-sends to individuals

## Requirements

- OpenAI API key (Image Edit capable model)
- Cloudinary account with an unsigned `upload_preset`
- LINE Official Account (Messaging API enabled), with the bot added to the target group (to capture the `groupId` once)
- n8n with Forms enabled and the Data Store available

## Notes

Replace placeholders like `CLOUDINARY_CLOUD_NAME`, `CLOUDINARY_UPLOAD_PRESET`, `LINE_CHANNEL_TOKEN`, and `LINE_GROUP_ID` with your values.
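To illustrate the last two steps, here is a minimal sketch of the Cloudinary unsigned upload and the LINE push as plain HTTP calls, assuming the placeholder names from the Notes above and an edited image saved as `result.png` (a hypothetical filename):

```bash
# 1) Unsigned upload to Cloudinary; the response JSON contains secure_url.
IMAGE_URL=$(curl -s -X POST \
  "https://api.cloudinary.com/v1_1/CLOUDINARY_CLOUD_NAME/image/upload" \
  -F "file=@result.png" \
  -F "upload_preset=CLOUDINARY_UPLOAD_PRESET" | jq -r '.secure_url')

# 2) Push the image to the LINE group. Push messages need no replyToken,
#    and both image URLs must be HTTPS (secure_url satisfies this).
curl -s -X POST https://api.line.me/v2/bot/message/push \
  -H "Authorization: Bearer LINE_CHANNEL_TOKEN" \
  -H "Content-Type: application/json" \
  -d "{
    \"to\": \"LINE_GROUP_ID\",
    \"messages\": [{
      \"type\": \"image\",
      \"originalContentUrl\": \"$IMAGE_URL\",
      \"previewImageUrl\": \"$IMAGE_URL\"
    }]
  }"
```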
by CentralStationCRM
## Workflow Overview

This workflow benefits anyone who wants to automate sending postcards to tagged CentralStationCRM contacts.

## Tools in this Workflow

- **CentralStationCRM**: the simple and intuitive CRM software for small teams
- **EchtPost**: send postcards online

## Workflow Description

This workflow consists of:
- a Webhook trigger
- a Set node
- an If node
- an HTTP Request node

### The Webhook Trigger

The webhook triggers when a person is updated in CentralStationCRM. The person's data is sent to n8n, including the company name and the address.

### The Set Node

This node filters the webhook data down to just the fields the EchtPost request needs. It also transforms the country field into the associated country code (e.g. "Deutschland" -> "DE"); a sketch of this mapping appears at the end of this description.

### The If Node

This node checks whether the person's data contains an "EchtPost" tag:
- if false: do nothing
- if true: continue to the HTTP Request

### The HTTP Request

This node sends the person's name, address, and country code, along with the ID of the EchtPost template (inserted manually in the node), to EchtPost. It also needs an email address for a confirmation mail. The data starts a "mail in 3 days" process with the EchtPost service, which mails a postcard with that template to the person.

## Preconditions

For this workflow, you need:
- a CentralStationCRM account with API access
- an EchtPost account with API access
- an EchtPost template with a message, and the template's ID
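The Set node's country conversion is essentially a lookup table. A minimal sketch of the idea as a shell function (an illustration, not the node's actual implementation; it assumes German-language country names as delivered by CentralStationCRM, and the list would be extended as needed):

```bash
# Map a German country name to its ISO 3166-1 alpha-2 code,
# mirroring what the Set node does before handing data to EchtPost.
country_to_code() {
  case "$1" in
    "Deutschland") echo "DE" ;;
    "Österreich")  echo "AT" ;;
    "Schweiz")     echo "CH" ;;
    *)             echo "unknown country: $1" >&2; return 1 ;;
  esac
}

country_to_code "Deutschland"   # prints: DE
```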
by Sulieman Said
## YouTube Trends Workflow in n8n

### Goal of the Workflow

This workflow fetches the most popular YouTube videos in Germany and transforms them into a clean list of trending hashtags/keywords. It's not just raw data: it highlights qualitative insights about which topics and keywords drive real engagement.

Useful for:
- **Content creators & marketers**: quick inspiration for viral topics
- **SEO/hashtag research**: discover high-performing keywords
- **Social media teams**: automated trend reporting

### How It Works, Step by Step

1. **Trigger** (Manual Trigger node): entry point of the workflow. Can later be replaced with a Cron or Webhook trigger to run automatically.
2. **API Call** (HTTP Request node): calls the YouTube Data API v3 with `chart=mostPopular`, `regionCode=DE`, and `maxResults=50`. Returns a list of trending videos in Germany, including title, channel, views, likes, comments, and tags. (A sketch of this request appears at the end of this description.)
3. **Split Items** (Split Out node): splits the API response (`items[]`) into individual items, so each video is processed separately.
4. **Context & KPIs** (Set node): extracts key fields (`title`, `channel`, `publishedAt`, `views`, `likes`, `comments`) and calculates `engagementRate = (likes + comments) / (views + 1)`. Why? Views alone are misleading; the engagement rate reveals which videos truly resonate with the audience.
5. **Ranking & Limit** (Item Lists node): sorts videos by engagement rate (descending) and limits the results to the top 20 videos. Why? This reduces noise and focuses only on the most relevant trends.
6. **Collect Tags** (Aggregate node): collects all `snippet.tags` from the top videos and merges them into a single keyword list.
7. **Keyword Summary** (Summarize node): concatenates all collected tags into one string. The result is a compact hashtag/keyword list, ready to use for content ideas, SEO/hashtag strategies, and social media planning.

### Why This Workflow Is Useful

- **Automated trend analysis**: no need to manually browse YouTube.
- **Quality focus**: uses engagement rate, not just views.
- **Reusable**: results can be stored in Google Sheets or Airtable, sent to Slack/Telegram, or used in a dashboard.

### Possible Extensions

- **Cron Trigger**: run daily for automated reports.
- **Notifier**: send results directly to Slack, Discord, or Telegram.
- **Storage**: save trends in a database for historical analysis.
- **AI Node**: use GPT to summarize trends into "3 key insights of the day."

### Conclusion

With a few smartly combined nodes, this workflow fetches external API data (YouTube), cleans and enriches it, calculates qualitative KPIs, and delivers actionable trend insights. Perfect as a community showcase or a practical tool for creators and marketers.
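For orientation, steps 2, 4, and 5 look roughly like this outside of n8n. A minimal sketch (`$YT_API_KEY` is a placeholder for your YouTube Data API key, and `jq` stands in for the Set node's expressions):

```bash
# Fetch the 50 most popular videos in Germany and compute an
# engagement rate per video; print the top 20 by engagement.
curl -s "https://www.googleapis.com/youtube/v3/videos?part=snippet,statistics&chart=mostPopular&regionCode=DE&maxResults=50&key=$YT_API_KEY" \
| jq -r '.items[]
    | { title: .snippet.title,
        views:    (.statistics.viewCount    // "0" | tonumber),
        likes:    (.statistics.likeCount    // "0" | tonumber),
        comments: (.statistics.commentCount // "0" | tonumber) }
    | .engagementRate = ((.likes + .comments) / (.views + 1))
    | "\(.engagementRate)\t\(.title)"' \
| sort -rn | head -20
```

Note the `// "0"` fallbacks: the API returns statistics as strings, and channels can hide like counts, so missing fields default to zero before the division.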
by Vigh Sandor
# Automated Rsync Backup with Password Auth & Alert System

## Overview

This n8n workflow provides automated rsync backups between servers using password authentication. It automatically installs the required dependencies, performs the backup from a source server to a target server, and sends status notifications via Telegram and SMS.

## Features

- Password-based SSH authentication (no key management required)
- Automatic dependency installation (sshpass, rsync)
- Cross-platform support (Ubuntu/Debian, RHEL/CentOS, Alpine)
- Source-to-target backup execution
- Multi-channel notifications (Telegram and SMS)
- Detailed success/failure reporting
- Manual trigger for on-demand backups

## Setup Instructions

### Prerequisites

- **n8n instance**: running n8n in a Linux environment
- **Server access**: SSH access to both source and target servers
- **Telegram bot**: created via @BotFather (optional)
- **Textbelt API key**: for SMS notifications (optional)
- **Network**: connectivity between n8n, source, and target servers

### Server Requirements

Source server:
- SSH access enabled
- User with sudo privileges (for package installation)
- Read access to the source folder

Target server:
- SSH access enabled
- Write access to the target folder
- Sufficient storage space

## Configuration Steps

### 1. Server Parameters Configuration

Open the Server Parameters node and configure:

Source server settings:
- `source_host`: IP address or hostname of the source server
- `source_port`: SSH port (typically 22)
- `source_user`: username for the source server
- `source_password`: password for the source user
- `source_folder`: full path to the folder to back up (e.g., `/home/user/data`)

Target server settings:
- `target_host`: IP address or hostname of the target server
- `target_port`: SSH port (typically 22)
- `target_user`: username for the target server
- `target_password`: password for the target user
- `target_folder`: full path to the destination folder (e.g., `/backup/data`)

Rsync options:
- `rsync_options`: default is `-avz --delete`
  - `-a`: archive mode (preserves permissions, timestamps, etc.)
  - `-v`: verbose output
  - `-z`: compression during transfer
  - `--delete`: remove files from the target that don't exist in the source

### 2. Notification Setup (Optional)

Telegram configuration:
1. Create a bot via @BotFather on Telegram.
2. Get the bot token (format: `1234567890:ABCdefGHIjklMNOpqrsTUVwxyz`).
3. Create a notification channel and add the bot as an administrator.
4. Get the channel ID:
   - Send a test message to the channel.
   - Visit `https://api.telegram.org/bot<YOUR_BOT_TOKEN>/getUpdates`.
   - Find `"chat":{"id":-100XXXXXXXXXX}`.

SMS configuration:
1. Register at https://textbelt.com and purchase credits.
2. Obtain an API key.

Update the notification node (edit the Process Finish Report --- Telegram & SMS node):
- Replace `YOUR-TELEGRAM-BOT-TOKEN` with the bot token
- Replace `YOUR-TELEGRAM-CHANNEL-ID` with the channel ID
- Replace `+36301234567` with the target phone number(s)
- Replace `YOUR-TEXTBELT-API-KEY` with the Textbelt key

You can verify both channels with the manual test calls sketched below.
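A minimal sketch of manual test calls for both notification channels, using the placeholder values above:

```bash
# Telegram: send a test message to the channel via the Bot API.
curl -s "https://api.telegram.org/botYOUR-TELEGRAM-BOT-TOKEN/sendMessage" \
  -d "chat_id=YOUR-TELEGRAM-CHANNEL-ID" \
  -d "text=Backup notification test"

# Textbelt: send a test SMS (consumes one credit).
curl -s https://textbelt.com/text \
  --data-urlencode "phone=+36301234567" \
  --data-urlencode "message=Backup notification test" \
  -d "key=YOUR-TEXTBELT-API-KEY"
```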
### 3. Security Considerations

Password storage:
- Consider using n8n credentials for sensitive passwords
- Avoid hardcoding passwords in the workflow
- Use environment variables where possible

SSH security:
- The workflow uses `StrictHostKeyChecking=no` for automation
- Consider adding known hosts manually in production
- Review firewall rules between servers

### Testing

1. Start with a small test folder.
2. Verify network connectivity: `ping source_host` and `ping target_host`.
3. Test SSH access manually first.
4. Run the workflow with test data.
5. Verify backup completion on the target server.

## How to Use

### Automatic Operation

Once activated with a schedule, the workflow runs automatically. Frequency: every day at midnight.

### Manual Execution

1. Open the workflow in n8n.
2. Click on the Manual Trigger node.
3. Click "Execute Workflow".
4. Monitor the execution progress.

### Scheduled Execution

To automate backups:
1. Replace the Manual Trigger with a Schedule Trigger node.
2. Configure the schedule (e.g., daily at 2 AM).
3. Save and activate the workflow.

## Workflow Process

### Step 1: Dependency Check

The workflow automatically:
- Checks whether sshpass is installed locally
- Installs it if missing (supports apt, yum, dnf, apk)
- Checks sshpass on the source server
- Installs it on the source if needed (with sudo)

### Step 2: Backup Execution

- Connects to the source server via SSH
- Executes the rsync command from source to target
- Uses password authentication for both connections
- Transfers data directly between the servers (not through n8n)

### Step 3: Status Reporting

Success message format:

    [Timestamp] -- SUCCESS :: source_host:/path -> target_host:/path :: [rsync output]

Failure message format:

    [Timestamp] -- ERROR :: source_host -> target_host :: [exit code] -- [error message]

## Rsync Options Guide

Common options:
- `-a`: archive mode (recommended)
- `-v`: verbose output for monitoring
- `-z`: compression (useful for slow networks)
- `--delete`: mirror the source (removes extra files from the target)
- `--exclude`: skip specific files/folders
- `--dry-run`: test without an actual transfer
- `--progress`: show transfer progress
- `--bwlimit`: limit bandwidth usage

Example configurations:
- Basic backup: `-avz`
- Mirror with deletion: `-avz --delete`
- Exclude temporary files: `-avz --exclude='*.tmp' --exclude='*.cache'`
- Bandwidth limited (1 MB/s): `-avz --bwlimit=1000`
- Dry-run test: `-avzn --delete`

## Monitoring

### Execution Logs

- Check the n8n Executions tab
- Review stdout for rsync details
- Check stderr for error messages

### Verification

After a backup:
1. SSH to the target server.
2. Check the folder size: `du -sh /target/folder`
3. Verify the file count: `find /target/folder -type f | wc -l`
4. Compare with the source: `ls -la /target/folder`

## Troubleshooting

### Connection Issues

"Connection refused" error:
- Verify the SSH port is correct
- Check firewall rules
- Ensure the SSH service is running

"Permission denied" error:
- Verify the username/password
- Check that the user has the required permissions
- Ensure sudo works (for installation)

### Installation Failures

"Unsupported package manager":
- The workflow supports apt, yum, dnf, and apk
- Manual installation may be required for others

"sudo: password required":
- The user needs passwordless sudo, or
- Modify the installation commands

### Rsync Errors

"rsync error: some files/attrs were not transferred":
- Usually a permission issue
- Check file ownership
- Review excluded files

"No space left on device":
- Check target server storage
- Clean up old backups
- Consider compression options

### Notification Issues

No Telegram message:
- Verify the bot token and channel ID
- Check that the bot is an admin in the channel
- Test manually with the curl calls above

SMS not received:
- Check the Textbelt credit balance
- Verify the phone number format
- Review the API key's validity
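When troubleshooting, it helps to run the equivalent transfer by hand. A minimal sketch of the kind of command the backup step assembles (an approximation, assuming the parameter names from the Server Parameters node; quoting is simplified for readability):

```bash
# Log in to the source server, then rsync the folder to the target.
# Both hops use sshpass for password authentication, and host key
# checking is disabled, as the workflow does for automation.
sshpass -p "$source_password" ssh -o StrictHostKeyChecking=no \
  -p "$source_port" "$source_user@$source_host" \
  "sshpass -p '$target_password' rsync -avz --delete \
     -e 'ssh -o StrictHostKeyChecking=no -p $target_port' \
     '$source_folder/' '$target_user@$target_host:$target_folder/'"
```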
## Best Practices

### Backup Strategy

1. **Test first**: always test with small datasets.
2. **Schedule wisely**: run during low-traffic periods.
3. **Monitor space**: ensure adequate storage on the target.
4. **Verify backups**: regularly test restore procedures.
5. **Rotate backups**: implement retention policies.

### Security

- **Use strong passwords**: complex passwords for all accounts
- **Limit permissions**: use dedicated backup users
- **Network security**: consider a VPN for transfers over the internet
- **Audit access**: log all backup operations
- **Encrypt sensitive data**: consider rsync over an encrypted channel

### Performance

- **Compression**: use `-z` for slow networks
- **Bandwidth limits**: prevent network saturation
- **Incremental backups**: rsync only transfers changes
- **Parallel transfers**: consider multiple workflows for different folders
- **Off-peak hours**: schedule during quiet periods

## Advanced Configuration

### Multiple Backup Jobs

Create separate workflows for:
- Different server pairs
- Various schedules
- Distinct retention policies

### Backup Rotation

Implement versioning by adding a timestamp to the target folder (a retention sketch follows at the end of this document):

    target_folder="/backup/data_$(date +%Y%m%d)"

### Pre/Post Scripts

Add nodes for:
- Database dumps before the backup
- Service stops/starts
- Cleanup operations
- Verification scripts

### Error Handling

Enhance the workflow with:
- Retry mechanisms
- Fallback servers
- Detailed error logging
- Escalation procedures

## Maintenance

### Regular Tasks

- **Daily**: check backup completion
- **Weekly**: verify backup integrity
- **Monthly**: test the restore procedure
- **Quarterly**: review and optimize rsync options
- **Annually**: audit security settings

### Monitoring Metrics

Track:
- Backup duration
- Transfer size
- Success/failure rate
- Storage utilization
- Network bandwidth usage

## Recovery Procedures

### Restore from Backup

To restore files, reverse the rsync direction:

    rsync -avz target_server:/backup/folder/ source_server:/restore/location/

### Disaster Recovery

- Document server configurations
- Maintain backup access credentials
- Test restore procedures regularly
- Keep workflow exports as a backup

## Support Resources

- Rsync documentation: https://rsync.samba.org/
- n8n community: https://community.n8n.io/
- SSH troubleshooting guides
- Network diagnostics tools
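To complement the Backup Rotation section above, a minimal retention sketch. The layout is an assumption, not part of the workflow: timestamped folders named `data_YYYYMMDD` under `/backup`, kept for 30 days.

```bash
# Remove timestamped backup folders older than 30 days.
# Adjust the path, name pattern, and retention window to your setup.
find /backup -maxdepth 1 -type d -name 'data_*' -mtime +30 -exec rm -rf {} +
```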