by Vitorio Magalhães
Auto-publish NASA APOD to LinkedIn with AI translation and hashtags

Transform NASA's daily astronomical wonders into engaging LinkedIn content automatically. This workflow fetches NASA's Astronomy Picture of the Day, translates it into Brazilian Portuguese using AI, generates strategic hashtags, and publishes everything to your LinkedIn profile with the stunning space image attached.

Who's it for
Content creators, astronomy enthusiasts, science communicators, and anyone who wants to share high-quality educational content consistently on LinkedIn. Perfect for Portuguese-speaking professionals who want to engage their network with fascinating space discoveries while building a personal brand as a science advocate.

How it works
The workflow runs on a daily schedule and handles the complete content pipeline automatically. It fetches the latest NASA APOD through the official API, including both the image and the detailed explanation. The English description is professionally translated into the selected language by Google Gemini 2.5 Flash while preserving scientific accuracy and terminology. Smart hashtag generation combines fixed branding tags with content-specific ones, mixing Portuguese and English for maximum reach. The final post combines the NASA image, the translated description, and the strategic hashtags, and is then published to your LinkedIn profile automatically.

How to set up
You'll need accounts for Google AI Studio (free), LinkedIn Developer (free), and a Telegram bot for notifications. Setup takes about 15 minutes and uses only free services and APIs.

First, create a Google AI Studio account and get an API key for the AI translation service. Then set up a LinkedIn OAuth2 application to enable posting permissions. Create a Telegram bot through BotFather and get your chat ID for notifications. Configure the Settings node with your Telegram chat ID and preferred language (a sketch of that node appears below). The workflow comes with all prompts and configurations ready to use. Test each component individually before activating the daily automation.

Requirements
- LinkedIn account with posting permissions
- Google AI Studio API key (free tier available)
- Telegram bot token and your chat ID
- Basic understanding of OAuth2 setup for LinkedIn
- NASA API key (optional - demo key included)

All services used have generous free tiers, making this workflow free to operate indefinitely.

How to customize the workflow
The centralized Settings node makes customization simple. Change the target language from Brazilian Portuguese to any other language by updating the translate_to_language variable. Modify the posting schedule in the CRON trigger to match your preferred timing. Customize the post template in the "Create Final Post Text" node to match your personal brand voice. Adjust the hashtag strategy by editing the AI prompt in the "Generate Hashtags" node. Add additional social platforms by duplicating the LinkedIn publisher with different credentials. The AI prompts can be fine-tuned for different writing styles or specific astronomical topics. You can also extend the workflow with additional content processing, image enhancements, or cross-posting to multiple platforms while keeping the core NASA APOD automation intact.
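As an illustration, the Settings node can be a simple Code node that returns the workflow-wide values. Only translate_to_language is named by the template; telegram_chat_id and post_schedule_cron are hypothetical keys shown here just to sketch the idea:

```javascript
// Hypothetical Settings node (n8n Code node): one central place for configuration.
// Only translate_to_language is named by the template; the other keys are illustrative.
return [
  {
    json: {
      translate_to_language: "Brazilian Portuguese", // target language for the Gemini translation
      telegram_chat_id: "123456789",                 // chat that receives notifications
      post_schedule_cron: "0 9 * * *",               // example: post daily at 09:00
    },
  },
];
```

Downstream nodes can then reference these values with expressions such as {{ $('Settings').item.json.translate_to_language }}.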
by Jay Emp0
MCP Tool — Replicate (Flux) Image Generator → WordPress/Twitter

Generates images via Replicate Flux models and uploads them to WordPress (and optionally Twitter/X). Built to act as an MCP module that other agents/workflows call for on-demand image creation.

Models configured in this workflow: black-forest-labs/flux-schnell, black-forest-labs/flux-dev, black-forest-labs/flux-1.1-pro

Switch rationale (vs the prior Leonardo workflow):
- lower cost 💰
- broader model choice 🎯
- full control of parameters ⚙️
- Leonardo API credits cannot be used in the web UI 🙅♂️; API and UI spend are separate

Links:
📜 Prior Leonardo-based workflow: https://n8n.io/workflows/6363-generate-and-upload-images-with-leonardo-ai-wordpress-and-twitter/
📰 Blog automation consuming these images: https://n8n.io/workflows/6734-ai-blog-automation-publish-hourly-seo-articles-to-wordpress-and-twitter-v3/

📥 Inputs

| Field  | Type   | Description                       |
| ------ | ------ | --------------------------------- |
| prompt | string | Text description for the image    |
| slug   | string | Filename slug for WP media        |
| model  | string | One of the configured Flux models |

Example:

```json
{
  "prompt": "Joker watching a Batman movie on his laptop",
  "slug": "joker-watching-batman",
  "model": "black-forest-labs/flux-dev"
}
```

📤 Output

```json
{
  "public_image_url": "https://your-wp.com/wp-content/uploads/2025/08/img-joker-watching-batman.webp",
  "wordpress": {...},
  "twitter": {...}
}
```

🔄 Flow
1. Trigger with prompt, slug, model
2. Build model payload (quality/steps/ratio/output format)
3. Call Replicate: POST /v1/models/{model}/predictions (Prefer: wait)
4. Download the generated image URL
5. Upload to WordPress (returns public URL)
6. Optional: upload to Twitter/X
7. Return URL + metadata

🤖 MCP Use at Scale (emp0.com)
Operational pattern: I currently use this setup for my blog, where I generate 300 posts/month, each with 3–4 images (banner + 2–3 inline images), for roughly 1,000 images/month produced by this MCP.

💡 Hybrid Cost-Optimized Setup:
- **High-priority images** (banners, main visuals): generated with **Flux Dev** on Leonardo for slightly better prompt adherence.
- **Low-priority images** (inline blog visuals): generated with **Flux Schnell** on Replicate for maximum cost efficiency.

💰 Pricing Comparison (per image)
Leonardo per-image cost uses API Basic plan math: $9 / 3,500 credits = $0.0025714 per credit.
- **Flux Schnell (Leonardo)** = 7 credits
- **Flux Dev (Leonardo)** = 7 credits
- **Flux 1.1 Pro equivalent in Leonardo** = **Leonardo Phoenix** (based on my experience) = 10 credits

| Flux Model             | Replicate           | Leonardo API              |
| ---------------------- | ------------------- | ------------------------- |
| flux-schnell           | $0.0030 (=$3/1,000) | $0.0180 (7 × $0.0025714)  |
| flux-dev               | $0.0250             | $0.0180 (7 × $0.0025714)  |
| flux-1.1-pro / Phoenix | $0.0400             | $0.0257 (10 × $0.0025714) |

Replicate pricing: https://replicate.com/pricing
Leonardo pricing: https://leonardo.ai/pricing/
Leonardo API usage: https://docs.leonardo.ai/docs/commonly-used-api-values

📊 Monthly Cost Example (1,000 images/month)
Mix: 300 × flux-dev on Leonardo, 700 × flux-schnell on Replicate.

| Platform/Model          | Images | Price per Image | Total  |
| ----------------------- | ------ | --------------- | ------ |
| Leonardo flux-dev       | 300    | $0.0180         | $5.40  |
| Replicate flux-schnell  | 700    | $0.0030         | $2.10  |
| **Total Monthly Spend** | 1000   | —               | $7.50  |

💵 If using Leonardo for both:
300 × $0.0180 = $5.40
700 × $0.0180 = $12.60
Total = **$18.00**
Savings: $10.50/month (≈58% lower) with the hybrid setup.

📌 Notes
- More Replicate models can be added in the Code1 node; a sketch of the payload builder appears below.
- Parameters are tuned for aspect ratio, inference steps, quality, and guidance.
- Leonardo's credit model is API-only; credits are not spendable in Leonardo's web UI.
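For orientation, here is a minimal sketch of a payload builder in the style of the Code1 node. Parameter names follow Replicate's published Flux model inputs at the time of writing; verify them against each model's schema on replicate.com before relying on them:

```javascript
// Sketch of a per-model payload builder (n8n Code node).
// aspect_ratio, output_format, output_quality, num_inference_steps, and guidance
// are Replicate Flux input names; confirm against each model's schema.
const { prompt, model } = $json;

const base = { prompt, aspect_ratio: "16:9", output_format: "webp", output_quality: 90 };

const perModel = {
  "black-forest-labs/flux-schnell": { ...base, num_inference_steps: 4 },
  "black-forest-labs/flux-dev":     { ...base, num_inference_steps: 28, guidance: 3 },
  "black-forest-labs/flux-1.1-pro": base, // pro manages steps/guidance itself
};

return [{
  json: {
    url: `https://api.replicate.com/v1/models/${model}/predictions`,
    body: { input: perModel[model] ?? base },
  },
}];
```

The HTTP Request node then POSTs body to url with the headers Authorization: Bearer <REPLICATE_TOKEN> and Prefer: wait, so Replicate holds the request open until the prediction completes.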
by Calistus Christian
What this workflow does
Automatically triages risky AWS misconfigurations and alerts your team.

Pipeline: Security Hub or AWS Config -> EventBridge rules -> SNS (HTTP) -> n8n Webhook -> Normalize -> AI Prioritizer -> Airtable (log) -> Gmail (email)

- Normalizes incoming findings (S3 / Security Groups / IAM / RDS) into a consistent JSON.
- Uses an LLM to assign a priority (P0–P3) with rationale and remediation steps.
- Upserts the finding into Airtable (avoids duplicates).
- Emails a compact incident summary to your inbox. This can be swapped for Microsoft Teams, Slack, etc.

Category: Security / Cloud / Alerting
Time to set up: ~10–15 minutes
Difficulty: Beginner–Intermediate
Cost: Mostly free (n8n CE + AWS SNS/EventBridge; OpenAI + Airtable/Gmail as used)

What you'll need
- An n8n instance reachable over HTTP.
- AWS account (one region) with permissions to create SNS topics and EventBridge rules.
- **Security Hub** enabled (or AWS Config rules that emit compliance events).
- n8n credentials: OpenAI, Airtable, Gmail.

Nodes used
- **Webhook** (POST /aws-misconfig)
- **Code:** SNS Handler (token check, confirm/unwrap)
- **IF:** route mode === "confirm" vs notification
- **HTTP Request:** SNS SubscriptionConfirmation (GET)
- **Code:** Normalize Finding
- **Message a model:** AI Prioritizer (JSON out)
- **Airtable:** Create/Upsert
- **Gmail:** Send message
- **Edit Fields:** final JSON response

Setup steps
1. Import and activate the workflow in n8n.
2. Webhook Respond: When Last Node Finishes -> First Entry JSON. Append a shared secret to the URL, e.g. ?token=MY_SUPER_TOKEN, and keep the check in the SNS Handler code node (a sketch of that handler follows below).
3. Create an SNS topic (e.g., misconfig-events) in the same region as your EventBridge rules.
4. Create EventBridge rules targeting the SNS topic:
   - Rule A (Security Hub): source = aws.securityhub, detail-type = Security Hub Findings - Imported
   - Rule B (AWS Config): source = aws.config, detail-type = Config Rules Compliance Change
5. Create an SNS subscription with Protocol = HTTP and Endpoint = your production webhook URL: http://YOUR_HOST:5678/webhook/aws-misconfig?token=MY_SUPER_TOKEN (the workflow auto-confirms the subscription on first POST).
6. Configure Airtable (Upsert on Finding ID) and Gmail recipients.
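A minimal sketch of the SNS Handler code node described above, assuming the default n8n webhook item shape ({ query, body }); adjust if your webhook delivers the raw body as a string:

```javascript
// SNS Handler (n8n Code node): token check, then confirm/unwrap.
const SHARED_TOKEN = "MY_SUPER_TOKEN";

const { query, body } = $json;
if (query.token !== SHARED_TOKEN) {
  throw new Error("Invalid token"); // reject requests lacking the shared secret
}

// SNS may deliver the payload as a JSON string rather than a parsed object.
const sns = typeof body === "string" ? JSON.parse(body) : body;

if (sns.Type === "SubscriptionConfirmation") {
  // Routed to the HTTP Request node, which GETs SubscribeURL to confirm.
  return [{ json: { mode: "confirm", subscribeUrl: sns.SubscribeURL } }];
}

// Notifications wrap the EventBridge event as a JSON string in sns.Message.
return [{ json: { mode: "notification", event: JSON.parse(sns.Message) } }];
```

The IF node then branches on mode === "confirm", exactly as listed in the node rundown.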
by clancy jack
This n8n workflow recommends Taiwan indie music based on a user's city, mood, birthday, today's weather, and star sign. Here's a concise overview:

1. Trigger: Starts manually with the "When clicking 'Test workflow'" node.
2. Input Setup: The "infomation" node sets user inputs (e.g., city: Taipei, mood: Happy, birthday: 1996/11/21).
3. Song Recommendation: The "get song recommendation" node uses OpenAI's GPT-4o-mini to:
   - Fetch today's weather for the specified city.
   - Determine the user's zodiac sign from their birthday.
   - Check the zodiac sign's daily fortune.
   - Recommend a Taiwan indie song considering weather and fortune.
   - Explain the song choice and highlight its features.
   - Return results in JSON format.
4. Data Extraction: The "Information Extractor" node parses the JSON output, extracting fields like date, city, weather, zodiac sign, fortune, song, artist, and additional info (see the sample shape below).
5. Spotify Search: The "Spotify" node searches for the recommended song using the artist and song name, retrieving a Spotify URL.
6. Final Output: The "Final Output" node compiles all data, including the Spotify link, into a structured format.

Additional Note: A "Sticky Note" provides context about the workflow's purpose and credits the creator, n8nguide.

This workflow integrates AI, weather data, astrology, and Spotify to deliver personalized Taiwan indie music recommendations.
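For reference, the JSON the Information Extractor parses would look roughly like the following. The exact key names are assumptions based on the fields listed above; placeholders stand in for the model's actual recommendation:

```json
{
  "date": "2025-07-01",
  "city": "Taipei",
  "weather": "Light rain, 24°C",
  "zodiac_sign": "Scorpio",
  "fortune": "A good day for quiet reflection",
  "song": "<recommended song title>",
  "artist": "<recommended artist>",
  "additional_info": "Why the song fits today's weather and fortune"
}
```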
by Custom Workflows AI
Introduction
This workflow offers a streamlined solution for uploading multiple files to a GitHub repository simultaneously using GitHub's REST API. It addresses a significant limitation of n8n's native GitHub node, which only supports uploading a single file at a time. By leveraging GitHub's Git Data API, this workflow creates a new Git tree containing multiple files, commits this tree, and updates the target branch, all in a single automated process.

The workflow is particularly valuable for automation scenarios that require batch file operations, such as deploying website updates, publishing documentation, or maintaining configuration files across repositories. It eliminates the need for multiple separate API calls when working with multiple files, making your automation more efficient and less prone to partial-update issues. By abstracting the complexities of GitHub's Git Data API into a reusable workflow, it provides a practical solution for developers, content managers, and DevOps professionals who need to programmatically manage repository content at scale.

Who is this for?
This workflow is designed for:
- Developers and DevOps engineers who need to automate file updates in GitHub repositories
- Content managers who regularly publish multiple files to GitHub-hosted websites or documentation
- Automation specialists looking to integrate GitHub operations into larger workflows
- Teams using n8n for CI/CD processes who need to push code or configuration changes

Users should have basic familiarity with GitHub concepts (repositories, branches, commits) and should be comfortable obtaining and using GitHub Personal Access Tokens. While the workflow handles the API complexity, users should understand the fundamentals of version control to effectively utilize and customize it.

What problem is this workflow solving?
This workflow addresses several key challenges:
- Limited batch operations: n8n's native GitHub node only supports uploading one file at a time, making multi-file operations cumbersome and inefficient.
- API complexity: GitHub's Git Data API requires multiple sequential calls with interdependent data to create commits with multiple files, which is complex to implement manually.
- Automation bottlenecks: Without this workflow, automating multi-file updates would require either multiple separate API calls (risking partial updates) or custom scripting outside of n8n.
- Consistency issues: When files need to be updated together (e.g., code and corresponding documentation), this workflow ensures they're committed in a single atomic operation.

By solving these issues, the workflow enables reliable, atomic updates of multiple files, maintaining repository consistency and simplifying automation processes.

What this workflow does

Overview
This workflow uses GitHub's REST API to push multiple files to a repository in a single operation. It follows Git's internal model by:
1. Retrieving the current state of the repository
2. Creating a new tree with the files to be added or updated
3. Creating a new commit with this tree
4. Updating the branch reference to point to the new commit

Process
1. Initialization: The workflow starts with a manual trigger and sets up GitHub credentials and repository information.
2. File Content Definition: Two "Set" nodes define the content for the files to be uploaded.
3. Repository State Retrieval: The workflow fetches the latest commit SHA for the specified branch, then retrieves the base tree SHA from that commit.
4. Tree Creation: A new Git tree is created that includes both files (file1.txt and file2.txt), specifying their paths and content (see the example request below).
5. Commit Creation: A new commit is created with the specified commit message, referencing the new tree and the parent commit.
6. Branch Update: Finally, the branch reference is updated to point to the new commit, making the changes visible in the repository.

Setup
To use this workflow:
1. Import the workflow: Download the workflow JSON and import it into your n8n instance.
2. Create a GitHub Personal Access Token: Go to GitHub Settings → Developer Settings → Personal Access Tokens → Fine-grained tokens, and create a new token with "Contents" permission (Read and write) for your target repository.
3. Configure the workflow: Update the "Set Github Info" node with your GitHub Personal Access Token, your GitHub username, your repository name, the target branch (default is "main"), and a commit message.
4. Define file content: Modify the "File 1" and "File 2" nodes with the content you want to upload.
5. Adjust file paths if needed: In the "Create new tree" node, update the file paths if you want to change where the files are stored in the repository.
6. Save and run the workflow: Click "Test workflow" to execute the process.

How to customize this workflow to your needs
This workflow can be adapted in several ways:
- Add more files: Create additional "Set" nodes for more file content, and add more tree entries in the "Create new tree" node following the same pattern (path, mode, type, content).
- Change file locations: Modify the "path" parameters in the "Create new tree" node to place files in different directories.
- Dynamic file content: Replace the static content in the "File" nodes with data from other sources; use previous nodes or HTTP requests to generate file content dynamically.
- Conditional file updates: Add IF nodes to determine which files should be updated based on certain conditions, and create separate branches in your workflow for different update scenarios.
- Scheduled updates: Replace the manual trigger with a Schedule node to run the workflow at specific intervals, or combine it with other triggers like Webhook or database events to push files when certain events occur.
- Error handling: Add Error Trigger nodes to handle potential API failures, and implement notification nodes to alert you of successful pushes or failures.
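For reference, the "Create new tree" node sends a request of the following shape to GitHub's Git Data API (POST /repos/{owner}/{repo}/git/trees). The field names (path, mode, type, content) are exactly the ones the workflow references; base_tree is the SHA retrieved in the previous step:

```json
{
  "base_tree": "<base tree SHA from the latest commit>",
  "tree": [
    { "path": "file1.txt", "mode": "100644", "type": "blob", "content": "Contents of file 1" },
    { "path": "file2.txt", "mode": "100644", "type": "blob", "content": "Contents of file 2" }
  ]
}
```

The response contains the new tree's SHA, which the commit step passes to POST /repos/{owner}/{repo}/git/commits along with the parent commit SHA, before PATCH /repos/{owner}/{repo}/git/refs/heads/{branch} moves the branch pointer to the new commit.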
by Paul
Gmail AI Email Manager - Setup Guide

🎯 Workflow Overview
This workflow creates an intelligent Gmail email manager that can:
- Monitor incoming emails via webhook
- Analyze email content using AI
- Categorize emails automatically
- Generate smart responses
- Take actions based on email content
- Send notifications for important emails

📋 Pre-Setup Checklist
Before building the workflow, gather the necessary information and validate the approach.

Phase 1: Discovery & Planning
- [ ] Search for Gmail nodes
- [ ] Find AI analysis nodes
- [ ] Identify webhook trigger options
- [ ] Check notification nodes

Phase 2: Configuration Requirements
- [ ] Gmail API credentials
- [ ] AI service (OpenAI/Claude) API key
- [ ] Webhook URL setup
- [ ] Email classification rules

🔧 Setup Instructions

Step 1: Gmail API Setup
1. Go to Google Cloud Console
2. Create a new project or select an existing one
3. Enable the Gmail API
4. Create OAuth 2.0 credentials
5. Add the authorized redirect URI: https://your-n8n-instance.com/rest/oauth2-credential/callback

Step 2: AI Service Setup
Choose one of the following:
- **OpenAI**: Get an API key from platform.openai.com
- **Claude**: Get an API key from console.anthropic.com
- **Local AI**: Set up Ollama or similar

Step 3: n8n Credentials
1. Gmail OAuth2: Add client ID, secret, and scopes
2. AI Service: Add API key
3. Webhook: Configure webhook URL

🔧 Quick Setup Checklist

1. Google Cloud Console
- [ ] Enable Gmail API
- [ ] Create OAuth2 credentials
- [ ] Add redirect URI: https://your-n8n.com/rest/oauth2-credential/callback
- [ ] Set up Gmail push notifications with Pub/Sub (see the sample watch request below)

2. API Keys
- [ ] Get OpenAI API key from platform.openai.com
- [ ] Create Google Sheets for logging (optional)

3. n8n Credentials
- [ ] Gmail OAuth2: Client ID, Secret, Scopes: gmail.readonly, gmail.modify, gmail.compose
- [ ] OpenAI API: Your API key

4. Gmail Labels (create these)
- [ ] URGENT (red)
- [ ] IMPORTANT (orange)
- [ ] PROMOTIONAL (purple)
- [ ] PERSONAL (green)
- [ ] WORK (blue)
- [ ] SPAM (gray)

5. Update Workflow Values
- [ ] High Priority Alert: Change notification email
- [ ] Spreadsheet Log: Update sheet ID (if using)
- [ ] Webhook: Copy URL after saving workflow

6. Test
- [ ] Save & activate workflow
- [ ] Send a test email to Gmail
- [ ] Check the execution log
- [ ] Verify auto-categorization works

That's it! Your AI email manager is ready! 🚀
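For the Pub/Sub step, Gmail push notifications are registered by calling POST https://gmail.googleapis.com/gmail/v1/users/me/watch with a body like the following; the project and topic names are placeholders for your own Google Cloud resources:

```json
{
  "topicName": "projects/your-gcp-project/topics/gmail-events",
  "labelIds": ["INBOX"]
}
```

Point the topic's push subscription at the workflow's webhook URL. Note that Gmail watch registrations expire after about a week, so renew the watch call periodically (a small scheduled workflow works well for this).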
by Rajeet Nair
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Description
This workflow automatically collects daily trending topics from Twitter and YouTube, filters them for relevance, and uses an AI model (such as Mistral Cloud or another OpenAI-compatible API) to generate engaging social media hashtags. The final results, including source platform and date, are saved into a connected Google Sheet for easy access, tracking, or team collaboration.

Ideal for content creators, marketers, and social media managers, this automation eliminates the manual effort of trend research and hashtag writing by combining real-time scraping with LLM-powered generation. The result is a scalable, daily strategy tool to stay aligned with what's trending across major platforms.

How It Works
1. Daily Trigger: Starts the workflow automatically on a daily schedule.
2. Trend Scraping: Scrapes current trending content from Twitter and YouTube using the Crawl and Scrape community node.
3. Filtering & Slicing: Removes irrelevant or duplicate entries and limits each platform's list to top-performing trends.
4. Merge Trends: Combines Twitter and YouTube trends into a single dataset.
5. AI Hashtag Generation: Sends each trend topic to an AI model to generate relevant hashtags.
6. Output to Google Sheets: Loops through AI results and writes them to a Google Sheet, including trend, platform, hashtags, and timestamp.

Setup Instructions
Estimated time: 10–15 minutes

Prerequisites
- A self-hosted instance of n8n (required for community nodes)
- API key for Mistral Cloud or any OpenAI-compatible LLM
- Google Sheets account connected via OAuth2 credentials
- Twitter and YouTube trend URLs (or scraping logic for target regions)

Example: Crawl and Scrape Node for Twitter Trends
You can use the following configuration in the Crawl and Scrape node to extract Twitter trends from Trends24:

```json
{
  "parameters": {
    "url": "https://trends24.in/",
    "selectors": [
      {
        "label": "Twitter Trends",
        "selector": ".trend-card__list li a",
        "type": "text"
      }
    ]
  },
  "name": "Scrape Twitter Trends",
  "type": "n8n-nodes-crawl-and-scrape.crawlAndScrape",
  "typeVersion": 1,
  "position": [300, 200]
}
```

Google Sheet Column Format
Column A: Generated Hashtags
by Agus Narestha
🔒 SSL Certificate Monitoring & Expiry Alert with Spreadsheet [FREE APIs]

✅ What This Workflow Does
This n8n template automatically monitors the SSL certificates of websites listed in a Google Sheet and sends email alerts if any are expiring within 14 days. It helps you avoid downtime, security issues, and trust warnings caused by expired certificates.

🧩 Key Features
- 📅 Weekly Automation: Runs every Monday at 7:00 AM (configurable).
- 📄 Google Sheets Integration: Fetches and updates data in a spreadsheet.
- 🔍 SSL Check via API: Uses ssl-checker.io to get certificate details.
- ⚠️ SSL Expiry Filter: Identifies certificates expiring within 14 days.
- 📧 Email Alerts: Sends notifications for certificates close to expiration.

📂 Input Spreadsheet Format
Your Google Sheet should have the following columns:

| No | Name         | Link                | SSL Issued On | SSL Expired On | SSL Status |
|----|--------------|---------------------|---------------|----------------|------------|
| 1  | Example Site | https://example.com | 2024-07-01    | 2025-07-01     | Valid      |
| 2  | My Blog      | https://myblog.org  | 2024-07-05    | 2024-07-20     | Expiring   |

Each row should include a valid website URL in the Link column.

🛠️ How It Works
1. Scheduled Trigger: Executes weekly (Monday 7:00 AM).
2. Fetch Website List: Reads all website entries from the Google Sheet.
3. Check SSL Certificates: Uses the ssl-checker.io API to retrieve certificate details for each website.
4. Update Spreadsheet: Writes the "Issued On" and "Expired On" fields back to the spreadsheet.
5. Evaluate SSL Expiry: Filters for certificates expiring within 14 days (a sketch of this check appears below).
6. Check Condition: Determines whether to send alerts based on the filtered results.
7. Send Email Alert: Notifies via email if any certificates are expiring soon.

📬 Example Email Output
Subject: ⚠️ ALERT!! SSL EXPIRED

SSL certificates expiring soon:
- example.com (expires in 5 days)
- anotherdomain.net (expires in 3 days)

🧰 Setup Requirements
- A Google Sheet with the correct columns and website links.
- SMTP credentials to send alert emails.
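A minimal sketch of the expiry filter (n8n Code node), assuming each item carries the "SSL Expired On" value written back from ssl-checker.io in a format Date can parse:

```javascript
// Keep only certificates expiring within the alert window.
const ALERT_WINDOW_DAYS = 14;
const MS_PER_DAY = 24 * 60 * 60 * 1000;

const expiring = [];
for (const item of $input.all()) {
  const expiresOn = new Date(item.json["SSL Expired On"]);
  const daysLeft = Math.ceil((expiresOn - Date.now()) / MS_PER_DAY);
  if (daysLeft <= ALERT_WINDOW_DAYS) {
    expiring.push({ json: { ...item.json, daysLeft } }); // daysLeft feeds the email body
  }
}
return expiring;
```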
by Airtop
Automating Company Data Enrichment and HubSpot Integration

Use Case
This automation enriches company data based on email domain and LinkedIn profile, calculates an ICP (Ideal Customer Profile) score, and updates the corresponding company record in HubSpot. It's ideal for onboarding, qualification, and CRM enrichment.

What This Automation Does

Input Parameters
- **Contact email**: Used to derive the company domain.
- **Company domain**: Primary web domain of the company.
- **Company LinkedIn** (optional): LinkedIn URL for enrichment accuracy.
- **Airtop Profile (connected to LinkedIn)**: An authenticated Airtop Profile.

What It Outputs
- Full company profile (name, tagline, website, headquarters)
- Employee count
- ICP score based on AI/tech profile, scale, agency type, and location
- Updated/created record in HubSpot with all enriched attributes

How It Works
1. Input Validation: Filters out non-corporate domains like Gmail, Yahoo, or .edu (a sketch of this check appears below).
2. Enrichment Trigger: Launches Airtop workflows to extract and analyze data from LinkedIn and calculate the ICP score.
3. Data Mapping: Compiles structured fields including overview, location (city, state, country), company website and domain, LinkedIn URL, employee count, and ICP score.
4. HubSpot Sync: Sends all the enriched fields to the designated HubSpot object for upsertion.

Setup Requirements
- Airtop API Key
- Airtop Profile with active LinkedIn authentication
- HubSpot integration enabled for object updates

Next Steps
- **Use in Webforms**: Trigger this on signup to auto-populate CRM records.
- **Enrich Manually Entered Contacts**: Use with list-based workflows for batch enrichment.
- **Sync to Other CRMs**: Replace the HubSpot step with Salesforce, Pipedrive, etc. for flexible integration.

Read more about company data enrichment.
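A hypothetical sketch of the input-validation step as an n8n Code node. The field name contact_email and the exact blocklist are assumptions; extend them to match your data:

```javascript
// Skip free-mail and educational domains so only corporate domains
// reach the enrichment stage.
const FREE_MAIL = new Set(["gmail.com", "yahoo.com", "hotmail.com", "outlook.com"]);

const email = $json.contact_email ?? ""; // field name is an assumption
const domain = email.split("@").pop().toLowerCase();

const isCorporate = domain !== "" && !FREE_MAIL.has(domain) && !domain.endsWith(".edu");

return [{ json: { ...$json, domain, isCorporate } }];
```

An IF node on isCorporate then routes valid domains to the Airtop enrichment and drops the rest.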
by Basil Irfan
🚀 LinkedIn Lead-Gen Flywheel – Apify → GPT-4o → Google Sheets → Phantombuster

What this workflow does
1. Collect audience specs – a simple web form asks for your ideal company profile.
2. Generate a laser-targeted Apollo search URL with GPT-4o (no manual filtering).
3. Scrape the matching leads via an Apify actor (returns clean JSON); see the request sketch below.
4. Craft hyper-personalized icebreakers for each lead using GPT-4o (ultra-short, human-sounding).
5. Log everything to Google Sheets – name, LinkedIn URL, company site, summary, and the icebreaker.
6. (Optional) Auto-launch Phantombuster to fire off those connection requests at scale.

Why it matters
- **Zero grunt work:** audience research, scraping, copywriting, and outreach all run hands-free.
- **Punchy personalization:** micro-icebreakers outperform canned intros, boosting accept rates.
- **Scales with you:** flip a switch to go from 10 to 1,000+ connections/day.

Node rundown

| Step | Node | Key Inputs | Key Outputs |
|------|------|------------|-------------|
| 1 | Form Trigger | Audience description | description_of_company |
| 2 | OpenAI (GPT-4o) | Audience text | SearchUrl |
| 3 | HTTP Request – Apify | SearchUrl, APIFY_TOKEN | Lead JSON |
| 4 | OpenAI (GPT-4o) | Lead JSON | Icebreaker |
| 5 | Google Sheets | Lead + Icebreaker | Row append/update |
| 6 | Aggregate | Sheet rows | Batched output |
| 7 | HTTP Request – Phantombuster | PHANTOM_KEY, AGENT_ID | Launch status |

Prerequisites
- **OpenAI API key** (GPT-4o access recommended)
- **Apify API token** with access to the actor
- **Google Service Account creds** shared with your target sheet
- **Phantombuster API key** and Agent ID for your LinkedIn connector
- Active Apollo account to open the generated search URL (only required for debugging)

Setup (5-minute sprint)
1. Import the workflow into n8n.
2. Add the required credentials in Credentials → OpenAI, Apify, Google Sheets, Phantombuster.
3. Paste your Phantombuster Agent ID into the HTTP Request node URL.
4. Publish the Form Trigger URL; this is where you (or your SDRs) describe the target audience.
5. Hit Execute Workflow once to verify data flows end-to-end.

Customization tips
- **Titles & keywords:** tweak the prompt in the first GPT-4o node to lock in different roles or industries.
- **Icebreaker style:** adjust the second GPT-4o prompt to match your brand voice.
- **Data columns:** map extra fields from Apify into Google Sheets as needed.
- **Skip outreach:** disable the Phantombuster node if you only want the leads + icebreakers.
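A plain Node.js sketch of the request the "HTTP Request – Apify" node makes. The actor ID and the input field name (searchUrl) are placeholders; use the values documented by the Apollo-scraper actor you run on Apify:

```javascript
// run-sync-get-dataset-items starts the actor and returns its dataset items
// in one call, keeping the n8n side to a single HTTP request.
const ACTOR_ID = "username~apollo-scraper"; // hypothetical actor ID
const APIFY_TOKEN = process.env.APIFY_TOKEN;

async function scrapeLeads(searchUrl) {
  const res = await fetch(
    `https://api.apify.com/v2/acts/${ACTOR_ID}/run-sync-get-dataset-items?token=${APIFY_TOKEN}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ searchUrl }),
    },
  );
  return res.json(); // clean lead JSON, one object per lead
}

scrapeLeads("https://app.apollo.io/#/people?...").then((leads) =>
  console.log(`${leads.length} leads fetched`),
);
```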
by Yaron Been
Create your own intelligent Telegram bot that summarizes articles and processes commands automatically. This powerful workflow turns Telegram into your personal AI assistant, handling /help, /summary <URL>, and /img <prompt> commands with intelligent responses - perfect for teams, content creators, and anyone wanting smart automation in their messaging.

🚀 What It Does
- Smart Command Processing: Automatically recognizes and routes /help, /summary, and /img commands to appropriate AI-powered responses (see the routing sketch at the end of this listing).
- Article Summarization: Fetches any URL, extracts content, and generates professional 10-12 bullet point summaries using OpenAI.
- Image Generation: Processes image prompts and integrates with AI image generation services.
- Help System: Provides users with clear command instructions and usage examples.

🎯 Key Benefits
✅ Personal AI Assistant: Get instant article summaries in Telegram
✅ Team Productivity: Share quick content summaries with colleagues
✅ Content Research: Rapidly digest articles and web content
✅ 24/7 Availability: Bot works around the clock without maintenance
✅ Easy Commands: Simple /summary <link> format anyone can use
✅ Scalable: Handles multiple users and requests simultaneously

🏢 Perfect For

Content Teams & Researchers
- Journalists quickly summarizing news articles
- Marketing teams researching competitor content
- Students processing academic papers and articles
- Analysts digesting industry reports

Business Applications
- **Team Communication**: Share article insights in group chats
- **Research Assistance**: Quick content analysis for decision making
- **Content Curation**: Summarize articles for newsletters or reports
- **Knowledge Sharing**: Help teams stay informed efficiently

⚙️ What's Included
- Complete Bot Workflow: Ready-to-deploy Telegram bot with all commands
- AI Integration: OpenAI-powered content summarization and processing
- Smart Routing: Intelligent command recognition and response system
- Error Handling: Robust system handles invalid commands gracefully
- Extensible Design: Easy to add new commands and features

🔧 Quick Setup Requirements
- **n8n Platform**: Cloud or self-hosted instance
- **Telegram Bot Token**: Create via @BotFather (free, 5 minutes)
- **OpenAI API**: For content summarization (pay-per-use)
- **Basic Configuration**: Follow the 15-minute setup guide

📱 User Experience

Simple Commands:
/help - Show available commands
/summary https://example.com - Get article summary
/img sunset over mountains - Generate image (with supported APIs)

Sample Summary Output:
📄 Article Summary:
• Company reports 40% revenue growth in Q3 2024
• New AI features driving customer acquisition
• Expansion into European markets planned for 2025
• Investment in R&D increased by 25% this quarter
• Customer satisfaction scores improved to 94%
• Three new product launches scheduled for next year
• Remote work policy made permanent post-pandemic
• Sustainability goals on track to meet 2025 targets
• Partnership with major tech firm announced
• Stock price up 15% following earnings report

🎨 Customization Options
- Command Extensions: Add custom commands for specific workflows
- Response Formatting: Customize summary style and length
- Multi-Language: Support different languages for international teams
- Integration APIs: Connect additional AI services and tools
- User Permissions: Control who can use specific commands
- Analytics: Track usage patterns and popular content

🏷️ Tags & Categories
#telegram-bot #ai-automation #content-summarization #article-processing #team-productivity #openai-integration #smart-assistant #workflow-automation #messaging-bot #content-research #ai-agent #n8n-workflow #business-automation #telegram-integration #ai-powered-bot

💡 Use Case Examples
- News Team: Quickly summarize breaking news articles for editorial meetings
- Marketing Agency: Research competitor content and industry trends efficiently
- Sales Team: Digest industry reports and share insights with prospects
- Remote Team: Keep everyone informed with summarized company updates

📈 Expected Results
- **80% faster** content research and analysis
- **50% more articles** processed per day vs manual reading
- **100% team accessibility** through familiar Telegram interface
- **24/7 availability** for global teams across time zones

🛠️ Setup & Support
- Quick Start: Deploy your bot in 15 minutes with the included guide
- Video Tutorial: Complete walkthrough available
- Template Commands: Pre-built responses and formatting
- Expert Support: Direct help from the workflow creator

📞 Get Help & Resources
YouTube: https://www.youtube.com/@YaronBeen/videos
💼 Professional Support
LinkedIn: https://www.linkedin.com/in/yaronbeen/
📧 Direct Help
Email: Yaron@nofluff.online - Response within 24 hours

Ready to build your intelligent Telegram assistant? Get this AI Telegram Bot Agent and transform your messaging app into a powerful content processing tool. Perfect for teams, researchers, and anyone who wants AI-powered assistance directly in Telegram.

Stop manually reading long articles. Start getting instant, intelligent summaries with simple commands.
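A minimal sketch of the command router mentioned above, assuming a Telegram Trigger node upstream. Telegram updates expose the text at message.text; the output route field would feed a Switch node:

```javascript
// Command router (n8n Code node): classify the incoming message.
const text = ($json.message?.text ?? "").trim();

let route = "help";
let payload = "";

if (text.startsWith("/summary ")) {
  route = "summary";
  payload = text.slice("/summary ".length).trim(); // the URL to fetch and summarize
} else if (text.startsWith("/img ")) {
  route = "img";
  payload = text.slice("/img ".length).trim(); // the image prompt
} else if (text !== "/help") {
  route = "unknown"; // gracefully handle invalid commands
}

return [{ json: { route, payload, chatId: $json.message?.chat?.id } }];
```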
by Daniel Shashko
This workflow automates daily or manual keyword rank tracking on Google Search for your target domain. Results are logged in Google Sheets and sent via email, using Bright Data's SERP API.

Requirements:
- n8n (local or cloud) with Google Sheets and Gmail nodes enabled
- Bright Data API credentials

Main Use Cases
- Track Google search rankings for multiple keywords and domains automatically
- Maintain historical rank logs in Google Sheets for SEO analysis
- Receive scheduled or on-demand HTML email reports with ranking summaries
- Customize or extend for advanced SEO monitoring and reporting

How it works
The workflow is divided into several logical steps:

1. Workflow Triggers
- **Manual:** Start by clicking 'Test workflow' in n8n.
- **Scheduled:** Automatically triggers every 24 hours via Schedule Trigger.

2. Read Keywords and Target Domains
Fetches keywords and domains from a specified Google Sheets document. The sheet must have columns: Keyword and Domain.

3. Transform Keywords
Formats each keyword for URL querying (spaces become +, e.g., seo expert → seo+expert).

4. Batch Processing
Processes keywords in batches so each is checked individually.

5. Get Google Search Results via Bright Data
Sends a request to Bright Data's SERP API for each keyword with location (default: US) and receives the raw HTML of the search results.

6. Parse and Find Ranking
Extracts all non-Google links from the HTML, searches for the target domain among the results, captures the rank (position), URL, and total number of results checked, and saves a timestamp. A sketch of this parsing step appears below.

7. Save Results to Google Sheets
Appends the findings (keyword, domain, rank, found URL, check time) to a "Results" sheet for history.

8. Generate HTML Report and Send Email
Builds an HTML table with current rankings and emails the formatted table to the specified recipient(s) via Gmail.

Setup Steps
1. Google Sheets: Create a sheet named "Results", and another with Keyword and Domain columns. Update the document ID and sheet names in the workflow's config.
2. Bright Data API: Acquire your Bright Data API token and enter it in the Authorization header of the 'Getting Ranks' HTTP Request node.
3. Gmail: Connect your Gmail account via OAuth2 in n8n and set your destination email in the 'Sending Email Message' node.
4. Location Customization: Modify the gl= parameter in the SERP API URL to change country/location (e.g., gl=GB for the UK).

Notes
- This workflow is designed for n8n local or cloud environments with suitable connector credentials.
- Customize the batch size, recipient list, or ranking extraction logic per your needs.
- Use sticky notes in n8n for further setup guidance and workflow tips.

With this workflow, you have an automated, repeatable process to monitor, log, and report Google search rankings for your domains: ideal for SEO, digital marketing, and reporting to clients or stakeholders.
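A minimal sketch of the rank-extraction step (n8n Code node), assuming the previous node returns the raw SERP HTML in $json.html and the sheet columns Keyword and Domain are passed through. A real SERP page needs more robust parsing; the regex approach here only illustrates the idea:

```javascript
const html = $json.html ?? "";
const target = $json.Domain; // e.g. "example.com", from the keywords sheet

// Collect outbound result links, skipping Google's own URLs.
const links = [...html.matchAll(/href="(https?:\/\/[^"]+)"/g)]
  .map((m) => m[1])
  .filter((url) => !url.includes("google."));

const position = links.findIndex((url) => url.includes(target));

return [{
  json: {
    keyword: $json.Keyword,
    domain: target,
    rank: position >= 0 ? position + 1 : null, // null = not found among checked results
    foundUrl: position >= 0 ? links[position] : null,
    resultsChecked: links.length,
    checkedAt: new Date().toISOString(),
  },
}];
```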