by Oneclick AI Squad
This n8n workflow helps users easily discover nearby residential construction projects by automatically scraping and analyzing property listings from 99acres and other real estate platforms. Users send an email with their location preferences and receive a curated list of available properties with detailed information, including pricing, area, possession dates, and construction status.

Good to know

- The workflow focuses specifically on residential construction projects and active developments
- Property data is scraped in real time to ensure the most current information
- Results are automatically formatted and structured for easy reading
- The system handles multiple property formats and data variations from different sources
- Fallback mechanisms ensure reliable data extraction even when website structures change

How it works

- **Trigger: New Email** - Detects incoming emails with property search requests and extracts location preferences from the email content
- **Extract Area & City** - Parses the email body to identify target areas (e.g., Gota, Ahmedabad) and falls back to a city-level search if no specific area is mentioned (see the sketch after this section)
- **Scrape Construction Projects** - Scrapes 99acres and other property websites based on the extracted area and city information
- **Parse Project Listings** - Cleans and formats the scraped HTML into structured project entries with standardized fields
- **Format Project Details** - Transforms all parsed projects into a consistent, email-ready list with bullet points and organized information
- **Send Results to User** - Delivers a professionally formatted email with the complete list of matching construction projects to the original requester

Email Format Examples

Input Email Format

To: properties@yourcompany.com
Subject: Property Search Request

Hi, I am interested in buying a flat. Can you please send me the list of available properties in Gota, Ahmedabad?

Output Email Example

Subject: 🏘️ Property Search Results: 4 Projects Found in Gota, Ahmedabad

🏘️ Available Construction Projects in Gota, Ahmedabad
Search Area: Gota, Ahmedabad
Total Projects: 4
Search Date: August 4, 2025

📋 PROJECT LISTINGS:

🔷 Project 1
🏠 Name: Vivaan Oliver offers
🏢 BHK: 3 BHK
💰 Price: N/A
📐 Area: 851.0 Sq.Ft
🗓️ Possession: August 2025
📊 Status: under construction
📍 Location: Thaltej, Ahmedabad West
🕒 Scraped Date: 2025-08-04

🔷 Project 2
🏠 Name: Vivaan Oliver offers
🏢 BHK: 3 BHK
💰 Price: Price on Request
📐 Area: 891 Sq Ft
🗓️ Possession: N/A
📊 Status: Under Construction
📍 Location: Thaltej, Ahmedabad West
🕒 Scraped Date: 2025-08-04

🔷 Project 3
🏠 Name: It offers an exclusive range of
🏢 BHK: 3 BHK
💰 Price: N/A
📐 Area: 250 Sq.Ft
🗓️ Possession: 0 2250
📊 Status: Under Construction
📍 Location: Thaltej, Ahmedabad West
🕒 Scraped Date: 2025-08-04

🔷 Project 4
🏠 Name: N/A
🏢 BHK: 2 BHK
💰 Price: N/A
📐 Area: N/A
🗓️ Possession: N/A
📊 Status: N/A
📍 Location: Thaltej, Ahmedabad West

💡 Next Steps:
• Contact builders directly for detailed pricing and floor plans
• Schedule site visits to shortlisted properties
• Verify possession timelines and construction progress
• Compare amenities and location advantages

📞 For more information or specific requirements, reply to this email.
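A minimal sketch of how the Extract Area & City step could work as an n8n Code node. The regexes and the email field name `text` are illustrative assumptions, not taken from the workflow; real emails would need more robust parsing.

```js
// Hypothetical Code node for the "Extract Area & City" step.
// Assumes the email body arrives in $json.text; the regexes below are
// naive heuristics for illustration only.
const body = ($json.text || '').replace(/\s+/g, ' ');

// Look for an "<area>, <city>" pair after "in", e.g. "in Gota, Ahmedabad?"
const match = body.match(/in\s+([A-Za-z ]+?),\s*([A-Za-z ]+?)\s*[.?!]/);

let area = null;
let city = null;
if (match) {
  area = match[1].trim();
  city = match[2].trim();
} else {
  // Fall back to a city-level search when no specific area is mentioned;
  // the capitalized-word requirement is a rough filter for place names
  const cityMatch = body.match(/in\s+([A-Z][A-Za-z ]*?)\s*[.?!,]/);
  city = cityMatch ? cityMatch[1].trim() : null;
}

return [{ json: { area, city, searchLevel: area ? 'area' : 'city' } }];
```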
How to use

Setup Instructions

1. Import the workflow into your n8n instance
2. Configure Email Credentials:
   - Set up the email trigger for incoming property requests
   - Set up SMTP credentials for sending property listings
3. Configure Web Scraping:
   - Ensure proper headers and user agents for 99acres access
   - Set up fallback mechanisms for different property websites
4. Test the workflow with sample property search emails

Sending Property Search Requests

- Send an email to your configured property search address
- Include location details in natural language (e.g., "Gota, Ahmedabad")
- Optionally specify preferences like BHK, budget, or amenities
- Receive detailed property listings within minutes

Requirements

- **n8n instance** (cloud or self-hosted) with web scraping capabilities
- **Email account** with IMAP/SMTP access for automated communication
- **Reliable internet connection** for real-time property data scraping
- **Access to target websites** (99acres, MagicBricks, etc.)

Troubleshooting

- **No properties found**: Verify the area spelling and check whether the location has active listings
- **Scraping errors**: Update user agents and headers if websites block requests
- **Duplicate results**: Improve deduplication logic based on property names and locations (see the sketch after this list)
- **Email parsing issues**: Test with various email formats and improve the regex patterns
- **Website structure changes**: Implement fallback parsers and regularly monitor scraping success rates
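One way to implement the deduplication suggested above, sketched as an n8n Code node. The `name` and `location` field names are assumptions about what the Parse Project Listings step produces.

```js
// Deduplicate parsed listings by a normalized name + location key.
// Field names are assumed; adjust to match the parsed project entries.
const seen = new Set();
const unique = [];

for (const item of $input.all()) {
  const key = `${(item.json.name || '').toLowerCase().trim()}|` +
              `${(item.json.location || '').toLowerCase().trim()}`;
  if (!seen.has(key)) {
    seen.add(key);
    unique.push(item);
  }
}

return unique;
```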
by Rakin Jakaria
Who this is for

This workflow is for freelancers, job seekers, or service providers who want to automatically apply to businesses by scraping their website information, extracting contact details, and sending personalized job application emails with AI-powered content, all from one form submission.

What this workflow does

This workflow starts every time someone submits the Job Applier Form. It then:

- **Scrapes the target business website** to gather company information and contact details.
- **Converts HTML content** to readable markdown format for better AI processing.
- **Extracts email addresses** and creates a company summary using **GPT-5 AI**.
- **Validates email addresses** to ensure they contain proper formatting (@ symbol check); a validation sketch follows at the end of this section.
- **Accesses your experience data** from a connected **Google Sheet** with your skills and portfolio.
- **Generates personalized application emails** (subject + body) using **GPT-5**, based on the job position and company info.
- **Sends the application email** automatically via **Gmail** with your name as sender.
- **Provides confirmation** through a completion form showing the AI's response.

Setup

To set this workflow up:

1. **Form Trigger** – Customize the job application form fields (Target Business Website, an "Applying As" dropdown with positions like Video Editor, SEO Expert, etc.).
2. **OpenAI GPT-5** – Add your OpenAI API credentials for both AI models used in the workflow.
3. **Google Sheets** – Connect your sheet containing your work experience, skills, and portfolio information.
4. **Gmail Account** – Link your Gmail account for sending application emails automatically.
5. **Experience Data** – Update the Google Sheet with your relevant skills, experience, and achievements for each job type.
6. **Sender Name** – Modify the sender name in the Gmail settings (currently set to "Jamal Mia").

How to customize this workflow to your needs

- Add more job positions to the dropdown menu (currently Video Editor, SEO Expert, Full-Stack Developer, Social Media Manager).
- Modify the AI prompt to reflect your unique value proposition and application style.
- Enhance email validation with additional checks like domain verification or stricter format patterns.
- Add follow-up scheduling to automatically send reminder emails after a set period.
- Include attachment functionality to automatically attach your resume or portfolio to applications.
- Switch to different email providers or add multiple sender accounts for variety.
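A minimal sketch of the stricter email validation suggested in the customization list, written as an n8n Code node. The input field name `extractedEmail` is a hypothetical placeholder.

```js
// Extend the "@ symbol check" with a lightweight format test.
// "extractedEmail" is an assumed field name for illustration.
const email = ($json.extractedEmail || '').trim();

// Basic shape check: non-empty local part, one @, domain with a dot
const looksValid = /^[^\s@]+@[^\s@]+\.[^\s@]{2,}$/.test(email);

return [{ json: { email, looksValid } }];
```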
by Wolf Bishop
A reliable, no-frills web scraper that extracts content directly from websites using their sitemaps. Perfect for content audits, migrations, and research when you need straightforward HTML extraction without external dependencies.

How It Works

This streamlined workflow takes a practical approach to web scraping by leveraging XML sitemaps and direct HTTP requests. Here's how it delivers consistent results:

- **Direct Sitemap Processing**: The workflow starts by fetching your target website's XML sitemap and parsing it to extract all available page URLs. This eliminates guesswork and ensures comprehensive coverage of the site's content structure.
- **Robust HTTP Scraping**: Each page is scraped using direct HTTP requests with realistic browser headers that mimic legitimate web traffic. The scraper includes comprehensive error handling and timeout protection to handle various website configurations gracefully.
- **Intelligent Content Extraction**: The workflow uses JavaScript parsing to extract meaningful content from raw HTML. It identifies page titles through multiple methods (title tags, Open Graph metadata, H1 headers) and converts the HTML structure into readable text; a sketch of this fallback logic follows the setup steps.
- **Framework Detection**: Built-in detection identifies whether sites use WordPress, Divi themes, or heavy JavaScript frameworks. This helps explain content extraction quality and provides insight into the site's technical architecture.
- **Rich Metadata Collection**: Each scraped page includes metadata such as word count, HTML size, response codes, and technical indicators. This data is formatted into markdown files with YAML frontmatter for easy analysis and organization.
- **Respectful Rate Limiting**: The workflow includes a 3-second delay between page requests to respect server resources and avoid overwhelming target websites. Processing is sequential and controlled to maintain ethical scraping practices.
- **Detailed Success Reporting**: Every scraped page generates a report showing extraction success, potential issues (such as JavaScript dependencies), and technical details about the site's structure and framework.

Setup Steps

1. Configure Google Drive Integration
   - Connect your Google Drive account in the "Save to Google Drive" node
   - Replace YOUR_GOOGLE_DRIVE_CREDENTIAL_ID with your actual Google Drive credential ID
   - Create a dedicated folder for your scraped content in Google Drive
   - Copy the folder ID from the Google Drive URL (the long string after /folders/)
   - Replace YOUR_GOOGLE_DRIVE_FOLDER_ID_HERE with your actual folder ID in both the folderId field and cachedResultUrl
   - Update YOUR_FOLDER_NAME_HERE with your folder's actual name
2. Set Your Target Website
   - In the "Set Sitemap URL" node, replace https://yourwebsitehere.com/page-sitemap.xml with your target website's sitemap URL
   - Common sitemap locations include /sitemap.xml, /page-sitemap.xml, or /sitemap_index.xml
   - Tip: Not sure where your sitemap is?
     Use a free online tool like https://seomator.com/sitemap-finder
   - Verify the sitemap URL loads correctly in your browser before running the workflow
3. Update Workflow IDs (Automatic)
   - When you import this workflow, n8n will automatically generate new IDs for YOUR_WORKFLOW_ID_HERE, YOUR_VERSION_ID_HERE, YOUR_INSTANCE_ID_HERE, and YOUR_WEBHOOK_ID_HERE
   - No manual changes are needed for these placeholders
4. Adjust Processing Limits (Optional)
   - The "Limit URLs (Optional)" node is disabled by default for full-site scraping
   - Enable this node and set a small number (like 5-10) for initial testing
   - For large websites, consider running in batches to manage processing time and storage
5. Customize Rate Limiting (Optional)
   - The "Wait Between Pages" node is set to 3 seconds by default
   - Increase the delay for more respectful scraping of busy sites
   - Decrease it only if you have permission and the target site can handle faster requests
6. Test Your Configuration
   - Enable the "Limit URLs (Optional)" node and set it to 3-5 pages for testing
   - Click "Test workflow" to verify the setup works correctly
   - Check your Google Drive folder to confirm files are being created with proper content
   - Review the generated markdown files to assess content extraction quality
7. Run Full Extraction
   - Disable the "Limit URLs (Optional)" node for complete site scraping
   - Execute the workflow and monitor the execution log for errors
   - Large websites may take considerable time to process completely (plan for several hours for sites with hundreds of pages)
8. Review Results
   - Each generated file includes technical metadata to help you assess extraction quality
   - Look for indicators like "Limited Content" warnings on JavaScript-heavy pages
   - Files include word counts and framework detection to help you understand the site's structure

Framework Compatibility: This scraper is designed to work well with WordPress sites, Divi themes, and many JavaScript-heavy frameworks. The content extraction handles dynamic content effectively and provides detailed feedback about framework detection. Some single-page applications (SPAs) that render entirely through JavaScript may yield limited content, but most modern websites, including those built with popular CMS platforms, will work well with this scraper.

Important Notes: Always ensure you have permission to scrape your target website and respect its robots.txt guidelines. The workflow includes respectful delays and error handling, but monitor your usage to maintain ethical scraping practices.
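A sketch of the multi-method title extraction described in How It Works: try the `<title>` tag first, then Open Graph metadata, then the first H1. The regex-based parsing and the `html` field name are illustrative assumptions; the actual workflow's node may differ.

```js
// Fallback title extraction for an n8n Code node receiving raw HTML
// in $json.html. Regex parsing is a rough heuristic, not a full parser.
const html = $json.html || '';

function firstMatch(re) {
  const m = html.match(re);
  return m ? m[1].trim() : null;
}

const title =
  firstMatch(/<title[^>]*>([^<]+)<\/title>/i) ||
  firstMatch(/<meta[^>]+property=["']og:title["'][^>]+content=["']([^"']+)["']/i) ||
  firstMatch(/<h1[^>]*>([^<]+)<\/h1>/i) ||
  'Untitled Page';

return [{ json: { title } }];
```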
by Marth
How It Works (Workflow Stages) ⚙️

This system transforms manual, repetitive tasks into a smooth, automated content distribution pipeline:

1. **Content Submission & Trigger**: You add a new row to your designated Google Sheet with all the content details (Title, URL, Short_Description, Image_URL, Hashtags, and boolean flags for which platforms to post to). The Google Sheets Trigger node immediately detects this new entry and initiates the workflow.
2. **Content Preparation**: The Set node takes the raw data from your Google Sheet and formats it into a cohesive text string (social_media_text_core) suitable for posting across different social media platforms.
3. **Conditional Social Media Posting**: A series of If nodes (Check Facebook Post, Check Twitter Post, Check LinkedIn Post) sequentially check your preferences (based on the Post_to_Facebook, Post_to_Twitter, Post_to_LinkedIn columns in your sheet). If a platform is marked TRUE, the corresponding social media node (Facebook, Twitter, LinkedIn) publishes your content. If FALSE, that platform is skipped and the workflow moves to the next check.
4. **Status Update & Notification**: After attempting to post to all selected platforms, the Google Sheets (Update) node sets the Publication_Status column of your original row to "Published". This prevents re-posting and provides a clear record. Finally, the Slack (Notification) node sends an alert to your chosen Slack channel confirming that the content has been distributed.

Setup Steps 🛠️ (Build It Yourself!)

Follow these detailed steps to build and implement this workflow in your n8n instance:

1. Prepare Your Google Sheet
   - Create a new Google Sheet (e.g., named "Social Media Posts").
   - Set up the following exact column headers in the first row: Title, URL, Short_Description, Image_URL, Hashtags, Post_to_Facebook, Post_to_Twitter, Post_to_LinkedIn, Publication_Status
   - Fill in a test row with sample data, using TRUE/FALSE values for the posting flags.
2. Gather Your API Keys & Credentials
   - Google Sheets Credential: an OAuth2 credential for Google Sheets in n8n with read/write access to your sheet.
   - Facebook Credential: an OAuth2 credential for Facebook with permission to post to your selected Page.
   - Twitter Credential: a Twitter API credential (API Key, API Secret, Access Token, Access Token Secret) from your Twitter Developer App.
   - LinkedIn Credential: an OAuth2 credential for LinkedIn with permission to share updates to your profile or organization page.
   - Slack Credential: a Slack API token (Bot User OAuth Token) for sending messages to your channel.
3. Build the n8n Workflow Manually (10 Nodes)
   Start a new workflow in n8n, then drag, configure, and connect each of the following nodes as described (a Code-node alternative to the Set node is sketched after this list):
   - Google Sheets Trigger
     - Name: Google Sheets Trigger
     - Authentication: select your Google Sheets credential
     - Spreadsheet ID: [copy the ID from your Google Sheet's URL]
     - Sheet Name: [your sheet name, e.g., 'Sheet1' or 'Content']
     - Watch For: Rows; Events: Added
     - Connections: output to Set Content Parameters
   - Set
     - Name: Set Content Parameters
     - Values to Set: add a String value named social_media_text_core with the value ={{ $json.Title }} - {{ $json.Short_Description }}\nRead more: {{ $json.URL }}\n{{ $json.Hashtags }}
     - Connections: output to Check Facebook Post
   - If
     - Name: Check Facebook Post
     - Value 1: ={{ $json.Post_to_Facebook }}; Operation: is true
     - Connections: true output to Post Facebook Message; false output to Check Twitter Post
   - Facebook
     - Name: Post Facebook Message
     - Authentication: select your Facebook credential
     - Page ID: [YOUR_FACEBOOK_PAGE_ID]
     - Message: ={{ $json.social_media_text_core }}; Link: ={{ $json.URL }}; Picture: ={{ $json.Image_URL }}
     - Options: Published (checked)
     - Connections: output to Check Twitter Post
   - If
     - Name: Check Twitter Post
     - Value 1: ={{ $json.Post_to_Twitter }}; Operation: is true
     - Connections: true output to Create Tweet; false output to Check LinkedIn Post
   - Twitter
     - Name: Create Tweet
     - Authentication: select your Twitter credential
     - Tweet: ={{ $json.social_media_text_core }}; Image URL: ={{ $json.Image_URL }}
     - Connections: output to Check LinkedIn Post
   - If
     - Name: Check LinkedIn Post
     - Value 1: ={{ $json.Post_to_LinkedIn }}; Operation: is true
     - Connections: true output to Share LinkedIn Update; false output to Update Publication Status
   - LinkedIn
     - Name: Share LinkedIn Update
     - Authentication: select your LinkedIn credential
     - Resource: Share; Update Type: Organization or Personal (choose as appropriate)
     - Organization ID: [YOUR_LINKEDIN_ORG_ID] (if the Organization type is selected)
     - Content: ={{ $json.social_media_text_core }}; Content URL: ={{ $json.URL }}; Image URL: ={{ $json.Image_URL }}
     - Connections: output to Update Publication Status
   - Google Sheets
     - Name: Update Publication Status
     - Authentication: select your Google Sheets credential
     - Spreadsheet ID: [YOUR_GOOGLE_SHEET_CONTENT_ID]
     - Sheet Name: [your sheet name, e.g., 'Sheet1' or 'Content']
     - Operation: Update Row; Key Column: URL; Key Value: ={{ $json.URL }}
     - Values: set column Publication_Status to Published
     - Connections: receives input from both Share LinkedIn Update and the false branch of Check LinkedIn Post; output to Send Slack Notification
   - Slack
     - Name: Send Slack Notification
     - Authentication: select your Slack credential
     - Chat ID: [YOUR_SLACK_CHANNEL_ID]
     - Text: New content "{{ $json.Title }}" successfully published to social media! 🎉 Check: {{ $json.URL }}
     - Connections: final node; receives input from Update Publication Status
4. Final Steps & Activation
   - Test the Workflow: before activating, manually add a new row to your Google Sheet or use n8n's "Execute Workflow" button (if available for triggers). Observe the flow through each node to confirm it behaves as expected and posts to your social media accounts.
   - Activate Workflow: once you are confident it works correctly, toggle the workflow to "Active" in the top-right corner of the n8n canvas.
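As an optional variation, the Set Content Parameters expression above assumes every sheet field is filled. A Code node could build the same string while skipping empty fields; this is a sketch, not part of the 10-node build.

```js
// Alternative to "Set Content Parameters": build social_media_text_core
// while omitting missing fields. Field names match the sheet columns.
const { Title, Short_Description, URL, Hashtags } = $json;

const parts = [];
if (Title || Short_Description) {
  parts.push([Title, Short_Description].filter(Boolean).join(' - '));
}
if (URL) parts.push(`Read more: ${URL}`);
if (Hashtags) parts.push(Hashtags);

return [{ json: { ...$json, social_media_text_core: parts.join('\n') } }];
```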
by WeblineIndia
📊 Generate Weekly Energy Consumption Reports with API, Email and Google Drive

This workflow automates the process of retrieving energy consumption data, formatting it into a CSV report, and distributing it every week via email and Google Drive.

⚡ Quick Implementation Steps

1. Import the workflow into your n8n instance.
2. Configure your API, email details, and Google Drive folder.
3. (Optional) Adjust the CRON schedule if you need a different time or frequency.
4. Activate the workflow; automated weekly reports begin immediately.

🎯 Who's It For

Energy providers, sustainability departments, facility managers, and renewable energy operators.

🛠 Requirements

- n8n instance
- Energy Consumption API access
- Google Drive account
- Email SMTP access

⚙️ How It Works

The workflow triggers every Monday at 8 AM, fetches consumption data, emails the CSV report, and saves a copy to Google Drive.

🔄 Workflow Steps

1. Schedule Weekly (Mon 8:00 AM)
   - Type: Cron Node
   - Runs every Monday at 8:00 AM and triggers the workflow execution automatically.
2. Fetch Energy Data
   - Type: HTTP Request Node
   - Makes a GET request to https://api.energidataservice.dk/dataset/ConsumptionDE35Hour (sample API).
   - The API returns JSON data with hourly electricity consumption in Denmark.
   - Sample response structure:

{
  "records": [
    {
      "HourDK": "2025-08-25T01:00:00",
      "MunicipalityNo": _,
      "MunicipalityName": "Copenhagen",
      "ConsumptionkWh": 12345.67
    }
  ]
}

3. Normalize Records
   - Type: Code Node
   - Extracts the records array from the API response and maps each entry into a separate JSON item for easier handling downstream (a variant that also trims the data to the reporting window appears after this section).
   - Code used:

const itemlist = $input.first().json.records;
return itemlist.map(r => ({ json: r }));

4. Convert to File
   - Type: Convert to File Node
   - Converts the array of JSON records into a CSV file, stored in a binary field called data.
5. Send Email Weekly Report
   - Type: Email Send Node
   - Sends the generated CSV file as an attachment.
   - Parameters: fromEmail (sender address, configure in node), toEmail (recipient address), subject ("Weekly Energy Consumption Report"), attachments (=data, the binary data from the previous node).
6. Report File Upload to Google Drive
   - Type: Google Drive Node
   - Uploads the CSV file to your Google Drive root folder.
   - Filename pattern: energy_report_{{ $now.format('yyyy_MM_dd_HH_mm_ss') }}
   - Requires valid Google Drive OAuth2 credentials.

✨ How To Customize

Change the report frequency, email template, or data format (CSV/Excel), or add extensions.

➕ Add-ons

- Integration with analytics tools (Power BI, Tableau)
- Additional reporting formats (Excel, PDF)
- Slack notifications

🚦 Use Case Examples

- Automated weekly/monthly reporting for compliance
- Historical consumption tracking
- Operational analytics and forecasting

🔍 Troubleshooting Guide

| Issue | Cause | Solution |
|-------|-------|----------|
| Data not fetched | API endpoint incorrect | Verify URL |
| Email delivery issues | SMTP configuration incorrect | Verify SMTP |
| Drive save fails | Permissions/Drive ID incorrect | Check Drive permissions |

📞 Need Assistance?

Contact WeblineIndia for additional customization and support; we're happy to help.
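A sketch extending the Normalize Records step so the weekly report covers only the reporting window. The seven-day cutoff is an assumption for illustration, not part of the original workflow.

```js
// Keep only the last seven days of hourly records before normalizing.
// Assumes the API response shape shown above (records[].HourDK).
const records = $input.first().json.records || [];
const cutoff = Date.now() - 7 * 24 * 60 * 60 * 1000;

const recent = records.filter(r => new Date(r.HourDK).getTime() >= cutoff);

return recent.map(r => ({ json: r }));
```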
by Khairul Muhtadin
Stop wasting hours watching long videos. This n8n workflow acts as your personal "TL;DW" (Too Long; Didn't Watch) assistant. It automatically pulls YouTube transcripts using Decodo, analyzes them with Google Gemini, and sends a detailed summary straight to your Telegram.

Why You Need This

- **Save Time:** Turn a 2-hour video into a 5-minute read (95% faster).
- **Don't Miss a Thing:** Captures key points, chapters, tools mentioned, and quotes that you might miss while skimming.
- **Instant Results:** Get a structured summary in Telegram within 30-60 seconds.
- **Multi-Language:** Works with any video language that has YouTube captions.

Who Is This For?

- **Creators & Marketers:** Spy on competitor strategies and extract tools without watching endless footage.
- **Students:** Turn lecture recordings into instant study notes.
- **Busy Pros:** Digest conference talks and webinars on the go.

How It Works

1. **Send Link:** You message a YouTube link to your Telegram bot.
2. **Scrape:** The bot uses the Decodo API to grab the video transcript and metadata (views, chapters, etc.).
3. **Analyze:** Google Gemini reads the text and writes a structured summary (overview, takeaways, tools).
4. **Deliver:** You receive the formatted summary in chat.

Setup Guide

What You Need

- **n8n instance** (to run the workflow)
- **Telegram Bot Token** (free via @BotFather)
- **Decodo Scraper API Key** (for YouTube data - Get it here)
- **Google Gemini API Key** (for the AI - Get it here)

Quick Installation

1. **Import:** Load the JSON file into your n8n instance.
2. **Credentials:** Add your API keys for Telegram, Decodo, and Google Gemini in the n8n credentials section.
3. **Configure:** In the "Alert Admin" node, set the chatId to your Telegram User ID (find it via @userinfobot).
4. **(Optional)** Change the languageCode in the Config node if you want non-English transcripts.
5. **Test:** Send a YouTube link to your bot. You should see a "Processing..." message followed by your summary!

Troubleshooting & Tips

- **"Not a YouTube URL":** Make sure you are sending a standard youtube.com or youtu.be link (a URL-check sketch appears after this section).
- **No Transcript:** The video must have captions (auto-generated or manual) for this to work.
- **Customization:** You can edit the AI prompt in the "Generate TLDR" node to change how the summary looks (e.g., "Make it funny" or "Focus on technical details").

Created by: Khaisa Studio
Category: AI-Powered Automation
Tags: YouTube, AI, Telegram, Summarization, Decodo, Gemini

Need custom workflows? Contact us
Connect with the creator: Portfolio • Workflows • LinkedIn • Medium • Threads
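A minimal sketch of the URL check behind the "Not a YouTube URL" error, written as an n8n Code node. The `message.text` field follows Telegram's update shape; the accepted URL patterns are illustrative.

```js
// Accept youtube.com, youtu.be, and Shorts links; extract the video ID.
const text = ($json.message?.text || '').trim();

const patterns = [
  /youtube\.com\/watch\?v=([\w-]{11})/,
  /youtu\.be\/([\w-]{11})/,
  /youtube\.com\/shorts\/([\w-]{11})/,
];

let videoId = null;
for (const re of patterns) {
  const m = text.match(re);
  if (m) { videoId = m[1]; break; }
}

return [{ json: { videoId, isYouTubeUrl: videoId !== null } }];
```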
by Cuong Nguyen
Why I built this (The Problem)

Most expense tracker apps (like Money Lover, Spendee, or Wallet) have a common friction point: data entry. You have to unlock your phone, find the app, wait for it to load, navigate menus, and manually select categories. It's tedious, so we often forget to log small expenses.

I wanted a solution that lives where I already spend my time: Telegram. This workflow allows you to log expenses in seconds: just send a text, a voice note while driving, or a photo of a receipt. No UI navigation required.

Comparison: This Workflow vs. Traditional Apps

|Feature|Traditional Expense Apps|This n8n Workflow|
|-|-|-|
|Data Ownership|Data is locked in their proprietary database.|100% yours. It lives in your Google Sheet. You can export, pivot, or connect it to Looker Studio.|
|Input Speed|Slow. Requires multiple taps/clicks.|Instant. Send a text/voice/photo to a Telegram bot.|
|Flexibility|Rigid categories & logic.|Infinite. You can customize the AI prompt, categories, and currency logic.|
|Cost|Often requires monthly subscriptions for premium features.|Low cost. Runs on your n8n instance + Gemini Flash API (currently free/very cheap).|
|UI/UX|Beautiful, pre-built mobile dashboards.|Raw data. You view data in Google Sheets (though you can build a dashboard there).|

Key Features

- **Multi-Modal Input:** Just send what you have.
  - Text: "Lunch 50k, Taxi 30k" (splits into 2 rows).
  - Voice: Speak naturally; the AI transcribes and extracts data.
  - Photo: OCRs receipts and parses details.
- **Global Currency Support:** Uses Gemini AI to intelligently detect the currency. You can set a default currency (e.g., USD, VND) in the Config node.
- **Smart Extraction & Categorization:**
  - Automatically splits multiple items in one message (e.g., "Lunch 20k and Grab 50k" → 2 separate rows).
  - The AI automatically assigns categories (Food, Transport, Bills, etc.) based on the item name.
- **Budget Management:** Use the command /add budget 500 to instantly top up your monthly budget.
- **"Quiet" Reporting:** Instead of spamming you after every message, the system waits for **30 minutes of inactivity** before sending a daily summary report (debounce logic; a sketch follows this section).

Setup Instructions

1. Prerequisites
   - **Google Sheet:** You MUST make a copy of this template: [Google Sheet Template here](https://docs.google.com/spreadsheets/d/1bVdxslvMkTA1DDQv6kQ5j-COC8ZOH8AfEWmn31Rq6l4/edit?usp=sharing)
   - **n8n Data Table:** This workflow requires a Data Table named ReportTokens for the reporting feature. Please read the setup guide below.
   - **Setup Guide:** AI Expense Tracker
2. Configure the Workflow
   - **Credentials:** Connect **Telegram**, **Google Sheets**, and **Google Gemini (PaLM)**.
   - **Config Node:** Open the CONFIG - User Settings node and update these fields:
     - spreadsheet_id: the ID of your copied Google Sheet
     - sheet_gid_dashboard: the ID of your Dashboard sheet
     - sheet_gid_budget: the ID of your Budget_Topups sheet
     - currency_code: your currency code (e.g., USD, EUR, VND)
     - currency_symbol: your currency symbol (e.g., $, €, ₫)
     - locale: your locale for number formatting (e.g., en-US, vi-VN)
   - **Data Table:** Create a table in n8n with columns chat_id, report_token, updated_at (all type String). Link this table to the relevant nodes in the workflow.
3. Usage
   - **Log Expense:** Send "Coffee $5" or a photo.
   - **Add Budget:** Send the command /add budget 1000

Need Help or Want to Customize This?

Contact me for consulting and support: Email: cuongnguyen@aiops.vn
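A sketch of how the 30-minute inactivity debounce could be decided, assuming the ReportTokens Data Table columns described above (chat_id, report_token, updated_at, all strings). How the workflow actually reads and compares these values is internal to the template; this only illustrates the idea.

```js
// Decide whether the chat has been quiet long enough to send the
// daily summary. updated_at is assumed to hold the timestamp of the
// last user message as a parseable string.
const THIRTY_MIN = 30 * 60 * 1000;

const lastActivity = new Date($json.updated_at).getTime();
const quietLongEnough = Date.now() - lastActivity >= THIRTY_MIN;

return [{ json: { chat_id: $json.chat_id, sendReport: quietLongEnough } }];
```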
by Vasyl Pavlyuchok
What this template does

This workflow turns the arXiv AI feed into a daily research assistant. Every morning it fetches the latest Artificial Intelligence papers from arXiv.org, deduplicates them, stores one page per paper in Notion (with metadata + PDF link), generates a Deep Research Summary of each PDF using Gemini, and finally posts a short update to a Telegram channel with links to both the paper and the Notion summary.

Who is this for & what problem it solves

This template is designed for founders, researchers, builders, and curious professionals who want to stay up to date with AI research without reading every paper in full. It solves the "information overload" problem: instead of manually checking arXiv and skimming PDFs, you get a curated daily feed with human-style explanations stored in your own Notion database.

Use Cases

- Daily AI research digest for solo founders or small teams.
- Private "AI research hub" in Notion for your company or lab.
- Telegram channel that shares the latest AI papers plus plain-English summaries.
- Personal learning pipeline: track, tag, and revisit important papers over time.

How it works (workflow overview)

1. A scheduled trigger runs every day at 08:00.
2. An HTTP Request pulls the latest AI papers from arXiv's API.
3. Results are converted from XML to JSON and cleaned.
4. A time-window filter keeps only recent papers and removes duplicates (a sketch follows this section).
5. For each paper, a Notion page is created (metadata + PDF URL).
6. Gemini reads the PDF and returns a structured, multi-chunk summary.
7. Each chunk is appended to the same Notion page as rich-text blocks.
8. A Telegram message is sent with the title, short abstract, PDF link, and Notion link.

Setup (step-by-step)

1. Create a Notion database and connect your Notion integration.
2. Map the properties in the Register to Notion Database node (title, arxiv_id, abstract, authors, categories/tags, published date, pdf URL).
3. Add your Gemini API key and model in the Analyze doc (Prompt Ultra-Pro) node.
4. Add your Telegram bot token and chat_id in the Send a text message node.
5. (Optional) Adjust the arXiv query in the HTTP Request node to focus on your preferred AI categories or keywords.
6. Enable the Scheduled Daily Trigger when you're ready to run it in production.

Customization options

- Change the arXiv search query (keywords, categories, max_results).
- Modify the time-window logic (e.g., 24h, 48h, or no filter).
- Adapt the Notion properties to your own schema (status, tags, priority, etc.).
- Switch the messaging channel (Telegram, Discord, Slack) using similar nodes.
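A sketch of step 4 (time-window filter plus deduplication) as an n8n Code node. The `published` and `arxiv_id` field names are assumptions about the shape after the XML-to-JSON conversion, and the 24-hour window mirrors the default mentioned under customization options.

```js
// Keep papers published within the last 24 hours, dropping duplicates
// by arXiv ID. Field names are assumed; adjust to the converted items.
const WINDOW_MS = 24 * 60 * 60 * 1000;
const cutoff = Date.now() - WINDOW_MS;

const seen = new Set();
const fresh = [];

for (const item of $input.all()) {
  const publishedAt = new Date(item.json.published).getTime();
  if (publishedAt < cutoff) continue;          // outside the time window
  if (seen.has(item.json.arxiv_id)) continue;  // duplicate paper
  seen.add(item.json.arxiv_id);
  fresh.push(item);
}

return fresh;
```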
by David Olusola
📉 Buy the Dip Alert (Telegram/Slack/SMS)

📌 Overview

This workflow automatically notifies you when Bitcoin or Ethereum drops more than a set percentage in the last 24 hours. It's ideal for traders who want to stay ready for buy-the-dip opportunities without constantly refreshing charts.

⚙️ How it works

1. Schedule Trigger — runs every 30 minutes (adjustable).
2. HTTP Request (CoinGecko) — fetches BTC & ETH prices and 24h % change.
3. Code Node ("Dip Check") — compares changes against your dip threshold (a sketch follows this section).
4. IF Node — continues only if the dip condition is true.
5. Notification Node — sends the alert via Telegram, Slack, or SMS (Twilio).

Example output: Dip Alert — BTC –3.2%, ETH –2.8%. Not financial advice.

🛠 Setup Guide

1. Dip threshold
   - Open the Code node and change the line:

const DIP = -2.5; // trigger if 24h drop <= -2.5%

   - Set your preferred dip value (e.g., –5 for a 5% drop).
2. Choose your alert channel
   - Telegram: add your bot token & chat ID.
   - Slack: connect the Slack API & set the channel name.
   - Twilio: configure SID, token, and from/to numbers.
3. Test
   - Temporarily set DIP to 0 to force an alert.
   - Run once from the Code node → confirm the alert message text.
   - Execute the Notification node → confirm delivery to your channel.

🎛 Customization

- Cadence: change the Schedule Trigger (every 5m, 15m, hourly, etc.).
- Coins: extend the CoinGecko call (add solana, bnb) and update the Code node logic.
- Multiple alerts: duplicate the IF → Notification branch for different thresholds (minor vs. major dip).
- Combine with the "Threshold Alerts" workflow to cover both upside breakouts and downside dips.
- Storage: log alerts into Google Sheets for tracking dip history.

🧩 Troubleshooting

- No alerts firing: check the CoinGecko API response in Execution Data.
- Wrong %: CoinGecko returns usd_24h_change directly — no math needed.
- Duplicate alerts: add a debounce using a Sheet/DB to store the last fired time.
- Telegram not posting: confirm the bot has access to your channel/group.
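A sketch of what the full "Dip Check" Code node could look like, assuming the HTTP Request node calls CoinGecko's /simple/price endpoint with include_24hr_change=true (which returns usd and usd_24h_change per coin). The exact node contents in the template may differ.

```js
// Flag any tracked coin whose 24h change is at or below the threshold.
const DIP = -2.5; // trigger if 24h drop <= -2.5%
const data = $input.first().json; // e.g. { bitcoin: { usd: ..., usd_24h_change: ... }, ... }

const dips = [];
for (const [coin, info] of Object.entries(data)) {
  if (typeof info.usd_24h_change === 'number' && info.usd_24h_change <= DIP) {
    dips.push(`${coin.toUpperCase()} ${info.usd_24h_change.toFixed(1)}%`);
  }
}

return [{
  json: {
    dipDetected: dips.length > 0,
    message: dips.length
      ? `Dip Alert — ${dips.join(', ')}. Not financial advice.`
      : '',
  },
}];
```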
by Trung Tran
Beginner's Tutorial: Manage Azure Storage Account Container & Blob with n8n

> This beginner-friendly n8n workflow shows you how to generate AI images using OpenAI, store them in Azure Blob Storage, and manage blob containers, all with zero code.

👤 Who's it for

This workflow is perfect for:

- **Beginners** learning Azure + OpenAI integration
- **No-code developers** experimenting with image generation
- **Cloud learners** who want hands-on Blob Storage use cases
- Anyone who wants to automate storing AI-generated content in the cloud

⚙️ How it works / What it does

1. 🖱️ Trigger the workflow manually using the Execute Workflow node.
2. ✏️ Use the Edit Fields node to input containerName (e.g., demo-images) and imageIdea (e.g., "a robot holding a coffee cup").
3. 📦 Create a new Azure Blob container (Create container).
4. 🤖 Use an OpenAI-powered Prompt Generation Agent to craft the perfect image prompt.
5. 🎨 Generate an image using OpenAI's DALL·E model.
6. ☁️ Upload the generated image to Azure Blob Storage (Create Blob).
7. 📂 List blobs in the container (Get many blobs).
8. 🧹 Delete any blob as needed (Delete Blob).
9. (Optional) 🗑️ Remove the entire container (Delete container).

🔧 How to set up

1. 🧠 Set up OpenAI
   - Create an OpenAI account and get your API key.
   - In n8n, go to Credentials → OpenAI and paste your key.
2. 🪣 Set up Azure Blob Storage
   - Log in to your Azure Portal.
   - Create a Storage Account (e.g., mystorageaccount).
   - Go to the Access Keys tab and copy the Storage Account Name and Key1.
   - In n8n, create a new Azure Blob Storage credential using Account Name = your storage account name and Access Key = the key1 value.

> 📝 This demo uses Access Key authentication. You can also configure Shared Access Signatures (SAS) or OAuth in production setups.

3. Run the Workflow
   - Enter your image idea and container name.
   - Click "Execute Workflow" to test it.

📋 Requirements

| Requirement | Description |
|------------------------|--------------------------------------------------|
| Azure Storage Account | With container-level read/write access |
| OpenAI API Key | For image and prompt generation |
| n8n Version | v1.0+ recommended |
| Image Credits | OpenAI charges tokens for DALL·E image creation |

🛠️ How to customize the workflow

- 🧠 Adjust Prompt Generation: update the Prompt Agent to include a specific style (3D, anime, cyberpunk), brand elements, or multiple language options.
- 📁 Organize by Date/User: modify containerName to auto-include the date (e.g., images-2025-08-20) or a username/session ID (see the sketch after this list).
- 📤 Send Image Output: add Slack, Telegram, or Email nodes to deliver the image, or create public links using Azure's blob permissions.
- 🔁 Cleanup Logic: auto-delete blobs after X days, or add versioning and backup logic.
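A sketch of the date-based container naming customization as an n8n Code node. The `user` input field is hypothetical; the sanitization reflects Azure's container-name rules (lowercase letters, numbers, and hyphens, 3-63 characters).

```js
// Build a containerName like "images-<user>-<date>" that satisfies
// Azure's naming constraints. "user" is an assumed input field.
const datePart = new Date().toISOString().slice(0, 10); // e.g., 2025-08-20
const user = ($json.user || 'shared').toLowerCase().replace(/[^a-z0-9-]/g, '-');

const containerName = `images-${user}-${datePart}`.slice(0, 63);

return [{ json: { ...$json, containerName } }];
```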
by Sergey Skorobogatov
GiggleGPTBot — Witty Telegram Bot with AI & Postgres

📝 Overview

GiggleGPTBot is a witty Telegram bot built with n8n, OpenRouter, and Postgres. It delivers short jokes, motivational one-liners, and playful roasts, responds to mentions, and posts scheduled witty content. The workflow also tracks user activity and provides lightweight statistics and leaderboards.

✨ Features

- 🤖 AI-powered humor engine — replies with jokes, motivation, random witty lines, or sarcastic roasts.
- 💬 Command support — /joke, /inspire, /random, /roast, /help, /stats, /top.
- 🎯 Mention detection — replies when users tag @GiggleGPTBot (a sketch follows this section).
- ⏰ Scheduled posts — morning jokes, daily motivation, and random wisdom at configured times.
- 📊 User analytics — counts messages, commands, and reactions, and generates leaderboards.
- 🗄️ Postgres persistence — robust schema with tables for messages, responses, stats, and schedules.

🛠️ How It Works

Triggers

- Telegram Trigger — receives all messages and commands from a chat.
- Schedule Trigger — runs hourly to check for planned posts.

Processing

- Switch routes commands (/joke, /inspire, /random, /roast, /help, /stats, /top).
- Chat history fetches the latest context.
- Mention Analysis determines whether the bot was mentioned.
- Generating an information response builds replies for /help, /stats, /top.
- AI nodes (AI response to command, AI response to mention, AI post generation) craft witty content via OpenRouter.

Persistence

- Init Database ensures tables exist (user_messages, bot_responses, bot_commands, message_reactions, scheduled_posts, user_stats).
- Logging nodes update stats and store every bot/user interaction.

Delivery

- Replies are sent back via Telegram Send nodes (Send AI response, Send info reply, Reply to Mention, Submit scheduled post).

⚙️ Setup Instructions

1. Create a Telegram bot with @BotFather and get your API token.
2. Add credentials in n8n:
   - Telegram API (your bot token)
   - OpenRouter (API key from openrouter.ai)
   - Postgres (use your own DB; Supabase works well)
3. Run the Init Database node once to create all required tables.
4. (Optional) Seed the schedule with the Adding a schedule node — it inserts:
   - Morning joke at 06:00
   - Daily motivation at 09:00
   - Random wisdom at 17:00
   (Adjust chat_id to your group/channel ID.)
5. Activate the workflow and connect the Telegram webhook or polling.

📊 Database Schema

- **user_messages** — stores user chat messages.
- **bot_responses** — saves bot replies.
- **bot_commands** — logs command usage.
- **message_reactions** — tracks reactions.
- **scheduled_posts** — holds scheduled jokes/wisdom/motivation.
- **user_stats** — aggregates per-user message/command counts and activity.

🔑 Example Commands

- /joke → witty one-liner with light irony.
- /inspire → short motivational phrase.
- /random → unexpected witty remark.
- /roast → sarcastic roast (no offensive targeting).
- /stats → shows your personal stats.
- /top → displays the leaderboard.
- /help → lists available commands.
- @GiggleGPTBot + message → the bot replies in context.

🚀 Customization Ideas

- Add new command categories (/quote, /fact, /news).
- Expand analytics with reaction counts or streaks.
- Localize prompts into multiple languages.
- Adjust the CRON schedules for posts.

✅ Requirements

- Telegram Bot token
- OpenRouter API key
- Postgres database

📦 Import this workflow, configure credentials, run the DB initializer — and your witty AI-powered Telegram companion is ready!
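A sketch of the Mention Analysis step as an n8n Code node. The `message.text` and `message.entities` fields follow Telegram's Bot API update shape; the routing keys are illustrative, not the template's actual values.

```js
// Classify an incoming Telegram message as a command, a bot mention,
// or plain chat to be logged for stats.
const msg = $json.message || {};
const text = msg.text || '';

const isCommand = text.startsWith('/');
const isMention = (msg.entities || []).some(e =>
  e.type === 'mention' &&
  text.slice(e.offset, e.offset + e.length).toLowerCase() === '@gigglegptbot'
);

return [{
  json: {
    chat_id: msg.chat?.id,
    text,
    route: isCommand ? 'command' : isMention ? 'mention' : 'log_only',
  },
}];
```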
by NAZIA AI ACADEMY
How it works

This workflow lets users generate AI images directly from Telegram messages using:

- Google Gemini API – to convert text into detailed, high-quality image prompts.
- Pollinations API – to generate free AI images from the prompts (a request sketch appears after this section).
- Telegram Bot – to interact with users and return generated images instantly.

It's fully automated using n8n — from text message to stunning image, all in one flow. Perfect for creators, content marketers, or anyone wanting quick visuals on the go.

Set up steps

🧩 Estimated setup time: ~10-15 minutes

1. Create a Telegram bot via @BotFather, copy your token, and set up the Telegram Trigger node in n8n with your credentials.
2. Set up the Google Gemini API via Google AI Studio or Cloud Console. Make sure your API key is added in the credentials section of the Gemini node.
3. Customize the prompt structure or image size in the Fields - Set Values or Prompt Agent node.
4. (Optional) Enable Save to Disk if you want to keep a local copy of every image.
5. Deploy and run the workflow — done 🎉

🛠️ All technical details and logic are fully documented inside the workflow using sticky notes.

⚠️ Requirements

- n8n (self-hosted or Cloud)
- Telegram Bot Token
- Google Gemini API key (with billing enabled — includes some free usage)
- No key needed for the Pollinations API — it's 100% free 🆓
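A sketch of how the image request could be assembled. Pollinations serves images from a GET endpoint keyed by the URL-encoded prompt; the width/height query parameters and the `generatedPrompt` field name are assumptions tied to the "image size" customization mentioned above.

```js
// Build the Pollinations image URL from the Gemini-generated prompt.
const prompt = $json.generatedPrompt || 'a robot holding a coffee cup';

const imageUrl =
  'https://image.pollinations.ai/prompt/' +
  encodeURIComponent(prompt) +
  '?width=1024&height=1024'; // size parameters are an assumption

// Downstream, an HTTP Request node can fetch this URL as binary data
// and a Telegram node can send the image back to the user.
return [{ json: { imageUrl } }];
```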