by Baptiste Fort
**Who is it for?**

This workflow is for marketers, sales teams, and local businesses who want to quickly collect leads (business name, phone, website, and email) from Google Maps and store them in Airtable. You can use it for real estate agents, restaurants, therapists, or any local niche.

**How it works**

1. Scrape Google Maps with the Apify Google Maps Extractor.
2. Clean and structure the data (name, address, phone, website).
3. Visit each website and retrieve the raw HTML.
4. Use GPT to extract the most relevant email from the site content.
5. Save everything to Airtable for easy filtering and future outreach.

It works for any location or keyword – just adapt the input in Apify.

**Requirements**

Before running this workflow, you’ll need:

- ✅ Apify account (to use the Google Maps Extractor)
- ✅ OpenAI API key (for GPT email extraction)
- ✅ Airtable account & base with the following fields: Business Name, Address, Website, Phone Number, Email, Google Maps URL

**Airtable Structure**

Your Airtable base should contain these columns:

| Title | Street | Website | Phone Number | Email | URL |
|-------|--------|---------|--------------|-------|-----|
| Paris Real Estate Agency | 10 Rue de Rivoli, Paris | https://agency.fr | +33 1 23 45 67 | contact@agency.fr | maps.google.com/... |
| Example Business 2 | 25 Avenue de l’Opéra | https://example.fr | +33 1 98 76 54 | info@example.fr | maps.google.com/... |
| Example Business 3 | 8 Boulevard Haussmann | https://demo.fr | +33 1 11 22 33 | contact@demo.fr | maps.google.com/... |

**Error Handling**

- **Missing websites:** If a business has no website, the flow skips the scraping step.
- **No email found:** GPT returns Null if no email is detected.
- **API rate limits:** Add a Wait node between requests to avoid Apify/OpenAI throttling.

Now let’s take a detailed look at how to set up this automation, using real estate agencies in Paris as an example.

**Step 1 – Launch the Google Maps Scraper**

Start with a When clicking 'Execute workflow' trigger to launch the flow manually. Then add an HTTP Request node with the method set to POST.

👉 Head over to Apify: Google Maps Extractor. On the page (https://apify.com/compass/google-maps-extractor):

- Enter your business keyword (e.g., real estate agency, hairdresser, restaurant)
- Set the location you want to target (e.g., Paris, France)
- Choose how many results to fetch (e.g., 50)
- Optionally, use filters (only places with a website, by category, etc.)

⚠️ No matter your industry, this works – just adapt the keyword and location.

Once everything is filled in:

1. Click Run to test.
2. Go to the top right → click on API.
3. Select the API endpoints tab.
4. Choose "Run Actor synchronously and get dataset items".
5. Copy the URL and paste it into your HTTP Request (in the URL field).

Then enable:

- ✅ Body Content Type → JSON
- ✅ Specify Body Using JSON

Go back to Apify, click on the JSON tab, copy the entire code, and paste it into the JSON body field of your HTTP Request. An illustrative example of what that body can look like is shown below.

At this point, if you run your workflow, you should see a structured output with fields such as title, subTitle, price, categoryName, address, neighborhood, street, city, postalCode, and more.
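For reference, the JSON body copied from Apify's JSON tab typically has a shape like the sketch below. The field names here (searchStringsArray, locationQuery, maxCrawledPlacesPerSearch, skipClosedPlaces) are assumptions based on the actor's usual input schema, so always copy the real body from your own Apify run rather than typing it by hand.

```javascript
// Illustrative input for the Apify Google Maps Extractor (assumed field names –
// copy the actual body from the actor's JSON tab, as the schema may differ).
const apifyInput = {
  searchStringsArray: ["real estate agency"], // business keyword(s)
  locationQuery: "Paris, France",             // target location
  maxCrawledPlacesPerSearch: 50,              // how many results to fetch
  skipClosedPlaces: true,                     // optional filter
};

// This object is what gets pasted as the JSON body of the HTTP Request node.
console.log(JSON.stringify(apifyInput, null, 2));
```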
**Step 2 – Clean and structure the data**

Once the raw data is fetched from Apify, we clean it up using the Edit Fields node. In this step, we manually select and rename the fields we want to keep:

- Title → {{ $json.title }}
- Address → {{ $json.address }}
- Website → {{ $json.website }}
- Phone → {{ $json.phone }}
- URL → {{ $json.url }}

This node lets us keep only the essentials in a clean format, ready for the next steps. On the right: a clear and usable table, easy to work with.

**Step 3 – Loop Over Items**

Now that our data is clean (see Step 2), we’ll go through it item by item to handle each contact individually. The Loop Over Items node does exactly that: it takes each row from the table (each contact pulled from Apify) and runs the next steps on them, one by one. 👉 Just set a Batch Size of 20 (or more, depending on your needs). Nothing tricky here, but this step is essential to keep the flow dynamic and scalable.

**Step 4 – Edit Fields (again)**

After looping through each contact one by one (thanks to Loop Over Items), we refine the data a bit more. This time, we only want to keep the website. We use the Edit Fields node again, in Manual Mapping mode, with just:

- Website → {{ $json.website }}

The result on the right? A clean list with only the URLs extracted from Google Maps. 🔧 This simple step helps isolate the websites so we can scrape them one by one in the next part of the flow.

**Step 5 – Scrape Each Website with an HTTP Request**

Let’s continue the flow: in the previous step, we isolated the websites into a clean list. Now we’re going to send a request to each URL to fetch the content of the site. ➡️ To do this, we add an HTTP Request node, using the GET method, and set the URL to {{ $json.website }} (this value comes from the previous Edit Fields node).

This node simply “visits” each website automatically and returns the raw HTML code (as shown on the right). 📄 That’s the material we’ll use in the next step to extract email addresses (and any other useful info). We’re not reading this code manually – the next step scans through it to detect the patterns that matter to us. This is a technical but crucial step: it’s how we turn a URL into real, usable data.

**Step 6 – Extract the Email with GPT**

Now that we’ve retrieved the raw HTML from each website using the HTTP Request node, it’s time to analyze it. 💡 Goal: detect the most relevant email address on each site (ideally the main contact or owner). 👉 To do that, we use an OpenAI node (Message a Model).

⚙️ Key parameters:

- Model: GPT-4.1-mini (or any GPT-4+ model available)
- Operation: Message a Model
- Resource: Text
- Simplify Output: ON
- Prompt (the message you provide):

> Look at this website content and extract only the email I can contact this business. In your output, provide only the email and nothing else. Ideally, this email should be of the business owner, so if you have 2 or more options, try for the most authoritative one. If you don't find any email, output 'Null'. Exemplary output of yours: name@examplewebsite.com {{ $json.data }}

**Step 7 – Save the Data in Airtable**

Once we’ve collected everything – the business name, address, phone number, website, and most importantly the email extracted via ChatGPT – we need to store it all somewhere clean and organized. 👉 The best place in this workflow is Airtable.

📦 Why Airtable? Because it allows you to:

- Easily view and sort the leads you’ve scraped
- Filter, tag, or enrich them later
- And most importantly… reuse them in future automations

⚙️ What we’re doing here: we add an Airtable → Create Record node to insert each lead into our database.
Inside this node, we manually map each field with the data collected in the previous steps:

| Airtable Field | Description | Value from n8n |
|----------------|-------------|----------------|
| Title | Business name | {{ $('Edit Fields').item.json.Title }} |
| Street | Full address | {{ $('Edit Fields').item.json.Address }} |
| Website | Website URL | {{ $('Edit Fields').item.json.Website }} |
| Phone Number | Business phone number | {{ $('Edit Fields').item.json.Phone }} |
| Email | Email found by ChatGPT | {{ $json.message.content }} |
| URL | Google Maps listing link | {{ $('Edit Fields').item.json.URL }} |

🧠 Reminder: we’re keeping only clean, usable data – ready to be exported, analyzed, or used in cold outreach campaigns (email, CRM, enrichment, etc.). ➡️ And the best part? You can rerun this workflow automatically every week or month to keep collecting fresh leads 🔁.
by Krishna Kumar Eswaran
**🧠 Problem This Solves**

Managing credit card expenses can be tricky, especially when you want to stay transparent and keep your spouse in the loop. Most banks don't offer real-time notification sharing with family members, and manually updating expenses takes time and effort. This n8n workflow automates the entire process: tracking your HDFC credit card usage, logging it in Google Sheets, and sending an instant Telegram notification to your spouse.

**👥 Who This Template Is For**

- Couples who want shared visibility of credit card spending
- Individuals looking for automated personal finance tracking
- Anyone using an HDFC credit card with email alerts enabled
- n8n users who want to integrate Gmail, Google Sheets, and Telegram

**⚙️ Workflow Breakdown**

Here’s how the automation works:

1. Gmail Trigger – monitors your Gmail inbox for credit card transaction alerts from HDFC Bank.
2. Email Parser – extracts transaction details like amount, merchant name, date, and card type.
3. Google Sheets node – logs the parsed transaction data into a structured Google Sheet for record-keeping.
4. Telegram node – sends a message to your spouse’s Telegram account with the transaction details for instant notification.

**Step-by-Step Setup Instructions**

Prerequisites:

- An HDFC credit card with email alerts enabled
- A Gmail account connected to n8n
- A Google Sheet created with columns like Date, Amount, Merchant, Card, etc.
- A Telegram bot and your spouse’s Telegram chat ID

1. Set up the Gmail Trigger: use the Gmail Trigger node to monitor incoming emails from alerts@hdfcbank.net or similar, and filter emails whose subject line contains keywords like "Credit Card Transaction Alert".
2. Extract the email content: use the HTML Extract or Regex node to parse out the transaction amount, merchant name, date, and card number from the email body (a parsing sketch is shown below).
3. Log to Google Sheets: connect your Google Sheets account in n8n and use the Append Row operation to add each transaction as a new row in your finance sheet.
4. Send the Telegram message: set up a Telegram bot, get your spouse’s chat ID, format a message like "💳 HDFC Transaction Alert: ₹5,000 at Amazon on 17 May via XXXX1234", and send it via the Telegram node.

**🛠️ Customization Tips**

- 💡 Add spending limits: add a condition node to alert only if the transaction exceeds a certain amount.
- 🧾 Category mapping: use additional logic to classify expenses (e.g., Shopping, Dining) based on keywords.
- 📊 Weekly summary: create another workflow that sends a weekly Telegram summary using data from Google Sheets.
- 🔐 Security tip: mask part of the card number before sending the Telegram message for added security.
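If you choose a Code node instead of the HTML Extract/Regex nodes, a minimal parsing sketch could look like the one below. The regex assumes a particular alert phrasing and the field names ($json.text, $json.snippet) are assumptions about the Gmail Trigger output; HDFC's actual wording varies, so adjust the pattern to your own alert emails.

```javascript
// Hypothetical parser for an n8n Code node ("Run Once for Each Item").
// Assumes wording like: "...Credit Card ending 1234 for Rs 5,000.00 at AMAZON on 17-05-2025..."
const body = $json.text || $json.snippet || '';

const match = body.match(
  /ending\s+(\d{4}).*?Rs\.?\s*([\d,]+(?:\.\d{2})?)\s+at\s+(.+?)\s+on\s+(\d{2}-\d{2}-\d{4})/is
);

return {
  json: {
    card: match ? `XXXX${match[1]}` : null,            // masked card number
    amount: match ? match[2].replace(/,/g, '') : null,  // numeric amount as string
    merchant: match ? match[3].trim() : null,
    date: match ? match[4] : null,
    parsed: Boolean(match),                              // flag rows that need manual review
  },
};
```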
by Halfbit 🚀
**Jura Coffee Counter: Webhook API & Google Sheets Logger ☕️**

Track how many coffees your Jura E8 espresso machine makes – fully automated via webhook and Google Sheets.

This workflow exposes a custom API endpoint that can be called by smart devices, such as an ESP8266 or ESP32 reading data from a Jura E8 coffee machine via Bluetooth Low Energy (BLE). The incoming data (including the total coffee count) is timestamped and appended to a Google Sheet, making it easy to visualize or analyze your machine usage. ☕ Originally built for a Jura E8, based on the AlexxIT/Jura reverse-engineering project.

> 📝 This workflow uses Google Sheets as a logging backend. You can easily switch it to Airtable, Notion, or a database of your choice.

Live example available at: https://halfbitstudio.com/o-nas/

> 🖥️ In our setup, this workflow is used to provide real-time coffee consumption stats displayed directly on our website.
> 🔌 Some Jura machines require an accessory Bluetooth transmitter to enable connectivity. Communication is based on the Bluetooth Low Energy (BLE) protocol.

**Use Case**

- Tracking usage of a Jura coffee machine
- Logging IoT sensor data into Google Sheets
- Creating dashboards for daily consumption
- Smart office setups with coffee stats!

**Features**

- ☁️ Two webhook endpoints:
  - POST /{{WEBHOOK_POST_PATH}} – receives JSON from the ESP (coffee machine reader)
  - GET /{{WEBHOOK_GET_PATH}} – returns the latest records as JSON
- 📅 Timestamping via the Date & Time node
- 🔹 Coffee counter extraction from the incoming JSON
- 🧾 Appends structured rows to Google Sheets
- 📤 Webhook response for external status or dashboards

**Setup Instructions**

Jura coffee machine integration (hardware):

- Use an ESP device (e.g. ESP8266 or ESP32) to connect to the Jura E8 via Bluetooth Low Energy (BLE).
- Send POST requests with a JSON payload: { "total_coffees": 123 }
- Reverse-engineered protocol reference: AlexxIT/Jura

Google Sheets configuration:

- Create a new Google Sheet with column headers like: date | time | coffee counter
- Connect your Google account in n8n and authorize access to this sheet.
- Replace the documentId and sheetName fields in the Google Sheets nodes: use the full URL of your spreadsheet and the actual sheet name (e.g. Sheet1).

**Environment Variables & Placeholders**

| Placeholder | Description |
|-------------|-------------|
| {{WEBHOOK_POST_PATH}} | Endpoint to receive coffee counter data |
| {{WEBHOOK_GET_PATH}} | Endpoint to return latest data (for dashboards) |
| {{SHEET_ID}} | Google Spreadsheet ID |
| {{GOOGLE_CREDENTIALS}} | OAuth2 credentials for Google Sheets |
| {{DATA_COLUMNS}} | Column names in the target sheet |

**Testing the Workflow**

1. Send a test request: use Postman or the ESP to send a POST request to /{{WEBHOOK_POST_PATH}} with a body that includes a total_coffees value (a sketch of the row-building step is shown below).
2. Check the Google Sheet: open your sheet and verify that a new row was appended.
3. Test the GET endpoint: access the second webhook URL (e.g. /{{WEBHOOK_GET_PATH}}) in a browser or fetch it via API.
4. Optional: use the Respond to Webhook output in a dashboard or frontend.

**Customization Tips**

- **Sheet format**: add more columns if you want to track additional data (e.g. machine temperature, errors).
- **Output format**: replace Google Sheets with any other storage (e.g. MySQL, Notion).
- **Auth layer**: add basic auth or token verification if needed for public exposure.
- **Notifications**: send alerts to Discord/Slack when reaching thresholds (e.g. 200 coffees brewed).

Tags: google-sheets, iot, webhook, jura, coffee, api, automation
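The template itself uses a Date & Time node plus a field-extraction step; as an equivalent illustration, a single Code node could turn the webhook payload into a sheet row like this. The field path $json.body.total_coffees assumes the default n8n Webhook node output, so verify it against your own execution data.

```javascript
// Sketch of the row-building step as one n8n Code node ("Run Once for Each Item").
const totalCoffees = $json.body?.total_coffees ?? $json.total_coffees;

const now = new Date();

return {
  json: {
    // Matches the suggested sheet headers: date | time | coffee counter
    date: now.toISOString().slice(0, 10),   // e.g. "2024-05-17"
    time: now.toTimeString().slice(0, 8),   // e.g. "14:03:21"
    'coffee counter': Number(totalCoffees) || 0,
  },
};
```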
by Emmanuel Bernard
**🎥 AI Video Generator with HeyGen**

🚀 Create AI-powered videos in n8n with HeyGen. This workflow enables you to generate realistic AI videos using HeyGen, an advanced AI platform for video automation. Simply input your text, choose an AI avatar and voice, and let HeyGen generate a high-quality video for you – all within n8n!

✅ Ideal for:

- Content creators & marketers 🏆
- Automating personalized video messages 📩
- AI-powered video tutorials & training materials 🎓

**🔧 How It Works**

1️⃣ Provide a text script – this will be spoken in the AI-generated video.
2️⃣ Select an avatar & voice – choose from a variety of AI-generated avatars and voices.
3️⃣ Run the workflow – HeyGen processes your request and generates a video.
4️⃣ Download your video – get the direct link to your AI-powered video!

**⚡ Setup Instructions**

1️⃣ Get your HeyGen API key: sign up for a HeyGen account, go to your account settings, and retrieve your API key.
2️⃣ Configure n8n credentials: in n8n, create new credentials and select "Custom Auth" as the authentication type. In the Name field provide X-Api-Key, and in the Value field paste your HeyGen API key. Update the two HTTP Request nodes with these credentials.
3️⃣ Select an AI avatar & voice: browse the available avatars and voices in your HeyGen account, then copy the Avatar ID and Voice ID for your video.
4️⃣ Run the workflow: enter your text, avatar ID, and voice ID, then execute the workflow – your video will be generated automatically! An illustrative request body is sketched below.

**🎯 Why Use This Workflow?**

✔️ Fully automated – no manual editing required!
✔️ Realistic AI avatars – choose from a variety of digital avatars.
✔️ Seamless integration – works directly within your n8n workflow.
✔️ Scalable & fast – generate multiple videos in minutes.

🔗 Start automating AI-powered video creation today with n8n & HeyGen!
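For orientation, the body the first HTTP Request node sends to HeyGen's video-generation endpoint looks roughly like the sketch below. The endpoint and field names reflect HeyGen's v2 API as commonly documented, not this template's exact configuration, so verify them against HeyGen's current API docs before relying on them.

```javascript
// Illustrative shape of a HeyGen video-generation request (assumed v2 schema).
const heygenBody = {
  video_inputs: [
    {
      character: { type: 'avatar', avatar_id: '<YOUR_AVATAR_ID>', avatar_style: 'normal' },
      voice: { type: 'text', voice_id: '<YOUR_VOICE_ID>', input_text: 'Hello from n8n!' },
    },
  ],
  dimension: { width: 1280, height: 720 },
};

// Sent as POST https://api.heygen.com/v2/video/generate with the X-Api-Key header;
// the second HTTP Request node then polls the video status until the download URL is ready.
console.log(JSON.stringify(heygenBody, null, 2));
```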
by OneClick IT Consultancy P Limited
**Automate Customer Feedback Analysis with Google Sheets, WhatsApp, and Email**

**Introduction: Drowning in Data, Starving for Insight?**

Imagine this: your team launches a new feature. Feedback starts pouring in – emails, support tickets, social media mentions, and survey responses. You know gold is buried in there, but manually reading, tagging, and summarising hundreds, maybe thousands, of comments? It takes days, maybe weeks. By the time you have a clear picture, the moment might have passed. Sounds exhausting, right?

What if you could have an AI assistant tirelessly working 24/7, instantly analysing every piece of feedback the moment it arrives? This isn't science fiction anymore. AI-powered automation can transform this slow, manual chore into a real-time insight engine, giving you the pulse of your customer base almost instantly. Let's explore how.

**What's the Goal? Understanding the Workflow Objective**

The core challenge is transforming raw, unstructured customer feedback into actionable intelligence quickly and efficiently.

The problem:

- Manual overload: sifting through vast amounts of feedback manually is incredibly time-consuming and prone to human error or bias.
- Delayed insights: the lag between receiving feedback and understanding it means missed opportunities and slow responses to critical issues.
- Inconsistent analysis: different team members might interpret or categorize feedback differently, leading to unreliable trend spotting.

The AI solution:

- Automated data collection: connects directly to feedback sources (surveys, social media, review sites, helpdesks).
- AI-powered analysis: uses Large Language Models (LLMs) like GPT-4 or Claude to analyze sentiment, extract key topics, and summarize comments.
- Intelligent categorization: automatically tags feedback based on predefined or dynamically identified themes (e.g., "bug report", "feature request", "pricing issue").
- Real-time reporting: pushes structured insights into dashboards or databases, or triggers notifications for immediate awareness.

Outcome: you move from reactive problem-solving based on stale data to proactive, strategic decisions driven by a near real-time understanding of customer sentiment and needs.

**Why Does It Matter? Achieving 100X Productivity and Efficiency**

Automating feedback isn't just about saving time; it's about scaling your ability to listen and respond smarter, not harder. When you leverage AI, the gains aren't incremental – they're exponential. Here's why this is a game changer:

- Blazing speed: analyse feedback 100x faster (or more!) than manual methods. Insights appear in minutes or hours, not days or weeks.
- Unhuman scalability: process virtually unlimited volumes of feedback without needing to scale your human team proportionally. AI doesn't get tired or bored.
- Consistent accuracy: AI applies analysis rules consistently, reducing human bias and ensuring reliable categorisation and sentiment scoring over time.
- Proactive trend spotting: identify emerging issues or popular requests much earlier by analysing aggregated data automatically. Spot patterns humans might miss.
- Free up your team: let your talented team focus on acting on insights – improving products, fixing issues, engaging customers – instead of drowning in data entry.

**How It Works: AI Automation Step by Step**

Getting this set up is more straightforward than you might think, especially with tools like n8n acting as the central hub.
**Automated Feedback Triggering**

- CRM/Website Event node – trigger feedback requests after:
  - Purchases (eCommerce)
  - Support ticket resolution
  - Feature usage (SaaS)
- Time-Based node – schedule recurring NPS surveys and customer health check-ups
- Chat App node (WhatsApp/Telegram/Messenger) – send conversational feedback prompts: "How was your recent experience with [specific interaction]?"

**Multi-Channel Feedback Collection**

- Email node (SendGrid/Mailchimp) – send personalized feedback requests and embed 1–5 rating widgets
- SMS node (Twilio) – short mobile surveys: "Reply 1-5: How satisfied with your purchase?"
- Webhook node – capture in-app feedback and process chatbot responses
- Social Media node – monitor Twitter/X and Instagram mentions, analyze comments for unsolicited feedback

**AI-Powered Real-Time Analysis**

- OpenAI/ChatGPT node (sentiment analysis)
  - Prompt: "Analyze sentiment (positive/neutral/negative) and key themes from: [customer feedback]"
  - Output fields: sentiment score (1–5), urgency flag (high/medium/low), key topics (billing, support, product, etc.) – see the sketch after this list
- Translation node (optional) – convert multilingual feedback into a consistent language

**Instant AI Response System**

- Conditional node (routing logic)
  - Positive feedback → send thank-you + referral ask
  - Neutral feedback → follow-up question for details
  - Negative feedback → escalate to the human team
- AI Response Generator node – prompt: "Create a personalized response to [feedback type] about [topic] with sentiment [score]"; adjust tone (professional/friendly/empathetic)
- Escalation node – route critical issues to the support team with full context

**Automated Insights & Alerts**

- Dashboard node – real-time sentiment tracking, emerging issue detection
- Alert node (Slack/Teams/Email) – notify teams of negative trends: "3+ complaints about checkout flow in the past hour!"
- Report node – auto-generate weekly/monthly summaries: "Top 5 customer pain points this week"
- Product board integration – auto-create feature requests, prioritize based on feedback volume
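To make the routing concrete, here is a minimal sketch of the structured result the sentiment-analysis prompt could be asked to produce, and how a downstream IF/Switch node might branch on it. The exact JSON keys are an assumption; align them with whatever output format your prompt specifies.

```javascript
// Assumed shape of the structured sentiment-analysis output.
const analyzed = {
  sentiment: 'negative',        // positive | neutral | negative
  score: 2,                     // 1–5
  urgency: 'high',              // high | medium | low
  topics: ['billing', 'support'],
  summary: 'Customer was double-charged and support has not replied.',
};

// A conditional node can then route on these fields.
const escalate = analyzed.sentiment === 'negative' && analyzed.urgency === 'high';
console.log(escalate ? 'Route to support team with full context' : 'Send automated follow-up');
```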
**Tools of the Trade: AI & Automation Tech Stack**

You don't need a massive, complex tech stack. Focus on a few core, powerful tools:

- n8n: the workflow automation platform. This is the 'glue' that connects everything and orchestrates the process without needing deep coding knowledge. Honestly, it's incredibly versatile.
- OpenAI (GPT-4/GPT-4o): state-of-the-art LLMs for high-quality text analysis, summarization, and classification. Great for complex understanding.
- Anthropic (Claude 3 Sonnet/Opus): another top-tier LLM family, known for strong performance in analysis and handling large contexts. Often a great alternative or complement to GPT models.
- Feedback source APIs: connectors for wherever your feedback lives (e.g., Typeform, SurveyMonkey, Twitter API, Zendesk API, Google Play/App Store review APIs).
- Data storage/destination: where the processed insights go (e.g., Google Sheets, Airtable, Notion, PostgreSQL database, BigQuery).
- (Optional) Visualization tool: tools like Metabase, Grafana, Looker Studio, or Power BI to create dashboards from your structured feedback data.

**What's the Cost? Estimated Budget**

Let's talk investment. You're mainly looking at:

- Setup costs: primarily your time (or a consultant's) to design and build the initial workflow in n8n. Depending on complexity, this could range from a few hours to a few days. No major software licenses are usually needed upfront if you use self-hosted n8n or start with free/low-tier cloud plans.
- AI API calls: you pay per usage to OpenAI/Anthropic. Costs depend heavily on volume but can start from $20–$50/month for moderate usage and scale up. Newer models are getting more cost-effective.
- n8n hosting: free if self-hosted (requires a server), or tiered cloud pricing starting around $20/month.
- Feedback source APIs: some platforms might have API access costs or rate limits on free tiers.

Total estimated monthly cost: for many businesses, ongoing costs range from $50 to $500+ per month, highly dependent on feedback volume and AI model choice. The return on investment (ROI) is typically rapid. Consider the hours saved from manual analysis, the value of faster issue resolution, preventing churn, and the benefits of making product decisions based on real-time data. It often pays for itself very quickly.

**Who Benefits? Target Users and Industries**

This automated feedback loop isn't niche; it's valuable across many sectors and roles.

Top industries:

- SaaS (Software as a Service): understanding user friction, feature requests, bug reports.
- E-commerce & retail: analyzing product reviews, post-purchase surveys, and support chats.
- Hospitality & travel: processing guest reviews and survey feedback.
- Mobile apps: monitoring app store reviews and in-app feedback.
- Financial services: gauging customer satisfaction with services, identifying pain points.

Key roles:

- Product managers: prioritizing features, understanding user needs, tracking launch reception.
- Customer Experience (CX) / Success managers: monitoring customer health, identifying churn risks, and improving support processes.
- Marketing teams: understanding brand perception, campaign feedback, and voice of the customer.
- Support leads: identifying recurring issues, measuring support quality, spotting training needs.

This approach works for businesses of all sizes, from startups wanting to stay lean and agile to large enterprises needing to manage massive feedback volumes.

**How to Use the Workflow**

Importing a workflow in n8n is a straightforward process that lets you use pre-built or shared workflows to save time. Below is a step-by-step guide to importing a workflow in n8n, based on the official documentation and community resources.

1. Obtain the workflow JSON
   - Source the workflow: workflows are typically shared as JSON files or code snippets. You might receive them from the n8n community (e.g., the n8n.io workflows page), a colleague or tutorial (e.g., a .json file or copied JSON code), or an export from another n8n instance.
   - Format: make sure you have the workflow in JSON format, either as a file (e.g., workflow.json) or as text copied to your clipboard.
2. Access the n8n workflow editor
   - Log in to n8n: open your n8n instance (n8n Cloud or self-hosted) and navigate to the Workflows tab in the dashboard.
   - Open a new workflow: click Add Workflow to create a blank workflow, or open an existing workflow if you want to merge the imported one.
3. Import the workflow
   - Option 1 – Import via JSON code (clipboard): in the n8n editor, click the three dots (⋯) in the top-right corner, select Import from Clipboard, paste the JSON code into the text box, and click Import.
   - Option 2 – Import via JSON file: click the three dots (⋯) in the top-right corner, select Import from File, choose the .json file from your computer, and click Open.
Note: If the workflow includes nodes for apps requiring credentials (e.g., Google Sheets), you’ll need to configure those credentials separately after importing.
by Abhishek Patoliya
This powerful n8n automation sends you daily weather updates directly to your Telegram chat using live data from OpenWeatherMap. It supports automatic daily updates and manual lookups via form input.

**✅ Prerequisites**

Before you begin, make sure you have:

- A working n8n instance (v1.0 or later recommended).
- An account with OpenWeatherMap (the free plan is sufficient).
- A Telegram bot created via @BotFather.
- Your Telegram user ID or chat ID.

**🔐 API & Bot Setup**

🧩 OpenWeatherMap API:

1. Go to https://openweathermap.org/api
2. Sign up and verify your account.
3. Navigate to API Keys in your account dashboard.
4. Copy your API key (used later in the HTTP Request node).

🤖 Telegram bot:

1. Open @BotFather in Telegram.
2. Run /newbot and follow the prompts: choose a name and username for your bot. You'll get a bot token (copy this).
3. Start a chat with your new bot to activate it.
4. To get your Telegram user ID, use @userinfobot or an n8n Telegram Trigger node.

**🔄 Trigger Options**

- ⏰ Schedule Trigger (automatic): runs daily at 8:00 AM IST. Ideal for consistent, passive updates.
- 📝 Form Trigger (manual): input 🌆 City and 🌍 Country manually and instantly receive weather info in Telegram.

**🧠 How the Flow Works**

1. Trigger activated (scheduled or form).
2. City & country fetched (default or from the form).
3. HTTP Request sent to OpenWeatherMap with your API key.
4. Weather data parsed & formatted: 📅 current date, 📍 city & country, 🌤️ weather description, 🌡️ temperature (°C), 💧 humidity (%), 🌬️ wind speed (m/s), 🔼 atmospheric pressure, 🌅 sunrise time (IST), 🌇 sunset time (IST).
5. Message sent to Telegram.

**🧰 Nodes Used**

- **Schedule Trigger** – runs every day at 8:00 AM IST
- **Form Trigger** – accepts user input
- **Set node** – default city/country values and date formatting
- **HTTP Request** – calls the OpenWeatherMap API
- **Function node** – converts timestamps to IST (a sketch is shown below)
- **Telegram node** – sends the formatted weather message

**📦 Example Telegram Output**

📅 Wednesday, 10 July 2025
🌤 Weather in Mumbai, IN:
Condition: Clear sky
Temperature: 30°C
💧 Humidity: 70%
🌬 Wind Speed: 3 m/s
🔼 Pressure: 1013 hPa
🌅 Sunrise: 5:57:12 AM
🌇 Sunset: 6:53:45 PM

**🛠️ Customization Tips**

- 🏙️ Change the default city/country: locate the Set node (used before the API call) and replace "Mumbai" and "IN" with your preferred location, or connect the Form Trigger input to allow dynamic values.
- 🕗 Change the schedule time: open the Schedule Trigger node and adjust it to your preferred time zone and daily timing (e.g., 7 AM IST).
- 🧪 Add extra data: OpenWeatherMap returns more fields, such as visibility and UV index. You can include these in your Telegram message via the Function node and Set node.
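As a rough illustration of the Function/Code node step, the sketch below converts OpenWeatherMap's Unix sunrise/sunset timestamps to IST and shapes the fields used in the Telegram message. The field paths ($json.sys.sunrise, $json.main.temp, etc.) follow OpenWeatherMap's current-weather response; the template's own node may be organized differently.

```javascript
// Sketch of the timestamp conversion in an n8n Code node ("Run Once for Each Item").
// OpenWeatherMap returns sunrise/sunset as Unix seconds (UTC); IST is UTC+5:30.
function toISTTime(unixSeconds) {
  return new Date(unixSeconds * 1000).toLocaleTimeString('en-IN', {
    timeZone: 'Asia/Kolkata',
    hour12: true,
  });
}

return {
  json: {
    city: $json.name,
    country: $json.sys.country,
    description: $json.weather[0].description,
    temperature: Math.round($json.main.temp),  // assumes units=metric in the API call
    humidity: $json.main.humidity,
    windSpeed: $json.wind.speed,
    pressure: $json.main.pressure,
    sunrise: toISTTime($json.sys.sunrise),
    sunset: toISTTime($json.sys.sunset),
  },
};
```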
by Airtop
**Extracting LinkedIn Profile Information**

**Use Case**

Manually copying data from LinkedIn profiles is time-consuming and error-prone. This automation helps you extract structured, detailed information from any public LinkedIn profile, enabling fast enrichment, hiring research, or lead scoring.

**What This Automation Does**

This automation extracts profile details from a LinkedIn URL using the following input parameters:

- **airtop_profile**: the name of your Airtop Profile connected to LinkedIn.
- **linkedin_url**: the URL of the LinkedIn profile you want to extract data from.

**How It Works**

1. Starts with a form trigger or via another workflow.
2. Assigns the LinkedIn URL and Airtop profile variables.
3. Opens the LinkedIn profile in a real browser session using Airtop.
4. Uses an AI prompt to extract structured information, including: name, headline, location; current company and position; about section, experience, and education history; skills, certifications, languages, connections, and recommendations.
5. Returns structured JSON ready for further use or storage (an illustrative shape is shown below).

**Setup Requirements**

- Airtop API key – free to generate.
- An Airtop Profile connected to LinkedIn (requires a one-time login).

**Next Steps**

- **Sync with CRM**: push extracted data into HubSpot, Salesforce, or Airtable for lead enrichment.
- **Combine with search automation**: use with a LinkedIn search scraper to process profiles in bulk.
- **Adapt to other platforms**: customize the prompt to extract structured data from GitHub, Twitter, or company sites.

Read more about the Extract LinkedIn Profile Information automation.
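The structured JSON the prompt returns could look something like the sketch below. The key names and placeholder values are illustrative assumptions, not Airtop's fixed schema; your prompt defines the actual fields.

```javascript
// Illustrative shape of the extracted profile (hypothetical keys and sample values).
const profile = {
  name: 'Jane Doe',
  headline: 'Head of Growth at ExampleCo',
  location: 'Paris, France',
  current_company: 'ExampleCo',
  current_position: 'Head of Growth',
  about: 'Marketing leader focused on B2B SaaS.',
  experience: [{ company: 'ExampleCo', title: 'Head of Growth', years: '2021–present' }],
  education: [{ school: 'Example University', degree: 'MSc Marketing' }],
  skills: ['Demand generation', 'SEO'],
  certifications: [],
  languages: ['English', 'French'],
  connections: '500+',
  recommendations: 3,
};
console.log(JSON.stringify(profile, null, 2));
```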
by Hubschrauber
A single workflow with two flows/paths that combine to handle the backup sequence for Zigbee device configuration from Home Assistant / zigbee2mqtt. This provides a way to automate a periodic capture of the Zigbee coordinator and device pairings, to speed up the recovery process when/if the Home Assistant instance needs to be rebuilt. Setting up similar automation without n8n (e.g. shell scripts and system timers) is considerably more challenging. n8n makes it easy, and this template should remove any other excuse not to do it.

**Flow 1**

- Triggered by Cron/Timer – set whatever interval you want for backups.
- Sends an MQTT message to request a zigbee2mqtt backup (delivered via a separate message).

**Flow 2**

- Triggered by the zigbee2mqtt backup message.
- Extracts the zip file from the message and stores it somewhere, with a date-stamp in the filename, via SFTP (a sketch of this step is shown below).

**Setup**

- Create an MQTT connection named "MQTT Account" with the appropriate protocol (mqtt), host, port (1883), username, and password.
- Create an SFTP connection named "SFTP Zigbee Backups" with the appropriate host, port (22), username, and password or key.

**Reference**

This article describes the MQTT parts.
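For orientation, zigbee2mqtt normally publishes the backup as a base64-encoded zip on its bridge response topic. The sketch below shows one way Flow 2's extraction step could look in an n8n Code node; the payload shape, the message field name, and the use of prepareBinaryData are assumptions to verify against your broker and n8n version.

```javascript
// Sketch of the zip-extraction step ("Run Once for All Items").
// Assumes a response on "zigbee2mqtt/bridge/response/backup" shaped like
// { "data": { "zip": "<base64>" }, "status": "ok" }.
const incoming = $input.first().json;
const payload = typeof incoming.message === 'string' ? JSON.parse(incoming.message) : incoming;

const stamp = new Date().toISOString().slice(0, 10); // e.g. "2024-05-17"
const fileName = `zigbee2mqtt-backup-${stamp}.zip`;

return [
  {
    json: { fileName, status: payload.status },
    binary: {
      // The SFTP node can upload this binary property as the date-stamped file.
      data: await this.helpers.prepareBinaryData(Buffer.from(payload.data.zip, 'base64'), fileName),
    },
  },
];
```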
by Yaron Been
Automated pipeline that extracts job listings from Upwork and exports them to Google Sheets for better organization, analysis, and team collaboration.

**🚀 What It Does**

- Fetches job postings based on saved searches
- Extracts key job details (title, budget, description)
- Organizes data in Google Sheets
- Updates in real time
- Supports multiple search criteria

**🎯 Perfect For**

- Freelancers tracking opportunities
- Teams managing multiple projects
- Agencies monitoring client needs
- Market researchers
- Business analysts

**⚙️ Key Benefits**

- ✅ Centralized job board
- ✅ Easy sharing with team members
- ✅ Advanced filtering and sorting
- ✅ Historical data tracking
- ✅ Customizable data points

**🔧 What You Need**

- Upwork account
- Google account
- n8n instance
- Google Sheets setup

**📊 Data Exported**

- Job title and description
- Budget and hourly rate
- Client information
- Posted date
- Required skills
- Job URL

**🛠️ Setup & Support**

Quick setup: get started in 15 minutes with our step-by-step guide. 📺 Watch Tutorial · 💼 Get Expert Support · 📧 Direct Help

Streamline your job search and opportunity tracking with automated data collection and organization.
by Roshan Ramani
**Overview**

An intelligent automation workflow that monitors your Gmail inbox and sends AI-powered summaries of important emails directly to your Telegram chat. Perfect for staying updated on critical communications without constantly checking your email.

**🌟 Key Features**

- **Real-time email monitoring**: checks Gmail every minute for new emails
- **Smart content filtering**: only processes emails containing important keywords
- **AI-powered summarization**: uses GPT-4o-mini to create concise, human-readable summaries
- **Instant Telegram notifications**: delivers summaries directly to your preferred Telegram chat
- **Customizable keywords**: easily modify the filters to match your specific needs

**🔧 How It Works**

1. Email Trigger: continuously monitors your Gmail inbox for new messages.
2. Smart Filter: analyzes the email subject and body for important keywords (sales, jobs, etc.) – a filter sketch is shown below.
3. AI Processing: sends relevant emails to OpenAI for intelligent summarization.
4. Telegram Delivery: sends the formatted summary to your Telegram chat.

Sample output:

📦 Your Flipkart order "Bluetooth Speaker" was delivered today. Enjoy!
💰 Invoice from AWS for $23.50 is due by July 20. Check billing portal.
✅ HR shared your July payslip. No action needed unless there's an error.

**🛠 Setup Requirements**

- Gmail account with OAuth2 credentials
- OpenAI API key
- Telegram bot token and chat ID
- n8n instance (cloud or self-hosted)

**📋 Use Cases**

- **Business alerts**: payment due notices, invoice reminders
- **E-commerce**: order confirmations, delivery updates
- **HR communications**: payslips, policy updates, announcements
- **Security**: login alerts, security notifications
- **Job hunting**: application responses, interview invitations

**⚙️ Customization Options**

- **Keyword filters**: add/remove keywords in the filter node (invoice, payment, security, delivery, etc.)
- **AI prompt**: modify the summarization style and format
- **Polling frequency**: adjust the email-checking interval
- **Multiple chats**: send to different Telegram chats based on email type

**🔒 Privacy & Security**

- Processes emails locally through n8n
- No email content stored permanently
- Uses secure OAuth2 authentication
- Respects Gmail API rate limits

**📊 Performance**

- Lightweight and efficient
- Minimal resource usage
- Fast AI processing with GPT-4o-mini
- Reliable Telegram delivery

**💡 Pro Tips**

- Start with broad keywords and refine based on results
- Use multiple condition branches for different email types
- Set up different Telegram chats for work vs. personal emails
- Monitor your OpenAI usage to avoid unexpected costs
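A minimal keyword-filter sketch, written as an n8n Code node, is shown below. The keyword list and the Gmail field names ($json.subject, $json.text) are assumptions; adjust them to the fields your Gmail Trigger actually returns, or express the same check in an IF node.

```javascript
// Sketch of the "smart filter" step ("Run Once for Each Item").
const keywords = ['invoice', 'payment', 'delivery', 'security', 'interview', 'payslip'];

const haystack = `${$json.subject ?? ''} ${$json.text ?? ''}`.toLowerCase();
const isImportant = keywords.some((kw) => haystack.includes(kw));

// Only items flagged as important continue to the OpenAI summarization step.
return { json: { ...$json, isImportant } };
```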
by Airtop
**Automating Company ICP Scoring via LinkedIn**

**Use Case**

This automation scores companies based on their LinkedIn profile using custom Ideal Customer Profile (ICP) criteria. It’s ideal for qualifying B2B leads and prioritizing outreach based on fit.

**What This Automation Does**

Inputs required:

- **Company LinkedIn URL**: public LinkedIn profile of the company.
- **Airtop Profile (connected to LinkedIn)**: Airtop Profile authenticated to access and extract profile data.

The automation analyzes the LinkedIn page and calculates a score based on the following criteria:

| Category | Classification | Points |
|----------|----------------|--------|
| AI Focus | Low | 5 |
| | Medium | 10 |
| | High | 25 |
| Technical Level | Basic | 5 |
| | Intermediate | 15 |
| | Advanced | 25 |
| | Expert | 35 |
| Employee Count | 0–9 | 5 |
| | 10–150 | 25 |
| | 150+ | 30 |
| Agency Status | Not Automation Agency | 0 |
| | Automation Agency | 20 |
| Geography | Outside US/Europe | 0 |
| | US/Europe Based | 10 |

The result includes:

- Total ICP score
- Detailed justifications for each score component

**How It Works**

1. Opens the company’s LinkedIn page using Airtop.
2. Analyzes metadata including employee count, headquarters, services, and keywords.
3. Applies the scoring rubric and returns structured JSON with scores and reasons (a worked example follows).
4. Optionally flattens the result for storage or CRM integration.

**Setup Requirements**

- Airtop API key
- LinkedIn-authenticated Airtop Profile

**Next Steps**

- **Combine with lead lists**: score companies from outreach lists.
- **Push to CRM**: add scores to HubSpot or Salesforce records.
- **Adjust scoring weights**: modify the rubric to reflect your ICP strategy.

Read more about company ICP scoring automation with Airtop and n8n.
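To make the rubric concrete, here is a small worked example that applies the point values from the table above to a hypothetical company. The classifications and company data are placeholders; in the actual workflow the AI prompt produces these values from the LinkedIn page.

```javascript
// The scoring rubric from the table, applied to a hypothetical company.
const rubric = {
  aiFocus: { Low: 5, Medium: 10, High: 25 },
  technicalLevel: { Basic: 5, Intermediate: 15, Advanced: 25, Expert: 35 },
  employeeCount: (n) => (n >= 150 ? 30 : n >= 10 ? 25 : 5),
  agencyStatus: { 'Not Automation Agency': 0, 'Automation Agency': 20 },
  geography: { 'Outside US/Europe': 0, 'US/Europe Based': 10 },
};

const company = {
  aiFocus: 'High',
  technicalLevel: 'Advanced',
  employees: 220,
  agencyStatus: 'Automation Agency',
  geography: 'US/Europe Based',
};

const total =
  rubric.aiFocus[company.aiFocus] +
  rubric.technicalLevel[company.technicalLevel] +
  rubric.employeeCount(company.employees) +
  rubric.agencyStatus[company.agencyStatus] +
  rubric.geography[company.geography];

console.log(total); // 25 + 25 + 30 + 20 + 10 = 110
```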
by bangank36
This workflow restores all n8n instance credentials from GitHub backups using the n8n API node. It complements the Backup Your Credentials to GitHub template by allowing users to seamlessly restore previously saved credentials.

**How It Works**

The workflow fetches credentials stored in a GitHub repository and imports them into your n8n instance.

**Setup Instructions**

To configure the workflow, update the Globals node with the following values:

- **repo.owner** – your GitHub username
- **repo.name** – the name of the GitHub repository storing the credentials
- **repo.path** – the folder path within the repository where credentials are stored

For example, if your GitHub username is john-doe, your repository is named n8n-backups, and credentials are stored in a credentials/ folder, you would set:

- repo.owner → john-doe
- repo.name → n8n-backups
- repo.path → credentials/

**Required Credentials**

- **GitHub API** – access to your repository
- **n8n API** – to import credentials into your n8n instance

**Who Is This For?**

This template is ideal for users who want to restore their credentials from GitHub backups, ensuring easy migration and recovery in case of data loss.

Check out my other templates: 👉 My n8n Templates