by Halfbit 🚀
Jura Coffee Counter: Webhook API & Google Sheets Logger ☕️

Track how many coffees your Jura E8 espresso machine makes — fully automated via webhook and Google Sheets.

This workflow exposes a custom API endpoint that can be called by smart devices, such as an ESP8266 or ESP32 reading data from a Jura E8 coffee machine via Bluetooth Low Energy (BLE). The incoming data (including the total coffee count) is timestamped and appended to a Google Sheet, making it easy to visualize or analyze your machine usage.

☕ Originally built for a Jura E8, based on the AlexxIT/Jura reverse-engineering project.

> 📝 This workflow uses Google Sheets as a logging backend. You can easily switch it to Airtable, Notion, or a database of your choice. Live example available at: https://halfbitstudio.com/o-nas/
> 🖥️ In our setup, this workflow is used to provide real-time coffee consumption stats displayed directly on our website.
> 🔌 Some Jura machines require an accessory Bluetooth transmitter to enable connectivity. Communication is based on the Bluetooth Low Energy (BLE) protocol.

Use Case
- Tracking usage of a Jura coffee machine
- Logging IoT sensor data into Google Sheets
- Creating dashboards for daily consumption
- Smart office setups with coffee stats!

Features
- ☁️ Two webhook endpoints:
  - POST /{{WEBHOOK_POST_PATH}} — receives JSON from the ESP (coffee machine reader)
  - GET /{{WEBHOOK_GET_PATH}} — returns the latest records as JSON
- 📅 Timestamping via the Date & Time node
- 🔹 Coffee counter extraction from the incoming JSON
- 🧾 Appends structured rows to Google Sheets
- 📤 Webhook response for external status or dashboards

Setup Instructions

Jura Coffee Machine Integration (Hardware)
- Use an ESP device (e.g. ESP8266 or ESP32) to connect to the Jura E8 via Bluetooth Low Energy (BLE).
- Send POST requests with a JSON payload: { "total_coffees": 123 }
- Reverse-engineered protocol reference: AlexxIT/Jura

Google Sheets Configuration
- Create a new Google Sheet with column headers like: date | time | coffee counter
- Connect your Google account in n8n and authorize access to this sheet.
- Replace the documentId and sheetName fields in the Google Sheets nodes:
  - Use the full URL to your spreadsheet
  - Use the actual sheet name (e.g. Sheet1)

Environment Variables & Placeholders

| Placeholder | Description |
| ------------------------ | ----------------------------------------------- |
| {{WEBHOOK_POST_PATH}} | Endpoint that receives coffee counter data |
| {{WEBHOOK_GET_PATH}} | Endpoint that returns the latest data (for dashboards) |
| {{SHEET_ID}} | Google Spreadsheet ID |
| {{GOOGLE_CREDENTIALS}} | OAuth2 credentials for Google Sheets |
| {{DATA_COLUMNS}} | Column names in the target sheet |

Testing the Workflow
- Send a test request: use Postman or the ESP to send a POST request to /{{WEBHOOK_POST_PATH}}; the body should include a total_coffees value (see the sketch at the end of this section).
- Check the Google Sheet: open your sheet and verify that a new row was appended.
- Test the GET endpoint: access the second webhook URL (e.g. /{{WEBHOOK_GET_PATH}}) in a browser or fetch it via API.
- Optional: use the Respond to Webhook output in a dashboard or frontend.

Customization Tips
- **Sheet format**: Add more columns if you want to track additional data (e.g. machine temperature, errors)
- **Output format**: Replace Google Sheets with any other storage (e.g. MySQL, Notion)
- **Auth layer**: Add basic auth or token verification if needed for public exposure
- **Notifications**: Send alerts to Discord/Slack when reaching thresholds (e.g. 200 coffees brewed)

Tags: google-sheets, iot, webhook, jura, coffee, api, automation
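If you want to test the POST endpoint before the ESP hardware is wired up, a small script can send the same payload. This is a minimal sketch using Node.js 18+ built-in fetch; the host and webhook path are placeholders you must replace with your own values.

```javascript
// Minimal test client for the coffee counter webhook (Node.js 18+, built-in fetch).
// N8N_HOST and WEBHOOK_POST_PATH are placeholders — use your own instance and path.
const N8N_HOST = "https://your-n8n-instance.example.com";
const WEBHOOK_POST_PATH = "jura-coffee-counter"; // hypothetical path

async function reportCoffeeCount(totalCoffees) {
  const res = await fetch(`${N8N_HOST}/webhook/${WEBHOOK_POST_PATH}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ total_coffees: totalCoffees }),
  });
  console.log("Webhook responded with status", res.status);
}

reportCoffeeCount(123);
```

If a new row with the timestamp and counter value shows up in the sheet, the workflow is wired correctly and the ESP only needs to replicate this request.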
by Kunsh
A streamlined AI-powered tool that extracts actionable technical insights from HackerOne security reports for advanced bug bounty hunters.

How It Works
Send any HackerOne report URL (e.g., https://hackerone.com/reports/123456) to the chat interface. The AI agent will:
- Fetch the report JSON automatically
- Analyze it for unique techniques, payloads, and root causes
- Extract reusable insights in a structured format
- Summarize the report with practical pentesting value

Setup Requirements
- Google Gemini API credentials configured
- Chat interface deployed and accessible
- HackerOne report URLs

Output Format
- Summary: One-liner impact statement
- Techniques: Payloads, code snippets, exploitation steps
- Pro Tips: Reusable insights for future hunts

Perfect for rapid triage and building your personal exploit knowledge base.
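For reference, the "fetch the report JSON" step can be sketched as a plain function. This assumes the report is publicly disclosed and exposes a `.json` representation at the same URL — verify that against the reports you target before relying on it.

```javascript
// Sketch of the report-fetch step, assuming a public HackerOne report URL
// can be fetched as JSON by appending ".json" — confirm this for your reports.
async function fetchReportJson(reportUrl) {
  const res = await fetch(`${reportUrl}.json`, {
    headers: { Accept: "application/json" },
  });
  if (!res.ok) throw new Error(`Failed to fetch report: ${res.status}`);
  return res.json(); // raw report object handed to the AI agent for analysis
}

fetchReportJson("https://hackerone.com/reports/123456").then(console.log);
```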
by Abhishek Patoliya
This powerful n8n automation sends you daily weather updates directly to your Telegram chat using live data from OpenWeatherMap. It supports automatic daily updates and manual lookups via form input.

✅ Prerequisites
Before you begin, make sure you have:
- A working n8n instance (v1.0 or later recommended).
- An account with OpenWeatherMap (the free plan is sufficient).
- A Telegram bot created via @BotFather.
- Your Telegram user ID or chat ID.

🔐 API & Bot Setup

🧩 OpenWeatherMap API
1. Go to https://openweathermap.org/api
2. Sign up and verify your account.
3. Navigate to API Keys in your account dashboard.
4. Copy your API key (used later in the HTTP Request node).

🤖 Telegram Bot
1. Open @BotFather in Telegram.
2. Run /newbot and follow the prompts: choose a name and username for your bot.
3. You'll get a bot token (copy this).
4. Start a chat with your new bot to activate it.
5. To get your Telegram user ID, use @userinfobot or an n8n Telegram Trigger node.

🔄 Trigger Options

⏰ Schedule Trigger (Automatic)
- Runs daily at 8:00 AM IST.
- Ideal for consistent, passive updates.

📝 Form Trigger (Manual)
- Input 🌆 City and 🌍 Country manually.
- Instantly receive weather info in Telegram.

🧠 How the Flow Works
1. Trigger activated (scheduled or form)
2. City & country fetched (default or from form)
3. HTTP Request sent to OpenWeatherMap with the API key
4. Weather data parsed & formatted:
   - 📅 Current Date
   - 📍 City & Country
   - 🌤️ Weather Description
   - 🌡️ Temperature (°C)
   - 💧 Humidity (%)
   - 🌬️ Wind Speed (m/s)
   - 🔼 Atmospheric Pressure
   - 🌅 Sunrise Time (IST)
   - 🌇 Sunset Time (IST)
5. Message sent to Telegram

🧰 Nodes Used
- **Schedule Trigger** – Runs every day at 8:00 AM IST
- **Form Trigger** – Accepts user input
- **Set Node** – Default city/country values and date formatting
- **HTTP Request** – Calls the OpenWeatherMap API
- **Function Node** – Converts timestamps to IST (see the sketch at the end of this section)
- **Telegram Node** – Sends the formatted weather message

📦 Example Telegram Output
📅 Wednesday, 10 July 2025
🌤 Weather in Mumbai, IN:
Condition: Clear sky
Temperature: 30°C
💧 Humidity: 70%
🌬 Wind Speed: 3 m/s
🔼 Pressure: 1013 hPa
🌅 Sunrise: 5:57:12 AM
🌇 Sunset: 6:53:45 PM

🛠️ Customization Tips

🏙️ Change Default City/Country
- Locate the Set Node (used before the API call).
- Replace "Mumbai" and "IN" with your preferred location.
- Or connect the Form Trigger input to allow dynamic values.

🕗 Change Schedule Time
- Open the Schedule Trigger node.
- Adjust it to your preferred time zone and daily timing (e.g. 7 AM IST).

🧪 Add Extra Data
- OpenWeatherMap returns more fields like visibility, UV index, etc.
- You can include these in your Telegram message via the Function Node and Set Node.
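The timestamp conversion mentioned above can look roughly like the sketch below. It is written in the style of n8n's newer Code node (adapt the item access if you use the legacy Function node) and assumes the HTTP Request node returns OpenWeatherMap's standard current-weather payload, where sys.sunrise and sys.sunset are Unix timestamps in seconds.

```javascript
// Sketch: format OpenWeatherMap data and convert sunrise/sunset to IST.
const data = $input.first().json;

const toIST = (unixSeconds) =>
  new Date(unixSeconds * 1000).toLocaleTimeString("en-IN", { timeZone: "Asia/Kolkata" });

return [{
  json: {
    city: data.name,
    condition: data.weather[0].description,
    temperature: `${data.main.temp}°C`,
    humidity: `${data.main.humidity}%`,
    windSpeed: `${data.wind.speed} m/s`,
    pressure: `${data.main.pressure} hPa`,
    sunrise: toIST(data.sys.sunrise),
    sunset: toIST(data.sys.sunset),
  },
}];
```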
by Dariusz Koryto
FTP to Google Drive Transfer Template

What This Template Does
This workflow automatically transfers files from an FTP server to Google Drive. It's perfect for:
- Backing up files from remote servers
- Migrating data from FTP to cloud storage
- Automating file synchronization tasks
- Creating scheduled backups of server content

How It Works
The workflow follows these steps:
1. Manual Trigger – You start the process by clicking "Execute"
2. Lists FTP Directory – Scans the specified FTP folder for all items
3. Filters Files Only – Separates actual files from directories (folders)
4. Downloads Files – Retrieves each file as binary data from the FTP server
5. Uploads to Google Drive – Stores all downloaded files in your specified Google Drive folder

Requirements
Before using this template, you'll need:
- **FTP Server Access**: Server address, username, and password
- **Google Drive Account**: With OAuth2 authentication set up in n8n
- **n8n Instance**: Self-hosted or cloud version

Setup Instructions

Step 1: Configure FTP Credentials
1. In n8n, go to Settings → Credentials
2. Create a new FTP credential
3. Enter your FTP server details:
   - Host: Your FTP server address
   - Port: Usually 21 for FTP
   - Username: Your FTP username
   - Password: Your FTP password
4. Test the connection and save

Step 2: Set Up Google Drive Authentication
1. Create a new Google Drive OAuth2 credential
2. Follow n8n's Google Drive setup guide:
   - Create a Google Cloud project
   - Enable the Google Drive API
   - Create OAuth2 credentials
   - Add your n8n callback URL
3. Authorize the connection in n8n

Step 3: Configure the Workflow
1. Update FTP Path:
   - Open the "List FTP Directory" node
   - Change the path parameter from /_instalki to your desired FTP folder
2. Set Google Drive Folder:
   - Open the "Upload to Google Drive" node
   - Replace the folderId with your target Google Drive folder ID
   - To find the folder ID: open the folder in Google Drive and copy the ID from the URL
3. Assign Credentials:
   - Ensure both FTP nodes use your FTP credential
   - Assign your Google Drive credential to the upload node

How to Use
1. Test First: Run the workflow manually with a few test files
2. Monitor Execution: Check the execution log for any errors
3. Verify Upload: Confirm files appear in your Google Drive folder
4. Schedule (Optional): Add a schedule trigger if you want automatic runs

Customization Options

Filter Specific File Types
Add a condition after "Filter Files Only" to process only certain file extensions:
{{ $json.name.endsWith('.pdf') || $json.name.endsWith('.jpg') }}

Add Error Handling
Insert error-handling nodes to manage failed downloads or uploads gracefully.

Organize by Date
Modify the Google Drive upload to create date-based folders automatically.

File Size Limits
Add checks for file size before attempting upload (Google Drive has limits) — see the sketch below.
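A minimal version of that file-size check could be a Code node placed before the upload step. This sketch assumes the FTP directory listing exposes a numeric `size` field in bytes per item; the 750 MB threshold is purely illustrative — set it to whatever limit makes sense for your account.

```javascript
// Sketch: skip oversized files before uploading to Google Drive.
const MAX_BYTES = 750 * 1024 * 1024; // illustrative threshold

return $input.all().filter((item) => {
  const size = Number(item.json.size ?? 0);
  if (size > MAX_BYTES) {
    console.log(`Skipping ${item.json.name}: ${size} bytes exceeds limit`);
    return false;
  }
  return true;
});
```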
Troubleshooting

Common Issues:
- **FTP Connection Failed**: Check server address, port, and credentials
- **Google Drive Upload Error**: Verify OAuth2 setup and folder permissions
- **Files Not Found**: Ensure the FTP path exists and contains files
- **Large Files**: Consider Google Drive's storage limits (15 GB on free accounts)

Tips:
- Test with small files first
- Check n8n execution logs for detailed error messages
- Ensure your Google Drive has sufficient storage space
- Verify the FTP server allows multiple concurrent connections

Security Notes
- Never hardcode credentials in the workflow
- Use n8n's credential system for all authentication
- Consider using SFTP instead of FTP for better security
- Regularly rotate your FTP passwords
- Review Google Drive sharing permissions

Next Steps
Once you have this basic transfer working, you might want to:
- Add email notifications for successful/failed transfers
- Implement file deduplication checks
- Create logs of transferred files
- Set up automatic cleanup of old files
- Add file compression before upload
by Roshan Ramani
Overview
An intelligent automation workflow that monitors your Gmail inbox and sends AI-powered summaries of important emails directly to your Telegram chat. Perfect for staying updated on critical communications without constantly checking your email.

🌟 Key Features
- **Real-time Email Monitoring**: Checks Gmail every minute for new emails
- **Smart Content Filtering**: Only processes emails containing important keywords
- **AI-Powered Summarization**: Uses GPT-4o-mini to create concise, human-readable summaries
- **Instant Telegram Notifications**: Delivers summaries directly to your preferred Telegram chat
- **Customizable Keywords**: Easily modify filters to match your specific needs

🔧 How It Works
Workflow steps:
1. Email Trigger: Continuously monitors your Gmail inbox for new messages
2. Smart Filter: Analyzes the email subject and body for important keywords (sales, jobs, etc.) — see the filter sketch at the end of this section
3. AI Processing: Sends relevant emails to OpenAI for intelligent summarization
4. Telegram Delivery: Sends the formatted summary to your Telegram chat

Sample Output:
📦 Your Flipkart order "Bluetooth Speaker" was delivered today. Enjoy!
💰 Invoice from AWS for $23.50 is due by July 20. Check billing portal.
✅ HR shared your July payslip. No action needed unless there's an error.

🛠 Setup Requirements
- Gmail account with OAuth2 credentials
- OpenAI API key
- Telegram bot token and chat ID
- n8n instance (cloud or self-hosted)

📋 Use Cases
- **Business Alerts**: Payment due notices, invoice reminders
- **E-commerce**: Order confirmations, delivery updates
- **HR Communications**: Payslips, policy updates, announcements
- **Security**: Login alerts, security notifications
- **Job Hunting**: Application responses, interview invitations

⚙️ Customization Options
- **Keyword Filters**: Add/remove keywords in the filter node (invoice, payment, security, delivery, etc.)
- **AI Prompt**: Modify the summarization style and format
- **Polling Frequency**: Adjust the email checking interval
- **Multiple Chats**: Send to different Telegram chats based on email type

🔒 Privacy & Security
- Processes emails locally through n8n
- No email content stored permanently
- Uses secure OAuth2 authentication
- Respects Gmail API rate limits

📊 Performance
- Lightweight and efficient
- Minimal resource usage
- Fast AI processing with GPT-4o-mini
- Reliable Telegram delivery

💡 Pro Tips
- Start with broad keywords and refine based on results
- Use multiple condition branches for different email types
- Set up different Telegram chats for work vs. personal emails
- Monitor your OpenAI usage to avoid unexpected costs
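The keyword filter can also be implemented as a small Code node instead of an IF node, which makes the keyword list easy to maintain. The field names below (subject, text) are assumptions about the Gmail trigger output — map them to whatever your trigger actually returns.

```javascript
// Sketch: pass through only emails whose subject or body contains a keyword.
const KEYWORDS = ["invoice", "payment", "security", "delivery", "sales", "job"];

return $input.all().filter((item) => {
  const text = `${item.json.subject ?? ""} ${item.json.text ?? ""}`.toLowerCase();
  return KEYWORDS.some((keyword) => text.includes(keyword));
});
```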
by Agent Studio
Automatically store Retell transcripts in Google Sheets/Airtable/Notion from a webhook

Overview
This workflow stores the results of a Retell voice call (transcript, analysis, etc.) once it has ended and been analyzed. It listens for call_analyzed webhook events from Retell and stores the data in Airtable, Google Sheets, and Notion (choose based on your stack).

Useful for anyone building Retell agents who wants to keep a detailed history of analyzed calls in structured tools.

Who is it for
For builders of Retell's Voice Agents who want to store call history and essential analytic data.

Prerequisites
- Have a Retell AI account
- Create a Retell agent
- Associate a phone number with your Retell agent
- Set up one of the following:
  - An Airtable base and table (example: "Transcripts")
  - A Google Sheet with a "Transcripts" tab
  - A Notion database with columns matching the transcript fields
- Templates: Airtable, Google Sheets, Notion

How it works
1. Receives a webhook POST request from Retell when a call has been analyzed.
2. Filters out any event that is not call_analyzed (Retell sends webhooks for call_started, call_ended, and call_analyzed).
3. Extracts useful fields like:
   - Call ID, start/end time, duration, total cost
   - Transcript, summary, sentiment
4. Stores this data in your preferred tool: Airtable, Google Sheets, or Notion.

How to use it
1. Copy the webhook URL (e.g., https://your-instance.app.n8n.cloud/webhook/poc-retell-analysis) and paste it into your Retell agent under "Webhook settings" → "Agent Level Webhook URL".
2. Make sure your Airtable, Google Sheets, or Notion databases are correctly configured to receive the fields.
3. After each call, once Retell finishes the analysis, this workflow will automatically log the results.

Extension
If you use any "Post-Call Analysis" fields, you can add columns to your Airtable, Google Sheets, or Notion database, then fetch the data from the call.call_analysis.custom_analysis_data object.

Additional Notes
- Phone numbers are extracted depending on the call direction (from_number or to_number).
- Cost is converted from cents to dollars before saving.
- Dates are converted from timestamps to local ISO strings.
- You can remove any of the outputs (Airtable, Google Sheets, Notion) if you're only using one.

👉 Reach out to us if you're interested in analysing your Retell Agent conversations.
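For orientation, the extraction step can be sketched as a Code node like the one below. The exact shape of Retell's call_analyzed payload may differ from what is assumed here (field names such as call_id, start_timestamp, and call_cost are assumptions) — inspect a real webhook body before relying on these paths.

```javascript
// Sketch: filter for call_analyzed events and map the fields listed above.
const payload = $input.first().json.body ?? $input.first().json;
const { event, call } = payload;

if (event !== "call_analyzed") {
  return []; // ignore call_started / call_ended events
}

const phoneNumber =
  call.direction === "inbound" ? call.from_number : call.to_number;

return [{
  json: {
    callId: call.call_id,
    phoneNumber,
    startedAt: new Date(call.start_timestamp).toISOString(),
    endedAt: new Date(call.end_timestamp).toISOString(),
    durationSeconds: (call.end_timestamp - call.start_timestamp) / 1000,
    costUsd: (call.call_cost?.combined_cost ?? 0) / 100, // cents to dollars
    transcript: call.transcript,
    summary: call.call_analysis?.call_summary,
    sentiment: call.call_analysis?.user_sentiment,
  },
}];
```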
by David Olusola
This plug-and-play n8n workflow automates medical record digitization using Mistral's OCR API and stores clean, structured data in Google Sheets. Whether you run a clinic or a healthtech product, this no-code solution simplifies data entry from scanned or uploaded medical documents.

📌 Works seamlessly on both self-hosted and cloud-based n8n environments.

👥 Who is this for?
- Hospitals and private clinics
- Healthtech platforms & startups
- Medical admin and document processing teams
- Clinical researchers and labs

😓 What problem does it solve?
- ❌ Manual entry from printed forms
- ❌ Unstructured, scattered records
- ❌ Errors in data transcription
- ❌ Inconsistent document storage
✅ This automation brings consistency, structure, and speed to the way you handle medical documents.

✅ What this workflow does
1. Captures uploaded documents through a public form
2. Uploads the file to Mistral for OCR processing
3. Extracts clean text from each page (PDF or image)
4. Parses patient fields (Name, DOB, Diagnosis, Medications, etc.)
5. Saves records into a structured Google Sheet

🛠️ Setup Instructions

Step 1: Google Sheet Prep
Create a Google Sheet with these columns (case-sensitive):
Name, Date of Birth, Patient ID, Date of Visit, Referring Physician, Department, Symptoms, Blood Pressure, Heart Rate, Temperature, Lab Results, Diagnosis, Medications, Next Appointment, Notes

Step 2: Mistral API Access
- Sign up at Mistral AI
- Get your API key
- Ensure your plan supports the file upload & OCR endpoints

Step 3: Google OAuth Credentials (Self-hosted or Cloud)
Go to n8n → Settings → Credentials, and add:
- Google Sheets OAuth2
- Scope needed: https://www.googleapis.com/auth/spreadsheets

Step 4: Import Workflow
- Go to Workflows > Import from File
- Upload your JSON file
- Replace:
  - The Google Sheet document ID in the "Google Sheets" node
  - Your Mistral API key in the HTTP Header Auth credential

Step 5: (Optional) Make the Form Public
- In cloud-based n8n, you can expose the form as a public page
- Otherwise, connect it to your website form via webhook

🧩 Customization Tips

Extract More Fields
Update the "Data cleaning" node and extend the list of fields (see the sketch at the end of this section):
const fields = ["Name", "Diagnosis", "Medications", "Symptoms", ...];

Add EHR or Database Integration
After Google Sheets, chain your custom system: PostgreSQL, Airtable, Supabase, MongoDB

Change Output Format
Want JSON or Markdown output for internal tools? Use the Set or Code node before the final output step.

🧪 Troubleshooting

| Issue | Fix |
| --- | --- |
| File upload fails | Check the Mistral API key and file type |
| Google Sheets not updating | Verify credentials and document ID |
| No data parsed | Check OCR quality; verify field labels in the document |
| Workflow not triggering | Ensure the webhook or form is configured correctly |

🌐 Self-Hosted vs Cloud Comparison

| Feature | Self-Hosted n8n | Cloud |
| --- | --- | --- |
| Public Form Access | Manual setup | Built-in |
| OAuth App Config | Required | Pre-configured |
| Storage Limits | Depends on server | Included with plan |
| Scalability | Fully customizable | Scales automatically |

📣 Getting Support
- n8n Docs
- Mistral API Docs
- n8n Community
- Or reach out to: David Olusola (dimejicole21@gmail.com)

🌟 Like this template? Give it a star in the template library and help other no-code builders discover it.

"Turn scanned documents into structured data with zero code."
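As a rough guide, the "Data cleaning" node could parse labeled fields out of the OCR text like this. It assumes the OCR output arrives as plain text in item.json.text and that each field appears on its own line in the form "Field: value", which will not hold for every scan — adapt the field list and the parsing to your documents.

```javascript
// Sketch: extract labeled patient fields from OCR text into one record.
const fields = [
  "Name", "Date of Birth", "Patient ID", "Date of Visit", "Referring Physician",
  "Department", "Symptoms", "Blood Pressure", "Heart Rate", "Temperature",
  "Lab Results", "Diagnosis", "Medications", "Next Appointment", "Notes",
];

const text = $input.first().json.text ?? "";
const record = {};

for (const field of fields) {
  // Capture everything after "Field:" up to the end of the line.
  const match = text.match(new RegExp(`${field}\\s*:\\s*(.+)`, "i"));
  record[field] = match ? match[1].trim() : "";
}

return [{ json: record }];
```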
by Yaron Been
Automated system to track and analyze the technology stacks used by target companies, helping you identify decision-makers and technology trends.

🚀 What It Does
- Tracks the technology stack of target companies
- Identifies key decision-makers (CTOs, Tech Leads)
- Monitors technology changes and updates
- Provides competitive intelligence
- Generates actionable insights

🎯 Perfect For
- B2B SaaS companies
- Technology vendors
- Sales and business development teams
- Competitive intelligence analysts
- Market researchers

⚙️ Key Benefits
✅ Identify potential customers
✅ Stay ahead of technology trends
✅ Target decision-makers effectively
✅ Monitor competitor technology stacks
✅ Data-driven sales strategies

🔧 What You Need
- BuiltWith API key
- n8n instance
- CRM integration (optional)
- Email/Slack for alerts

📊 Data Tracked
- Company technologies
- Hosting providers
- Frameworks and libraries
- Analytics tools
- Marketing technologies

🛠️ Setup & Support
- Quick Setup: Deploy in 20 minutes with our step-by-step guide
- 📺 Watch Tutorial
- 💼 Get Expert Support
- 📧 Direct Help

Gain a competitive edge by understanding the technology landscape of your target market.
by Manuel
Effortlessly optimize your workflow by automatically importing hundreds of manufacturers from a Google Sheet into your Shopware online store, saving countless hours of manual work.

How it works
1. Retrieve all manufacturers from a Google Sheet
2. Add each manufacturer to Shopware via the Shopware sync API endpoint
3. Upload a logo for each manufacturer from a provided public URL to Shopware

Set Up Steps
1. Add your Shopware URL to the first node, called Settings.
2. Create a Google Sheet in your Google account with the following columns (Demo Sheet):
   - name (the name of the manufacturer, which has to be unique and is required)
   - website (URL of the manufacturer website)
   - description
   - logo_url (public manufacturer logo URL; has to be a png, jpg or svg file)
   - translation_language_code_1 (optional. Language code of your language, for example 'es-ES' for Spanish. You have to make sure a language with this code exists in your Shopware shop.)
   - translation_name_1 (optional. Manufacturer name translated into the language you defined in translation_language_code_1)
   - translation_description_1 (optional. Manufacturer description translated into the language you defined in translation_language_code_1)
   - translation_language_code_2 (optional. Same as translation_language_code_1 for another language)
   - translation_name_2 (optional. Same as translation_name_1 for another language)
   - translation_description_2 (optional. Same as translation_description_1 for another language)
   - translation_language_code_3 (optional. Same as translation_language_code_1 for another language)
   - translation_name_3 (optional. Same as translation_name_1 for another language)
   - translation_description_3 (optional. Same as translation_description_1 for another language)
3. Connect to your Google account.
4. Connect to your Shopware account:
   - Create a Shopware Integration.
   - Connect to Shopware in the "Import Manufacturer" and "Upload Manufacturer Logo" nodes using Generic OAuth2 API authentication with grant type "Client Credentials". The Access Token URL is https://your-shopware-domain.com/api/oauth/token.
5. Run the workflow (a payload sketch for the sync call follows below).
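For orientation, the body sent to Shopware's sync endpoint for one sheet row could look roughly like this. The entity name (product_manufacturer) and field names follow the Shopware 6 sync API as commonly documented, but treat them as assumptions and verify them against your Shopware version before use.

```javascript
// Sketch: build the sync API body (POST /api/_action/sync) for one manufacturer row.
const row = $input.first().json; // one Google Sheet row

const syncPayload = {
  "write-manufacturer": {
    entity: "product_manufacturer",
    action: "upsert",
    payload: [
      {
        name: row.name,          // required, unique
        link: row.website,
        description: row.description,
      },
    ],
  },
};

return [{ json: syncPayload }];
```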
by Airtop
Scoring LinkedIn Profiles Against Your ICP

Use Case
This automation scores individual LinkedIn profiles against your Ideal Customer Profile (ICP) based on interest in AI, technical depth, and seniority level. It's ideal for prioritizing leads and understanding how well a person fits your ICP criteria.

What This Automation Does
Given a LinkedIn profile and an Airtop profile, it:
- Extracts relevant data from the person's profile
- Determines levels of AI interest, seniority, and technical depth
- Calculates an ICP score based on weighted criteria
- Returns the full enriched profile with the score

Input parameters:
- **LinkedIn Profile URL** (e.g., https://linkedin.com/in/janedoe)
- **Airtop Profile** connected to LinkedIn
- **ICP scoring method** in the Airtop node prompt

Output fields in JSON format:
- Full name, job title, employer, company LinkedIn URL, location, number of connections and followers, about section content, and more
- Calculated ICP Score (out of 100)

How It Works
1. Form Trigger or Workflow Trigger: Accepts input from either a form or another workflow.
2. Parameter Assignment: Ensures proper variable names for downstream nodes.
3. Airtop Enrichment Tool: Extracts and scores the person based on a detailed prompt.
4. Scoring: Uses this point system (see the sketch at the end of this section):
   - AI Interest: beginner (5), intermediate (10), advanced (25), expert (35)
   - Technical Depth: basic (5), intermediate (15), advanced (25), expert (35)
   - Seniority Level: junior (5), mid-level (15), senior (25), executive (30)
5. Output Formatting: Cleans and returns the result as JSON.

Setup Requirements
- IMPORTANT: Enter your ICP scoring method in the prompt field of the Airtop node
- Airtop Profile connected to LinkedIn
- Airtop API credentials configured in n8n
- Optional: a front-end form to collect profile URLs and trigger the automation

Next Steps
- **Embed in CRM**: Trigger this automation on new leads to auto-score them.
- **Batch Process Leads**: Run it over a list of profile URLs for segmentation.
- **Customize Scoring**: Adjust point weights based on your sales priorities.

Read more about Scoring LinkedIn Profiles Against Your ICP
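If you prefer to compute the score deterministically in a Code node rather than inside the Airtop prompt, the point system above translates directly into a lookup table. The input field names (ai_interest, technical_depth, seniority_level) are assumptions about the enrichment output — rename them to match what your Airtop node actually returns.

```javascript
// Sketch: weighted ICP score from the point system described above (max 100).
const WEIGHTS = {
  ai_interest:     { beginner: 5, intermediate: 10, advanced: 25, expert: 35 },
  technical_depth: { basic: 5, intermediate: 15, advanced: 25, expert: 35 },
  seniority_level: { junior: 5, "mid-level": 15, senior: 25, executive: 30 },
};

const profile = $input.first().json;

const icpScore = Object.entries(WEIGHTS).reduce((total, [field, table]) => {
  const level = String(profile[field] ?? "").toLowerCase();
  return total + (table[level] ?? 0);
}, 0);

return [{ json: { ...profile, icp_score: icpScore } }];
```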
by Martijn Smit
This workflow template helps Todoist users get a weekly overview of their completed tasks via email, making it easier to review their past week.

Why use this workflow?
Todoist doesn't provide completed task reports or filters in its built-in reports or n8n app. This workflow solves that by using Todoist's public API to fetch your completed tasks.

How it works
- Runs every Friday afternoon (or manually).
- Uses the Todoist public API to retrieve completed tasks.
- Excludes specific projects you set (e.g., a grocery list).
- Sends an email summary, grouping tasks by the day they were completed.

Set up steps
1. Copy your Todoist API token (found here).
2. Create a Todoist API credential in n8n.
3. Create an SMTP credential in n8n. Alternatively, use a preferred email service like Brevo, Mailjet, etc.
4. Import this workflow template.
5. In the Get completed tasks via Todoist API step, select your Todoist API credential.
6. In the Send Email step:
   - Select your SMTP credential.
   - Set the sender and recipient email addresses.
7. Run the workflow manually and check your inbox!

Ignoring specific projects
If you do not want your grocery list, workouts, or other tasks from specific Todoist projects showing up in your weekly summary, modify the step called Optional: Ignore specific projects and change this line:
const ignoredProjects = ['2335544024'];
This should be an array with the id of each project you'd like to ignore. You can find a list of your projects (inc. their IDs) by visiting this link: https://api.todoist.com/rest/v2/projects
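The same step can be extended to do the day-by-day grouping that the email summary uses. This sketch assumes each completed task carries project_id, completed_at, and content fields as returned by Todoist's completed-tasks endpoint — check the field names against a real API response before relying on them.

```javascript
// Sketch: drop ignored projects and group completed tasks by completion day.
const ignoredProjects = ['2335544024']; // project IDs to exclude, e.g. your grocery list

const byDay = {};
for (const item of $input.all()) {
  const task = item.json;
  if (ignoredProjects.includes(String(task.project_id))) continue;

  const day = String(task.completed_at).slice(0, 10); // YYYY-MM-DD
  (byDay[day] ??= []).push(task.content);
}

return [{ json: { byDay } }];
```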
by Lucas Peyrin
How it works
This workflow demonstrates a fundamental pattern for securing a webhook by requiring an API key. It acts as a gatekeeper, checking for a valid key in the request header before allowing the request to proceed.

1. Incoming Request: The Secured Webhook node receives an incoming POST request. It expects an API key to be sent in the x-api-key header.
2. API Key Verification: The Check API Key node takes the key from the incoming request's header. It then makes an internal HTTP request to a second webhook (Get API Key) which acts as a mock database. This second webhook retrieves a list of registered API keys (from the Registered API Keys node) and filters it to find a match for the key that was provided.
3. Conditional Response:
   - If a match is found, the API Key Identified node routes the execution to the "success" path, returning a 200 OK response with the identified user's ID.
   - If no match is found, it routes to the "unauthorized" path, returning a 401 Unauthorized error.

This pattern separates the public-facing endpoint from the data source, which is a good security practice.

Set up steps
Setup time: ~2 minutes

This workflow is designed to be a self-contained example.
1. Set up Credentials: This workflow uses "Header Auth" for its internal communication. Go to Credentials and create a new Header Auth credential. You can use any name and value (e.g., Name: X-N8N-Auth, Value: my-secret-password). Select this credential in all four webhook/HTTP Request nodes.
2. Add Your API Keys: Open the Registered API Keys node. This is your mock database. Edit the array to include the user_id and api_key pairs you want to authorize.
3. Activate the workflow.
4. Test it: Use the Test Secure Webhook node to send a request. Try it with a valid key from your list to see the success response. Change the x-api-key header to an invalid key to see the 401 Unauthorized error (a client-side sketch follows below).
5. For Production: Replace the mock database part of this workflow (the Get API Key webhook and Registered API Keys node) with a real database node like Supabase, Postgres, or Baserow to look up keys.
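You can also exercise the secured webhook from outside n8n. This is a minimal client sketch for Node.js 18+; the URL and keys are placeholders — use your own webhook path and one of the api_key values from the Registered API Keys node.

```javascript
// Sketch: call the secured webhook with and without a valid x-api-key header.
const WEBHOOK_URL = "https://your-n8n-instance.example.com/webhook/secured-endpoint";

async function callSecuredWebhook(apiKey) {
  const res = await fetch(WEBHOOK_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-api-key": apiKey,
    },
    body: JSON.stringify({ hello: "world" }),
  });
  console.log(res.status, await res.text()); // 200 with user_id, or 401 Unauthorized
}

callSecuredWebhook("my-valid-api-key"); // expect 200 if the key is registered
callSecuredWebhook("not-a-real-key");   // expect 401 Unauthorized
```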