by EmailListVerify
This workflow allows you to:

- scrape Google Maps data using SerpAPI
- discover generic email addresses (like contact@) using the EmailListVerify API

Who's it for

This template is designed to prepare cold outreach for local businesses like restaurants or hotels (you need to target a type of business that is listed on Google Maps). It will generate a list of leads with phone numbers and email addresses.

The email addresses you will get are generic, like contact@. This isn't a problem if you are targeting small businesses, as the owner will most likely monitor those inboxes. However, if your ideal customer profile has more than 20 employees, I do not recommend using those email addresses for cold outreach.

Requirements

This template uses:

- Google Sheets to handle input and output data
- SerpAPI to scrape Google Maps (250 searches/month on the free plan)
- EmailListVerify to discover emails (from $0.05 per email)

Notes

This template is an extension of Lucas Perret's template, adding an email discovery module. If there is some interest in it, I can make a similar template using Apify as an alternative to SerpAPI for Google Maps scraping.
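As an illustrative sketch (not the template's exact code), the lead-extraction step can be pictured as a small function that maps SerpAPI's `local_results` to lead rows and guesses a generic contact@ address from each business website, to be validated by EmailListVerify afterwards:

```javascript
// Sketch, assuming SerpAPI's Google Maps "local_results" field names
// (title, phone, website); adjust to the actual response you receive.
function extractLeads(serpResponse) {
  return (serpResponse.local_results || [])
    .filter((r) => r.website) // an email can only be guessed if a site exists
    .map((r) => {
      const domain = new URL(r.website).hostname.replace(/^www\./, "");
      return {
        name: r.title,
        phone: r.phone || null,
        website: r.website,
        // Generic address — verify it with EmailListVerify before use
        guessedEmail: `contact@${domain}`,
      };
    });
}
```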
by Marth
How It Works: The 5-Node Anomaly Detection Flow

This workflow efficiently processes logs to detect anomalies.

1. Scheduled Check (Cron Node): The primary trigger. It runs the workflow at a defined interval (e.g., every 15 minutes), ensuring logs are routinely scanned for suspicious activity.
2. Fetch Logs (HTTP Request Node): Retrieves logs from an external source by sending a request to your log API endpoint for a batch of the most recent logs.
3. Count Failed Logins (Code Node): The core of the detection logic. The JavaScript code filters the logs for a specific event ("login_failure"), counts the total, and identifies the unique IPs involved. This information is then passed to the next node.
4. Failed Logins > Threshold? (If Node): The final filter. It checks whether the number of failed logins exceeds a threshold you set (e.g., more than 5 attempts). If it does, the workflow is routed to the notification node; if not, the workflow ends safely.
5. Send Anomaly Alert (Slack Node): Sends an alert to your team if an anomaly is detected. The Slack message includes a summary of the anomaly, such as the number of failed attempts and the IPs involved, enabling a swift response.

How to Set Up

Implementing this essential log anomaly detector in your n8n instance is quick and straightforward.

1. Prepare Your Credentials & API:
   - Log API: Make sure you have an API endpoint or another way to get logs from your system (e.g., a server, CMS, or application). The logs should be in JSON format, and you'll need any necessary API keys or tokens.
   - Slack Credential: Set up a Slack credential in n8n and get the Channel ID of your security alert channel (e.g., #security-alerts).
2. Import the Workflow JSON: Create a new workflow in n8n, choose "Import from JSON," and paste the workflow's JSON.
3. Configure the Nodes:
   - Scheduled Check (Cron): Set the schedule according to your preference (e.g., every 15 minutes).
   - Fetch Logs (HTTP Request): Update the URL and header/authentication settings to match your specific log API endpoint.
   - Count Failed Logins (Code): Verify that the JavaScript code matches your log's JSON format. You may need to adjust log.event === 'login_failure' if your log events use a different name.
   - Failed Logins > Threshold? (If): Adjust the threshold value (e.g., 5) based on your risk tolerance.
   - Send Anomaly Alert (Slack): Select your Slack credential and enter the correct Channel ID.
4. Test and Activate:
   - Manual Test: Run the workflow manually to confirm it fetches logs and processes them correctly. You can temporarily lower the threshold to 0 to ensure the alert is triggered.
   - Verify Output: Check your Slack channel to confirm that alerts are formatted and sent correctly.
   - Activate: Once you're confident in its function, activate the workflow. n8n will now automatically monitor your logs on the schedule you set.
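The detection logic described above can be sketched as a small function — filter on the event name, count the matches, deduplicate the IPs. The log shape ({ event, ip }) is an assumption; adjust the field names to whatever your log API actually returns.

```javascript
// Minimal sketch of the "Count Failed Logins" Code-node logic.
// Assumes each log entry looks like { event: "...", ip: "..." }.
function countFailedLogins(logs) {
  const failures = logs.filter((log) => log.event === "login_failure");
  // A Set collapses repeated IPs into a unique list
  const uniqueIps = [...new Set(failures.map((log) => log.ip))];
  return { failedCount: failures.length, uniqueIps };
}
```

The If node then only needs to compare `failedCount` against your threshold.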
by Meelioo
How it Works

This workflow creates automated daily backups of your n8n workflows to a GitLab repository:

1. Scheduled Trigger - Runs automatically at noon each day to initiate the backup process
2. Fetch Workflows - Retrieves all active workflows from your n8n instance, filtering out archived ones
3. Compare & Process - Checks existing files in GitLab and compares them with current workflows
4. Smart Upload - For each workflow, either updates the existing file in GitLab (if it exists) or creates a new one
5. Notification System - Sends success/failure notifications to a designated Slack channel with execution details

> The workflow intelligently handles each file individually, cleaning up unnecessary metadata before converting workflows to formatted JSON files ready for version control.

Set up Steps

Estimated setup time: 15-20 minutes

You'll need to configure three credential connections and customize the Configuration node:

- **GitLab API**: Create a project access token with write permissions to your backup repository
- **n8n Internal API**: Generate an API key from your n8n user settings
- **Slack Bot**: Set up a Slack app with bot token permissions for posting messages to your notification channel

> Once credentials are configured, update the Configuration node with your GitLab project owner, repository name, and target branch. The workflow includes detailed setup instructions in the sticky notes for each credential type. After setup, activate the workflow to begin daily automated backups.
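The "cleaning up unnecessary metadata" step can be sketched as follows. The exact fields stripped here (id, timestamps, versionId) are an assumption, not the template's actual code — the idea is to drop volatile fields so GitLab diffs only show real changes:

```javascript
// Hedged sketch: strip volatile metadata from an exported workflow and
// pretty-print it so the file diffs cleanly under version control.
function toBackupJson(workflow) {
  const { id, createdAt, updatedAt, versionId, ...rest } = workflow;
  return JSON.stringify(rest, null, 2); // 2-space indent for readable diffs
}
```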
by Harsh Maniya
Build an AI Research Assistant for WhatsApp with Perplexity and Claude 💡

Ever wished you could get a deep, multi-source research report on any topic, delivered directly to your WhatsApp chat in seconds? This workflow transforms your WhatsApp into a powerful, on-demand research assistant, perfect for students, professionals, and curious minds.

Leveraging the deep research capabilities of Perplexity, the nuanced formatting skills of Anthropic's Claude, and the messaging power of Twilio, this workflow automates the entire process from query to polished answer. Ask it anything, and receive a well-structured, easy-to-read summary moments later.

What This Workflow Does 🚀

- 📲 Listens for Incoming Queries: The workflow starts when a user sends a message to your configured Twilio WhatsApp number.
- 🧠 Performs Deep Research: It takes the user's query and feeds it to the Perplexity node, which uses the sonar-pro model to conduct a comprehensive, multi-source analysis of the topic.
- 🎨 Polishes the Content: The raw, detailed research report from Perplexity is then passed to an Anthropic Claude model. This crucial step refines the text, adds engaging emojis, ensures it stays under the WhatsApp character limit, and formats it perfectly for mobile viewing.
- 💬 Sends an Instant Reply: The final, beautifully formatted summary is sent directly back to the user's WhatsApp chat via Twilio, completing the request.

Nodes Used 🛠️

- Webhook: To receive the initial message from Twilio and trigger the workflow.
- Perplexity: To perform the AI-powered deep research on the user's query.
- Anthropic (via LangChain): To connect to and use the Claude model for reformatting and polishing the content.
- Twilio: To send the final, formatted message back to the user on WhatsApp.

How to Set Up This Workflow ⚙️

This workflow requires careful setup between n8n and Twilio to function correctly. Follow these steps closely.

1. Prerequisites ✅
   You will need accounts for the following services: n8n (Cloud or self-hosted), Twilio, Perplexity AI (for an API key), and Anthropic (for a Claude API key).
2. Configure Credentials 🔑
   In your n8n instance, add your API keys for Twilio, Perplexity, and Anthropic. You can add credentials in n8n by going to the Credentials tab in the left-hand menu. Learn more about managing credentials in n8n.
3. Set Up Your Twilio WhatsApp Number 📱
   Log in to your Twilio account. Either purchase a phone number that is WhatsApp-enabled or use the free Twilio Sandbox for WhatsApp for testing. Follow Twilio's guide to connect your number or sandbox to the WhatsApp Business API. Read Twilio's Getting Started with WhatsApp Guide.
4. Expose Your n8n Webhook URL 🌐
   For Twilio to communicate with n8n, your n8n webhook URL must be publicly accessible. Open the Fetch Whatsapp Request (Webhook) node in the workflow. You will see two URLs: Test and Production. For this guide, we will use the Test URL. If you are running n8n locally or on a private server, you must expose this URL to the public internet. You can do this easily using n8n's built-in tunnel feature: start n8n from your computer's command line with n8n start --tunnel. After starting, n8n will provide you with a public "tunnel URL" that looks something like https://[subdomain].hooks.n8n.cloud/. Copy this public tunnel URL. Learn more about tunneling in the n8n docs.
5. Connect Twilio to Your n8n Webhook 🔗
   In your Twilio Console, navigate to the settings for the phone number (or sandbox) you configured in Step 3. Scroll down to the Messaging section and find the field labeled "A MESSAGE COMES IN". Paste your n8n Test URL (or your public tunnel URL from the previous step) into this field. Ensure the dropdown next to it is set to HTTP POST, then click Save.
6. Activate and Test Your Workflow ▶️
   Go back to your n8n canvas and click "Test workflow" in the top right corner. This will put your webhook in a "listening" state. Now, send a message from your personal WhatsApp to your configured Twilio WhatsApp number. You should see the workflow execute successfully in n8n and receive a formatted reply on WhatsApp! Once you've confirmed it works, save and activate the workflow to have it run permanently.

Further Reading 📚

- n8n Webhook Node Documentation
- n8n Twilio Node Documentation
- Perplexity API Documentation
- Anthropic API Documentation
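Since Claude is only *asked* to stay under the character limit, a small length guard before the Twilio send step is a cheap safety net. This is a sketch, not part of the template's nodes; 1600 characters matches Twilio's documented message body cap, but verify the current limit for your account:

```javascript
// Safety-net sketch: hard-truncate the polished text if the model
// overshoots, so the Twilio API call doesn't fail on an oversized body.
function fitForWhatsApp(text, limit = 1600) {
  if (text.length <= limit) return text;
  // Truncate and append an ellipsis so the cut is visible to the reader
  return text.slice(0, limit - 1).trimEnd() + "…";
}
```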
by Luis Acosta
🎧 Upload Podcast Episodes to Spotify via RSS & Google Drive

Skip the manual steps and publish your podcast episodes to Spotify in minutes — fully automated. This workflow takes your finished audio, uploads it to Google Drive, updates your podcast's RSS feed in GitHub, and pushes it live on Spotify and other platforms linked to that feed. No more copy-pasting links or manually editing XML files — everything happens in one click.

It's perfect for podcasters who already have an RSS feed connected to Spotify for Podcasters and want a repeatable, hands-free publishing process.

💡 What this workflow does

- ✅ Reads your finished MP3 from a local path or a previous automation step
- ☁️ Uploads the audio to Google Drive and creates a public share link
- 📄 Fetches your existing rss.xml file from GitHub
- ➕ Appends a new <item> entry with title, description, publication date, and MP3 link
- 🔄 Commits the updated RSS file back to GitHub, triggering updates on Spotify
- 🎯 Ensures your episode appears on Spotify once your RSS is already linked in Spotify for Podcasters

🛠 What you'll need

- A Google Drive account with OAuth credentials and a target folder ID
- A GitHub repository containing your rss.xml file
- An RSS feed connected to Spotify for Podcasters (set this up once before running the workflow)
- An MP3 file that meets Spotify's audio format requirements

✨ Use cases

- Automate weekly or daily podcast publishing to Spotify
- Push your AI-generated podcast episodes live without manual editing
- Maintain a single source of truth for your feed in GitHub while distributing across multiple platforms

📬 Contact & Feedback

Need help customizing this? Have ideas for improvement?
📩 Luis.acosta@news2podcast.com
Or DM me on Twitter @guanchehacker

If you're building something more advanced with audio + AI, like fully automated podcast creation and publishing, let's talk — I might have the missing piece you need.
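The "append a new <item>" step can be sketched like this: build the entry and splice it in before the closing </channel> tag. The tag choices follow common podcast-RSS practice, not necessarily this template's exact output, and real-world titles/descriptions would need XML-escaping:

```javascript
// Sketch: build a podcast RSS <item> for the new episode.
// Values are inserted as-is — escape &, <, > in real feeds.
function buildRssItem({ title, description, pubDate, mp3Url }) {
  return [
    "  <item>",
    `    <title>${title}</title>`,
    `    <description>${description}</description>`,
    `    <pubDate>${pubDate}</pubDate>`,
    `    <enclosure url="${mp3Url}" type="audio/mpeg" />`,
    `    <guid>${mp3Url}</guid>`,
    "  </item>",
  ].join("\n");
}

// Insert the new item just before the feed's closing </channel> tag
function appendItem(rssXml, item) {
  return rssXml.replace("</channel>", `${item}\n</channel>`);
}
```

Committing the resulting XML back to GitHub is what makes Spotify pick up the new episode on its next feed poll.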
by DataMinex
Dynamic Search Interface with Elasticsearch and Automated Report Generation

🎯 What this workflow does

This template creates a comprehensive data search and reporting system that allows users to query large datasets through an intuitive web form interface. The system performs real-time searches against Elasticsearch, processes results, and automatically generates structured reports in multiple formats for data analysis and business intelligence.

Key Features:
- 🔍 Interactive web form for dynamic data querying
- ⚡ Real-time Elasticsearch data retrieval with complex filtering
- 📊 Auto-generated reports (Text & CSV formats) with custom formatting
- 💾 Automatic file storage system for data persistence
- 🎯 Configurable search parameters (amounts, time ranges, entity filters)
- 🔧 Scalable architecture for handling large datasets

🛠️ Setup requirements

Prerequisites
- **Elasticsearch cluster** running on https://localhost:9220
- **Transaction dataset** indexed in the bank_transactions index
- **Sample dataset**: Download from the Bank Transaction Dataset
- **File system access** to the /tmp/ directory for report storage
- **HTTP Basic Authentication** credentials for Elasticsearch

Required Elasticsearch Index Structure

This template uses the Bank Transaction Dataset from GitHub: https://github.com/dataminexcode/n8n-workflow/blob/main/Dynamic%20Search%20Interface%20with%20Elasticsearch%20and%20Automated%20Report%20Generation/data

You can use the provided Python script for importing the CSV file into Elasticsearch.

Your bank_transactions index should contain documents with these fields:

{
  "transaction_id": "TXN_123456789",
  "customer_id": "CUST_000001",
  "amount": 5000,
  "merchant_category": "grocery_net",
  "timestamp": "2025-08-10T15:30:00Z"
}

Dataset Info: This dataset contains realistic financial transaction data — over 1 million transaction records with various transaction patterns and data types — well suited for testing search algorithms and report generation.
Credentials Setup
1. Create HTTP Basic Auth credentials in n8n
2. Configure them with your Elasticsearch username/password
3. Assign them to the "Search Elasticsearch" node

⚙️ Configuration

1. Form Customization
- **Webhook Path**: Update the webhook ID if needed
- **Form Fields**: Modify amounts, time ranges, or add new filters
- **Validation**: Adjust required fields based on your needs

2. Elasticsearch Configuration
- **URL**: Change localhost:9220 to your ES cluster endpoint
- **Index Name**: Update bank_transactions to your index name
- **Query Logic**: Modify search criteria in the "Build Search Query" node
- **Result Limit**: Adjust the size: 100 parameter for more/fewer results

3. File Storage
- **Directory**: Change /tmp/ to your preferred storage location
- **Filename Pattern**: Modify the fraud_report_YYYY-MM-DD.{ext} format
- **Permissions**: Ensure n8n has write access to the target directory

4. Report Formatting
- **CSV Headers**: Customize column names in the Format Report node
- **Text Layout**: Modify the report template for your organization
- **Data Fields**: Add/remove transaction fields as needed

🚀 How to use

For Administrators:
1. Import this workflow template
2. Configure Elasticsearch credentials
3. Activate the workflow
4. Share the webhook URL with data analysts

For Data Analysts:
1. Access the search interface via the webhook URL
2. Set parameters: minimum amount, time range, entity filter
3. Choose a format: text report or CSV export
4. Submit the form to generate an instant data report
5. Review results in the generated file

Sample Use Cases:
- **Data analysis**: Search for transactions > $10,000 in the last 24 hours
- **Entity investigation**: Filter all activity for a specific customer ID
- **Pattern analysis**: Quick analysis of transaction activity patterns
- **Business reporting**: Generate CSV exports for business intelligence
- **Dataset testing**: Perfect for testing with the transaction dataset

📊 Sample Output

Text Report Format:

DATA ANALYSIS REPORT
Search Criteria:
  Minimum Amount: $10000
  Time Range: Last 24 Hours
  Customer: All
Results: 3 transactions found
TRANSACTIONS:
  Transaction ID: TXN_123456789
  Customer: CUST_000001
  Amount: $15000
  Merchant: grocery_net
  Time: 2025-08-10T15:30:00Z

CSV Export Format:

Transaction_ID,Customer_ID,Amount,Merchant_Category,Timestamp
"TXN_123456789","CUST_000001",15000,"grocery_net","2025-08-10T15:30:00Z"

🔧 Customization ideas

Enhanced Analytics Features:
- Add data validation and quality checks
- Implement statistical analysis (averages, trends, patterns)
- Include data visualization charts and graphs
- Generate summary metrics and KPIs

Advanced Search Capabilities:
- Multi-field search with complex boolean logic
- Fuzzy search and text matching algorithms
- Date range filtering with custom periods
- Aggregation queries for data grouping

Integration Options:
- **Email notifications**: Alert teams of significant data findings
- **Slack integration**: Post analytics results to team channels
- **Dashboard updates**: Push metrics to business intelligence systems
- **API endpoints**: Expose search functionality as a REST API

Report Enhancements:
- **PDF generation**: Create formatted PDF analytics reports
- **Data visualization**: Add charts, graphs, and trending analysis
- **Executive summaries**: Include key metrics and business insights
- **Export formats**: Support for Excel, JSON, and other data formats

🏷️ Tags

elasticsearch, data-search, reporting, analytics, automation, business-intelligence, data-processing, csv-export

📈 Use cases

- **Business Intelligence**: Organizations analyzing transaction patterns and trends
- **E-commerce Analytics**: Detecting payment patterns and analyzing customer behavior
- **Data Science**: Real-time data exploration and pattern recognition systems
- **Operations Teams**: Automated reporting and data monitoring workflows
- **Research & Development**: Testing search algorithms and data processing techniques
- **Training & Education**: Learning Elasticsearch integration with realistic datasets
- **Financial Technology**: Transaction data analysis and business reporting systems

⚠️ Important notes

Security Considerations:
- Never expose Elasticsearch credentials in logs or form data
- Implement proper access controls for the webhook URL
- Consider encryption for sensitive data processing
- Regularly audit generated reports and access logs

Performance Tips:
- Index optimization improves search response times
- Consider pagination for large result sets
- Monitor Elasticsearch cluster performance under load
- Archive old reports to manage disk usage

Data Management:
- Ensure data retention policies align with business requirements
- Implement audit trails for all search operations
- Consider data privacy requirements when processing datasets
- Document all configuration changes for maintenance

This template provides a production-ready data search and reporting system that can be easily customized for various data analysis needs. The modular design allows for incremental enhancements while maintaining core search and reporting functionality.
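As a hedged sketch of what the "Build Search Query" node assembles (not the template's exact code), the form fields map onto an Elasticsearch bool query with range and term filters, using the index fields shown above:

```javascript
// Sketch: turn the form inputs (minimum amount, time range in hours,
// optional customer ID) into an Elasticsearch query body.
function buildSearchQuery({ minAmount, hours, customerId }) {
  const filter = [
    { range: { amount: { gte: minAmount } } },
    // "now-24h" style date math is native to Elasticsearch range queries
    { range: { timestamp: { gte: `now-${hours}h` } } },
  ];
  if (customerId) {
    filter.push({ term: { customer_id: customerId } });
  }
  return { size: 100, query: { bool: { filter } } };
}
```

The returned object is what gets POSTed to `bank_transactions/_search` with HTTP Basic Auth.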
by spencer owen
How it works

Uses the rentcast.io API to get the approximate value of real estate, then updates the asset in YNAB.

Setup

1. Get a Rentcast.io API key
2. Get a YNAB API key
3. Get your YNAB Budget ID and Account ID. This can be done by navigating to your budget in the browser, then extracting the IDs from the URL https://app.ynab.com/XXXX/accounts/YYYY (xxxx = Budget ID, yyyy = Account ID)
4. If you don't already have an account to track your property, create a new Unbudgeted tracking asset
5. Set the variables in the 'Set Fields' node (or set up a subworkflow if you have multiple properties)

| Variable | Explanation | Example |
| --- | --- | --- |
| rentcast_api | API key for Rentcast | |
| ynab_api | API key for YNAB | |
| address | Exact address. It's recommended to look it up in Rentcast first, since they use non-standard values like 'srt', 'ave', etc. | 1600 Pennsylvania Ave NW, Washington, DC 20500 |
| propertyType | One of 'Single Family', 'Condo', 'Apartment'; see the API docs for all options | Single Family |
| bedrooms | Number of bedrooms (whole number) | 3 |
| bathrooms | Number of bathrooms; fractions (2.5) are probably supported but haven't been tested | 2 |
| squareFootage | Total square feet | 1500 |
| ynab_budget | Budget ID (derived from the URL) | xxxx |
| ynab_account | Account ID (derived from the URL) | yyyy |
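The ID extraction in step 3 can be sketched as a small helper — the YNAB web-app URL has the shape https://app.ynab.com/&lt;budgetId&gt;/accounts/&lt;accountId&gt;, so the two IDs are just path segments:

```javascript
// Sketch: pull the Budget ID and Account ID out of a YNAB browser URL.
function parseYnabUrl(url) {
  const parts = new URL(url).pathname.split("/").filter(Boolean);
  // parts = [budgetId, "accounts", accountId]
  return { budgetId: parts[0], accountId: parts[2] };
}
```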
by Evilasio Ferreira
This workflow allows users to validate an address and generate a Street View image using Google Maps APIs. It starts with a simple form where the user enters an address. The workflow validates the input using the Geocoding API, extracts the coordinates, and then requests a Street View image for that location. If the image is available, it is stored in Google Drive and presented through a dynamic HTML response page. The workflow also includes error handling to manage invalid addresses or locations where Street View is not available, ensuring a reliable and user-friendly experience.

How it works

1. A Form Trigger collects the address input from the user
2. The address is validated using the Google Geocoding API
3. Latitude and longitude are extracted from the response
4. The workflow requests a Street View image using the coordinates
5. The image availability is validated
6. The image is uploaded to Google Drive
7. A response page is generated (success or error)

Setup steps

1. Create a Google Maps API key (Geocoding + Street View enabled)
2. Add your API key to the workflow (or use environment variables)
3. Connect your Google Drive credentials
4. Adjust image parameters if needed (size, resolution, etc.)
5. Activate the workflow and test using the form

After setup, users can input any address and receive a validated result with a corresponding Street View image, or a clear error message if the request cannot be fulfilled.
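The image-request and availability-check steps can be sketched as URL construction against the Street View Static API. The metadata endpoint is a no-cost call whose status field ("OK" vs "ZERO_RESULTS") indicates whether imagery exists before the image itself is fetched; the size default here is an assumption:

```javascript
// Sketch: given coordinates from the Geocoding API, build the two
// Street View Static API URLs used by the workflow's HTTP Request nodes.
function streetViewUrls({ lat, lng, size = "640x480", key }) {
  const base = "https://maps.googleapis.com/maps/api/streetview";
  const loc = `${lat},${lng}`;
  return {
    // Check this first: its JSON "status" tells you if imagery exists
    metadataUrl: `${base}/metadata?location=${loc}&key=${key}`,
    imageUrl: `${base}?size=${size}&location=${loc}&key=${key}`,
  };
}
```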
by higashiyama
Personal Daily Morning Briefing Automation

Who's it for

Busy professionals who want a quick daily update combining their calendar, weather, and top news.

How it works

Every morning at 7 AM, this workflow gathers:
- Today's Google Calendar events
- Current weather for Tokyo
- Top 3 news headlines (from Google News RSS)

Then it formats everything into a single Slack message.

How to set up

1. Connect your Google Calendar and Slack accounts in the Credentials section.
2. Update rssUrl or weatherApiUrl if you want different sources.
3. Set your Slack channel in the "Post to Slack" node.

Requirements

- Google Calendar and Slack accounts
- RSS feed and weather API (no authentication required)

How to customize

You can modify:
- The trigger time (in the Schedule Trigger node)
- The city for the weather
- The RSS feed source
- The message format in the "Format Briefing Message" node
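As an illustrative sketch of the "Format Briefing Message" step (the input shapes are assumptions — match them to what your Calendar, weather, and RSS nodes actually output), the three sources merge into one Slack-markdown text block:

```javascript
// Sketch: combine calendar events, a weather string, and headlines
// into a single Slack-friendly message.
function formatBriefing({ events, weather, headlines }) {
  const eventLines = events.length
    ? events.map((e) => `• ${e.start} ${e.title}`).join("\n")
    : "No events today 🎉";
  // Keep only the top 3 headlines, numbered
  const newsLines = headlines
    .slice(0, 3)
    .map((h, i) => `${i + 1}. ${h}`)
    .join("\n");
  return `☀️ *Good morning!*\n\n*Weather (Tokyo):* ${weather}\n\n*Today's events:*\n${eventLines}\n\n*Top news:*\n${newsLines}`;
}
```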
by dave
Temporary solution using the undocumented REST API for backups with file versioning (Nextcloud)
by Hermilio
Executes scheduled routines and triggers alerts via Telegram
by isa024787bel
Automatically add expense receipts to Google Sheets with Telegram, the Mindee API, Twilio, and n8n.