by Arthur Braghetto
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Clean Web Content Extraction with Anti-Bot Fallback

Extract clean, structured text from any webpage, with an optional fallback to an anti-bot scraping service. Ideal for AI tools and content workflows.

🧠 How it Works
This sub-workflow enables reliable, clean scraping of any public webpage by simply passing a url parameter. It is designed to be embedded into other workflows or used as a tool for AI agents. It supports two output modes (illustrated in the sketch below):
- **fulltext: true** — returns { title, text } with the full page content
- **fulltext: false** — returns { title, url, content } with a short excerpt

💡 If the site is protected by anti-bot systems (like Cloudflare), the workflow will automatically fall back to Scrape.do, a scraping API with a generous free plan.

🧩 This template requires the n8n-nodes-webpage-content-extractor community node, so it only works in self-hosted n8n environments.

🚀 Use Cases
- As a reusable sub-workflow, via the Execute Sub-workflow node.
- As a tool for an AI Agent, compatible with the Call n8n Workflow Tool.
- Perfect for chatbots, summarization workflows, or RSS/feed enrichment.
- Empowers your AI Agent with the ability to browse and extract readable content from websites automatically.

🔖 Parameters
- url (string): the webpage URL to scrape
- fulltext (boolean): set true for full page content, false for summarized output

⚙️ Setup
1. Install the community node n8n-nodes-webpage-content-extractor in your self-hosted n8n instance.
2. Create a free account at Scrape.do and obtain your API token.
3. In the workflow, locate the Scrape.do HTTP Request node and configure the credentials using your API token.

Detailed step-by-step instructions are available in the workflow notes. The Scrape.do API is only used as a fallback when conventional scraping fails, helping you preserve your API credits.
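For reference, a minimal sketch of the two response shapes described above, with placeholder values:

```js
// Illustrative response shapes for the two output modes (placeholder values).
const fullTextResponse = {
  title: "Example Page",
  text: "Full readable page content...",
};

const excerptResponse = {
  title: "Example Page",
  url: "https://example.com/page",
  content: "A short excerpt of the page...",
};
```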
by Federico De Ponte
🔁 Loop & Optimize Meta Tags with Google Gemini

This workflow automates the shortening of meta titles and descriptions for SEO—directly from your Google Sheet, row by row, using Google Gemini.

✅ What it does
- Reads rows from a Google Sheet (meta_title, meta_description, row_index)
- Loops through each row and checks if content exists
- Sends the data to Google Gemini for length-optimized output
- Cleans and parses the response (see the sketch below)
- Updates the original sheet with the shortened results

🛠️ Setup Requirements
- Google Sheets (OAuth2 credentials connected in n8n)
- Google Gemini API key (configured in n8n credentials)
- Sheet must contain: row_index, meta_title, meta_description
- Output will be written into: meta_titleFixed, meta_descriptionFixed
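The cleanup-and-parse step might look like the following Code-node sketch, assuming Gemini was prompted to return JSON and may wrap it in markdown code fences (the input/output field names here are hypothetical):

```js
// Hypothetical cleanup/parsing for the Gemini response in an n8n Code node.
// Assumes the model returns JSON, possibly wrapped in markdown code fences.
const raw = $json.text ?? '';                            // output field depends on the Gemini node
const cleaned = raw.replace(/```(?:json)?/g, '').trim(); // strip markdown fences
const parsed = JSON.parse(cleaned);

return [{
  json: {
    row_index: $json.row_index,
    meta_titleFixed: parsed.meta_title,
    meta_descriptionFixed: parsed.meta_description,
  },
}];
```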
by David Olusola
🤖 AI-Powered Lead Enrichment with Explorium MCP & Telegram

Who it's for
Sales reps, agencies, and growth teams who want to turn basic company info into qualified leads with automated research. Perfect for B2B prospecting.

What it does
This workflow lets you send a company name or domain via Telegram, and instantly returns:
✅ An enriched company profile (industry, size, tech, pain points)
✅ A clean, structured JSON — ready for your CRM or sales tools

How it works
💬 Send company info to your Telegram bot
🔎 The workflow pulls data from Explorium MCP + Tavily
🧠 AI analyzes the business model, tools, pain points & goals
📤 A JSON response is sent back via Telegram or logged to your database (an illustrative shape is sketched below)

Requirements
🔐 OpenAI API (GPT-4)
🧠 Explorium MCP API
🌐 Tavily Web Search API
🤖 Telegram Bot API
🗃️ PostgreSQL (for memory/logging)

How to set up
1. Add API keys in n8n
2. Connect your Telegram bot to the webhook
3. Set up PostgreSQL for memory persistence
4. Customize prompts (tone, niche, etc.)
5. Test by sending a company name via Telegram

Customization Options
🎯 Focus enrichment on specific industries or keywords
💬 Adjust the email sequence structure & style
🧩 Add extra data sources (e.g. Clearbit, Crunchbase)
🧾 Format the JSON to match your CRM schema
⚙️ Add an approval step before sending emails

Highlights
✅ Uses multi-source enrichment
✅ Works 100% from Telegram
✅ Integrates into any sales pipeline
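The exact schema is up to your prompt and CRM; a hypothetical enriched-profile payload might look like this (all field names and values are illustrative):

```js
// Hypothetical enriched-company JSON returned via Telegram or logged to PostgreSQL.
const enrichedLead = {
  company: "Acme Corp",
  domain: "acme.com",
  industry: "Manufacturing",
  size: "201-500 employees",
  tech_stack: ["Salesforce", "AWS"],
  pain_points: ["manual quoting", "long sales cycles"],
  goals: ["expand into the EU market"],
};
```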
by explorium
Salesforce Lead Enrichment with Explorium

Template
Download the following JSON file and import it into a new n8n workflow: salesforce_Workflow.json

Overview
This n8n workflow monitors your Salesforce instance for new leads and automatically enriches them with missing contact information. When a lead is created, the workflow:
1. Detects the new lead via the Salesforce trigger
2. Matches the lead against Explorium's database using name and company
3. Enriches the lead with professional email addresses and phone numbers
4. Updates the Salesforce lead record with the discovered contact information

This automation ensures your sales team always has the most up-to-date contact information for new leads, improving reach rates and accelerating the sales process.

Key Features
- **Real-time Processing**: Triggers automatically when new leads are created in Salesforce
- **Intelligent Matching**: Uses lead name and company to find the correct person in Explorium's database
- **Contact Enrichment**: Adds professional emails, mobile phones, and office phone numbers
- **Batch Processing**: Efficiently handles multiple leads to optimize API usage
- **Error Handling**: Continues processing other leads even if some fail to match
- **Selective Updates**: Only updates leads that successfully match in Explorium

Prerequisites
Before setting up this workflow, ensure you have:
- An n8n instance (self-hosted or cloud)
- A Salesforce account with:
  - OAuth2 API access enabled
  - Lead object permissions (read/write)
  - API usage limits available
- Explorium API credentials (Bearer token) - Get an Explorium API key
- A basic understanding of Salesforce lead management

Salesforce Requirements

Required Lead Fields
The workflow expects these standard Salesforce lead fields:
- FirstName - Lead's first name
- LastName - Lead's last name
- Company - Company name
- Email - Will be populated/updated by the workflow
- Phone - Will be populated/updated by the workflow
- MobilePhone - Will be populated/updated by the workflow

API Permissions
Your Salesforce integration user needs:
- Read access to the Lead object
- Write access to Lead object fields (Email, Phone, MobilePhone)
- API enabled on the user profile
- Sufficient API calls remaining in your org limits

Installation & Setup

Step 1: Import the Workflow
1. Copy the workflow JSON from the template
2. In n8n: Navigate to Workflows → Add Workflow → Import from File
3. Paste the JSON and click Import

Step 2: Configure Salesforce OAuth2 Credentials
1. Click on the Salesforce Trigger node
2. Under Credentials, click Create New
3. Follow the OAuth2 flow:
   - Client ID: From your Salesforce Connected App
   - Client Secret: From your Salesforce Connected App
   - Callback URL: Copy from n8n and add to your Connected App
4. Authorize the connection
5. Save the credentials as "Salesforce account connection"

Note: Use the same credentials for all Salesforce nodes in the workflow.
Step 3: Configure Explorium API Credentials
1. Click on the Match_prospect node
2. Under Credentials, click Create New (HTTP Header Auth)
3. Configure the header:
   - Name: Authorization
   - Value: Bearer YOUR_EXPLORIUM_API_TOKEN
4. Save as "Header Auth account"
5. Apply the same credentials to the Explorium Enrich Contacts Information node

Step 4: Verify Node Settings
- Salesforce Trigger:
  - Trigger On: Lead Created
  - Poll Time: Every minute (adjust based on your needs)
- Salesforce Get Leads:
  - Operation: Get All
  - Condition: CreatedDate = TODAY (fetches today's leads)
  - Limit: 20 (adjust based on volume)
- Loop Over Items:
  - Batch Size: 6 (optimal for API rate limits)

Step 5: Activate the Workflow
1. Save the workflow
2. Toggle the Active switch to ON
3. The workflow will now monitor for new leads every minute

Detailed Node Descriptions
- Salesforce Trigger: Polls Salesforce every minute for new leads
- Get Today's Leads: Retrieves all leads created today to ensure none are missed
- Loop Over Items: Processes leads in batches of 6 for efficiency
- Match Prospect: Searches Explorium for a matching person using name + company
- Filter: Checks if a valid match was found
- Extract Prospect IDs: Collects all matched prospect IDs
- Enrich Contacts: Fetches detailed contact information from Explorium
- Merge: Combines original lead data with enrichment results
- Split Out: Separates individual enriched records
- Update Lead: Updates Salesforce with new contact information

Data Mapping
The workflow maps Explorium data to Salesforce fields as follows (a mapping sketch is shown after this section):

| Explorium Field   | Salesforce Field | Fallback Logic                  |
| ----------------- | ---------------- | ------------------------------- |
| emails[0].address | Email            | Falls back to professions_email |
| mobile_phone      | MobilePhone      | Falls back to phone_numbers[1]  |
| phone_numbers[0]  | Phone            | Falls back to mobile_phone      |

Usage & Monitoring

Automatic Operation
Once activated, the workflow runs automatically:
- Checks for new leads every minute
- Processes any leads created since the last check
- Updates leads with discovered contact information
- Continues running until deactivated

Manual Testing
To test the workflow manually:
1. Create a test lead in Salesforce
2. Click "Execute Workflow" in n8n
3. Monitor the execution to see each step
4. Verify the lead was updated in Salesforce

Monitoring Executions
Track workflow performance:
1. Go to Executions in n8n
2. Filter by this workflow
3. Review successful and failed executions
4. Check logs for any errors or issues

Troubleshooting

Common Issues

No leads are being processed
- Verify the workflow is activated
- Check that Salesforce API limits haven't been exceeded
- Ensure new leads have FirstName, LastName, and Company populated
- Confirm the OAuth connection is still valid

Leads not matching in Explorium
- Verify company names are accurate (not abbreviations)
- Check that first and last names are properly formatted
- Some individuals may not be in Explorium's database
- Try testing with known companies/contacts

Contact information not updating
- Check Salesforce field-level security
- Verify the integration user has edit permissions
- Ensure the Email, Phone, and MobilePhone fields are writeable
- Check for validation rules blocking updates

Authentication errors
- Salesforce: Re-authorize the OAuth connection
- Explorium: Verify the Bearer token is valid and not expired
- Check that API quotas haven't been exceeded

Error Handling
The workflow includes built-in error handling:
- Failed matches don't stop other leads from processing
- Each batch is processed independently
- Failed executions are logged for review
- Partial successes are possible (some leads updated, others skipped)
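As a minimal sketch, the fallback logic from the Data Mapping table could be expressed in an n8n Code node like this (field names come from the table; the response nesting is an assumption):

```js
// Illustrative fallback mapping for one enriched prospect.
const d = $json.data ?? $json; // actual nesting depends on the Explorium node output
return [{
  json: {
    Email: d.emails?.[0]?.address ?? d.professions_email,
    MobilePhone: d.mobile_phone ?? d.phone_numbers?.[1],
    Phone: d.phone_numbers?.[0] ?? d.mobile_phone,
  },
}];
```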
Best Practices

Data Quality
- Ensure complete lead data: FirstName, LastName, and Company should be populated
- Use full company names: "Microsoft Corporation" matches better than "MSFT"
- Standardize data entry: Consistent formatting improves match rates

Performance Optimization
- Adjust batch size: Lower if hitting API limits, higher for efficiency
- Modify polling frequency: Every minute for high volume, less frequent for lower volume
- Set appropriate limits: Balance between processing speed and API usage

Compliance & Privacy
- Data permissions: Ensure you have rights to enrich lead data
- GDPR compliance: Consider privacy regulations in your region
- Data retention: Follow your organization's data policies
- Audit trail: Monitor who has access to enriched data

Customization Options

Extend the Enrichment
Add more Explorium enrichment by:
- Adding firmographic data (company size, revenue)
- Including technographic information
- Appending social media profiles
- Adding job title and department verification

Modify Trigger Conditions
Change when enrichment occurs:
- Trigger on lead updates (not just creation)
- Add specific lead source filters
- Process only leads from certain campaigns
- Include lead score thresholds

Add Notifications
Enhance with alerts:
- Email sales reps when leads are enriched
- Send Slack notifications for high-value matches
- Create tasks for leads that couldn't be enriched
- Log enrichment metrics to dashboards

API Considerations

Salesforce Limits
- API calls: Each execution uses ~4 Salesforce API calls
- Polling frequency: Consider your daily API limit
- Batch processing: Reduces API usage vs. individual processing

Explorium Limits
- Match API: One call per batch of leads
- Enrichment API: One call per batch of matched prospects
- Rate limits: Respect your plan's requests per minute

Integration Architecture
This workflow can be part of a larger lead management system:
- Lead Capture → This Workflow → Lead Scoring → Assignment
- Can trigger additional workflows based on enrichment results
- Compatible with existing Salesforce automation (Process Builder, Flows)
- Works alongside other enrichment tools

Security Considerations
- **Credentials**: Stored securely in n8n's credential system
- **Data transmission**: Uses HTTPS for all API calls
- **Access control**: Limit who can modify the workflow
- **Audit logging**: All executions are logged with details

Support Resources
For assistance with:
- **n8n issues**: Consult the n8n documentation or community forum
- **Salesforce integration**: Reference the Salesforce API documentation
- **Explorium API**: Contact Explorium support for API questions
- **Workflow logic**: Review execution logs for debugging
by Cyril Nicko Gaspar
Amazon Price Monitoring Workflow

This workflow enables you to monitor the prices of Amazon product listings directly from a Google Sheet, using data provided by Bright Data's Amazon Scraper API. It automates the retrieval of price data for specified products and is ideal for market research, competitor analysis, or personal price tracking.

✅ Requirements
Before using this template, ensure you have the following:
- A Bright Data account and access to the Amazon Scraper API
- An active API key from Bright Data
- A Google Sheet set up with the required columns
- An n8n account (self-hosted or cloud version)

⚙️ Setup
1. Create a Google Sheet with the following columns:
   - Product URL
   - ZIP Code (used for regional price variations)
   - ASIN (Amazon Standard Identification Number)
2. Extract the ASIN automatically using the following formula in the ASIN column (replace A2 with the appropriate cell reference):
   =REGEXEXTRACT(A2, "/(?:dp|gp/product|product)/([A-Z0-9]{10})")
3. Obtain an API key:
   - Sign in to your Bright Data account.
   - Go to the API section to generate an API key.
   - Create a Bearer Authentication credential using this key in your automation tool.
4. Configure the workflow:
   - Use a node (e.g., "Google Sheets") to read data from your sheet.
   - Use an HTTP Request node to send a query to Bright Data's Amazon API with the ASIN and ZIP code.
   - Parse the returned JSON response to extract the product price and other relevant data.
   - Optionally write the output (e.g., current price, timestamp) back into the sheet or another data store.

Workflow Functionality
The workflow is triggered periodically (or manually) and reads product details from your Google Sheet. For each row, it extracts the Product URL and ZIP code and sends a request to the Bright Data API. The API returns product price information, which is then logged or updated back into the sheet, matched by ASIN. You can also map the product URL directly as the input instead of the ASIN, but ensure the URL has no appended parameters; if it does, refer to the input field from the Bright Data snapshot result. An equivalent ASIN extraction for a Code node is sketched below.

💡 Use Cases
- E-commerce sellers monitoring competitors' prices
- Consumers tracking price drops on wishlist items
- Market researchers collecting pricing data across ZIP codes
- Affiliate marketers ensuring accurate product pricing on their platforms

🛠️ Customization
- Add columns for additional product data such as rating, seller, or stock availability.
- Schedule the workflow to run hourly, daily, or weekly depending on your needs.
- Implement email or Slack alerts for significant price changes.
- Filter by product category or brand to narrow your tracking focus.
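If you prefer to extract the ASIN inside n8n rather than in the sheet, the same pattern from the REGEXEXTRACT formula can run in a Code node (the productUrl input field name is hypothetical):

```js
// Same ASIN extraction as the sheet formula, in an n8n Code node.
const match = ($json.productUrl ?? '').match(/\/(?:dp|gp\/product|product)\/([A-Z0-9]{10})/);
return [{ json: { ...$json, asin: match ? match[1] : null } }];
```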
by Jean-Marie Rizkallah
🧩 Jamf Patch Summary to Slack

Stay on top of software patch compliance by automatically posting Jamf patch summaries to Slack. This helps IT and security teams quickly identify outdated installs and take action—without logging into Jamf.

✅ Prerequisites
• A Jamf Pro API key with permissions to read software titles and patch summaries
• A Slack app or incoming webhook URL with permission to post messages to your desired channel

🔍 How it works
• Trigger the flow manually or add a webhook
• Fetch the list of software titles from Jamf Pro
• Filter to select the software you're tracking (e.g. Chrome, Edge)
• Retrieve the patch summary for that software (latest version, up-to-date and out-of-date counts)
• Format the summary into Slack Block Kit (an illustrative payload is sketched below)
• Post the formatted summary into a Slack channel

⚙️ Set up steps
• Takes ~5–10 minutes to configure
• Set your server BaseURL variable in the Set node
• Add your Jamf Pro API credentials in the HTTP Request nodes (Get & Retrieve)
• Set the target software ID in the Filter node
• Add your Slack webhook URL or token in the final HTTP node
• Optional: Adjust Slack formatting inside the Function node
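A minimal sketch of the Block Kit formatting step, assuming the patch summary values named above have already been extracted (the input field names are placeholders):

```js
// Hypothetical Function-node body building a Slack Block Kit payload
// from the Jamf patch summary values.
const blocks = [
  { type: 'header', text: { type: 'plain_text', text: `Patch Summary: ${$json.title}` } },
  {
    type: 'section',
    fields: [
      { type: 'mrkdwn', text: `*Latest version:*\n${$json.latestVersion}` },
      { type: 'mrkdwn', text: `*Up to date:*\n${$json.upToDate}` },
      { type: 'mrkdwn', text: `*Out of date:*\n${$json.outOfDate}` },
    ],
  },
];
return [{ json: { blocks } }];
```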
by Sira Ekabut
This workflow automates AI-based image generation using the Fal.ai Flux API. Define custom prompts and image parameters, then effortlessly generate, monitor, and save the output directly to Google Drive. Streamline your creative automation with ease and precision.

Who is this for?
This template is for content creators, developers, automation experts, and creative professionals looking to integrate AI-based image generation into their workflows. It's ideal for generating custom visuals with the Fal.ai Flux API and automating storage in Google Drive.

What problem is this workflow solving?
Manually generating AI-based images, checking their status, and saving results can be tedious. This workflow automates the entire process — from requesting image generation, monitoring its progress, and downloading the result, to saving it directly to a Google Drive folder.

What this workflow does
1. Sets Custom Image Parameters: allows you to define the prompt, resolution, guidance scale, and steps for AI image generation.
2. Sends a Request to Fal.ai: initiates the image generation process using the Fal.ai Flux API (an illustrative request body is sketched below).
3. Monitors Image Status: checks for completion and waits if needed.
4. Downloads the Generated Image: fetches the completed image once ready.
5. Saves to Google Drive: automatically uploads the generated image to a specified Google Drive folder.

Setup
1. Prerequisites:
   • Fal.ai API Key: obtain it from the Fal.ai platform and set it as the Authorization header in HTTP Header Auth credentials.
   • Google Drive OAuth Credentials: connect your Google Drive account in n8n.
2. Configuration:
   • Update the "Edit Fields" node with your desired image parameters:
     • Prompt: describe the image (e.g., "Thai young woman net idol 25 yrs old, walking on the street").
     • Width/Height: define the image resolution (default: 1024x768).
     • Steps: number of inference steps (e.g., 30).
     • Guidance Scale: controls image adherence to the prompt (e.g., 3.5).
   • Set your Google Drive folder ID in the "Google Drive" node to save the image.
3. Run the Workflow:
   • Trigger the workflow manually to generate the image.
   • The workflow waits, checks status, and saves the final output seamlessly.

Customization
• Modify Image Parameters: adjust the prompt, resolution, steps, and guidance scale in the "Edit Fields" node.
• Change Storage Location: update the Google Drive node with a different folder ID.
• Add Notifications: integrate an email or messaging node to alert you when the image is ready.
• Additional Outputs: expand the workflow to send the generated image to Slack, Dropbox, or other platforms.

This workflow streamlines AI-based image generation and storage, offering flexibility and customization for creative automation.
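As a sketch only, the request body sent to Fal.ai might mirror the "Edit Fields" parameters like this; the exact endpoint and field names depend on your Fal.ai API version, so treat every name below as an assumption:

```js
// Illustrative Fal.ai Flux request body; field names are assumptions.
const body = {
  prompt: 'Thai young woman net idol 25 yrs old, walking on the street',
  image_size: { width: 1024, height: 768 }, // "Width/Height"
  num_inference_steps: 30,                  // "Steps"
  guidance_scale: 3.5,                      // "Guidance Scale"
};
```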
by Ranjan Dailata
Who is this for?
This workflow enables automated, scalable collection of high-quality, AI-ready data from websites using Bright Data's Web Unlocker, with a focus on preparing that data for LLM training. Leveraging LLM chains and AI agents, the system formats and extracts key information, then stores the structured embeddings in a Pinecone vector database.

This workflow is tailored for:
- ML Engineers & Researchers building or fine-tuning domain-specific LLMs.
- AI Startups needing clean, structured content for product training.
- Data Teams preparing knowledge bases for enterprise-grade AI apps.
- LLM-as-a-Service Providers sourcing dynamic web content across niches.

What problem is this workflow solving?
Training a large language model (LLM) requires vast amounts of clean, relevant, and structured data. Manual collection is slow, error-prone, and lacks scalability. This workflow:
- Automatically extracts web data from specified URLs.
- Bypasses anti-bot measures using Bright Data's Web Unlocker.
- Formats, cleans, and transforms raw content using LLM agents.
- Stores semantically searchable vectors in Pinecone.
- Makes datasets AI-ready for fine-tuning, RAG, or domain-specific training.

What this workflow does
This workflow automates the process of collecting, cleaning, and vectorizing web content to create structured, high-quality datasets that are ready to be used for LLM (Large Language Model) training or retrieval-augmented generation (RAG):
1. Web crawling with the Bright Data Web Unlocker.
2. AI information extraction and data formatting.
3. AI data formatting to produce JSON-structured data.
4. Persistence in the Pinecone vector DB.
5. Webhook notification of the structured data.

Setup
1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to Bearer XXXXXXXXXXXXXX, replacing XXXXXXXXXXXXXX with your Web Unlocker token.
4. Obtain a Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the LinkedIn URL by navigating to the Set LinkedIn URL node.
6. Update the Set Fields - URL and Webhook URL node with the URL for web data extraction and the webhook notification URL.

How to customize this workflow to your needs
- Set your target URLs: target sites that are high-quality, domain-specific, and relevant to your LLM's purpose.
- Adjust Bright Data Web Unlocker settings: geo-location, headers / User-Agent strings, retry rules, and proxies.
- Modify the information extraction logic: change prompts to extract specific attributes; use structured templates or few-shot examples in prompts.
- Swap the embedding model: use OpenAI, Hugging Face, or your own hosted embedding model API.
- Customize Pinecone metadata fields: store extra fields in Pinecone for better filtering & semantic querying (a sketch follows below).
- Add data validation or deduplication: skip duplicates or low-quality content.
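For the metadata customization, a record stored in Pinecone might carry fields like these (a sketch; all names are illustrative):

```js
// Hypothetical Pinecone record: an embedding plus metadata used for
// filtering and semantic querying.
const record = {
  id: 'doc-0001-chunk-03',
  values: $json.embedding, // vector produced by the embedding step upstream
  metadata: {
    source_url: $json.url,
    title: $json.title,
    crawled_at: new Date().toISOString(),
  },
};
```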
by Adam Crafts
🎥 n8n Workflow: Generate AI Videos with HeyGen

🚀 Overview
This automation connects directly to HeyGen's powerful AI video generation platform. It allows you to programmatically create videos with digital avatars and voiceovers, perfect for scaling your content creation for social media, marketing campaigns, or personalized messages without ever opening a video editor.

😩 The Problem
Creating video content is incredibly time-consuming and expensive. You have to write scripts, record audio, find actors or create complex animations, piece everything together in an editor, and then wait for it to render. Every minor change or personalization requires repeating the entire frustrating process. This manual work is a major bottleneck, making it nearly impossible to produce large volumes of high-quality video content quickly and affordably.

✨ The Solution
This workflow acts as your personal, automated video production assistant! When you provide a script, the automation instantly sends instructions to HeyGen to begin creating your video. It tells the AI which avatar and voice to use and starts the generation process. Then it waits and periodically checks the status until your new video is finished and ready. It's a completely hands-off process that transforms simple text into professional AI videos on demand.

🔧 What It Does
- Sends a request to HeyGen's API to generate a video with a custom avatar, a scripted voice-over, and a background color and dimension (an illustrative payload is sketched below)
- Waits 30 seconds
- Checks the video status
- Loops until the video is completed, failed, or still processing

⚙️ Simple Setup
This workflow is a pre-built blueprint, designed to be up and running in minutes!
- **Upload:** Simply upload the provided JSON file into your n8n instance.
- **Connect:** Connect your app credentials (e.g., your HeyGen account). The workflow will show you exactly where.
- **Activate:** Turn the workflow on, and it's ready to go! Let your new automated employee get to work.

This free n8n workflow allows you to generate AI videos using HeyGen via their API.

🌐 Explore more workflows
❤️ Buy more workflows at: adamcrafts
🦾 Custom workflows at: adamcrafts@cloudysoftwares.com | adamaicrafts@gmail.com

> Build once, customize endlessly, and scale your video content like never before. 🚀
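For orientation, the generate-video request might carry a payload along these lines; consult HeyGen's current API documentation for the exact schema, since every field and ID below is a placeholder:

```js
// Sketch of a HeyGen generate-video payload (avatar, scripted voice,
// background color, dimension). Schema and IDs are assumptions.
const body = {
  video_inputs: [
    {
      character: { type: 'avatar', avatar_id: 'YOUR_AVATAR_ID' },
      voice: { type: 'text', input_text: $json.script, voice_id: 'YOUR_VOICE_ID' },
      background: { type: 'color', value: '#FFFFFF' },
    },
  ],
  dimension: { width: 1280, height: 720 },
};
```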
by ist00dent
This n8n workflow provides a simple yet powerful utility to convert Unix timestamps (seconds since epoch) into the universally recognized ISO 8601 date and time format. This is crucial for harmonizing date data across different systems, databases, and applications.

🔧 How it works
1. Receive Timestamp Webhook: This node acts as the entry point, listening for incoming POST requests. It expects a JSON body containing a single property, timestamp, which should be a Unix timestamp in seconds (e.g., 1678886400).
2. Convert to ISO 8601: This node takes the timestamp received from the webhook. Since JavaScript's Date object typically uses milliseconds, it multiplies the Unix timestamp by 1000. It then uses new Date(...).toISOString() to convert this into an ISO 8601 formatted string (e.g., 2023-03-15T00:00:00.000Z) and assigns it to a new property called convertedTime.
3. Respond with Converted Time: This node sends the convertedTime property back as the response to the original webhook caller.

👤 Who is it for?
This workflow is extremely useful for:
- Developers & Integrators: when working with APIs or databases that return dates as Unix timestamps and you need to display them in a human-readable or standardized format in your applications or dashboards.
- Data Analysts & Scientists: for cleaning and transforming raw timestamp data from logs, event streams, or legacy systems into a consistent format for analysis.
- System Administrators: for debugging logs where timestamps are often in Unix format.
- Anyone Managing Data Imports/Exports: ensuring date compatibility when moving data between different platforms.
- Automators: as a building block in larger workflows where incoming data has Unix timestamps that need to be normalized before further processing (e.g., adding to a spreadsheet, sending in an email, or performing date calculations).

📑 Data Structure
When you trigger the webhook, send a POST request with a JSON body structured as follows:

{ "timestamp": 1678886400 }

The workflow will return a JSON response similar to this:

{ "convertedTime": "2023-03-15T00:00:00.000Z" }

⚙️ Setup Instructions
1. Import Workflow: in your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
2. Configure Webhook Path: double-click the Receive Timestamp Webhook node. In the 'Path' field, set a unique and descriptive path (e.g., /convert-timestamp or /unix-to-iso).
3. Activate Workflow: save and activate the workflow.

📝 Tips
This simple conversion workflow can be drastically enhanced and leveraged in many ways:

Dynamic Output Formats
- Upgrade: modify the Convert to ISO 8601 node (or add a Function node after it) to accept an optional format parameter in the webhook.
- Leverage: allow users to request formats like MM/DD/YYYY HH:mm:ss, YYYY-MM-DD, DD-MM-YYYY, or just the time, making the output directly usable in various contexts without further processing.
Example using a Function node:

```js
const date = new Date($json.timestamp * 1000);
const format = $json.format || 'iso'; // Default to ISO
let output;
switch (format.toLowerCase()) {
  case 'iso':
    output = date.toISOString();
    break;
  case 'locale': // e.g., "3/15/2023, 12:00:00 AM UTC"
    output = date.toLocaleString('en-US', { timeZone: 'UTC' });
    break;
  case 'dateonly': // e.g., "2023-03-15"
    output = date.toISOString().split('T')[0];
    break;
  case 'timeonly': // e.g., "00:00:00 UTC"
    output = date.toLocaleTimeString('en-US', { timeZone: 'UTC', hour12: false });
    break;
  default:
    output = date.toISOString(); // Fallback
}
return [{ json: { convertedTime: output } }];
```

Timezone Conversion
- Upgrade: combine this with the Time Zone Converter workflow (or integrate moment-timezone.js if using a Code node on a self-hosted instance). Accept an optional targetTimeZone parameter in the webhook.
- Leverage: convert the Unix timestamp directly into a human-readable date and time in a specific target timezone, which is incredibly valuable for global scheduling or reporting.

Error Handling and Input Validation
- Upgrade: add an IF node after the Receive Timestamp Webhook. Check if isNaN($json.body.timestamp) or if typeof $json.body.timestamp !== 'number'.
- Leverage: if the input timestamp is invalid, branch to a Respond to Webhook node that returns a clear error message (e.g., "Invalid timestamp provided. Please provide a numeric Unix timestamp in seconds."). This makes your API more robust.

Reverse Conversion (ISO to Unix)
- Upgrade: create a separate workflow, or add another branch to this one, to convert an ISO 8601 string back to a Unix timestamp. This provides a complete conversion utility.
- Example Set node value: ={{ new Date($json.body.isoString).getTime() / 1000 }}

Integration with Data Pipelines
- Upgrade: use this workflow as a microservice in larger ETL (Extract, Transform, Load) pipelines.
- Leverage: if you're pulling data from a source that provides Unix timestamps (e.g., a logging system, IoT device, or certain databases), send that data through this workflow to normalize the dates before loading them into your analytics database, CRM, or data warehouse.

Automated Reporting
- Upgrade: if you have a system that generates reports with Unix timestamps, trigger this webhook for each timestamp.
- Leverage: produce reports with human-readable dates for better readability and decision-making for non-technical stakeholders.

This workflow is a cornerstone for any automation involving diverse date and time data. By implementing the suggested upgrades, you can transform it from a basic converter into a highly flexible and reliable date-time processing hub.
by Jimleuk
This n8n workflow demonstrates how to automate the indexing of images to build an object-based image search. By utilising a DETR-ResNet-50 object classification model, we can identify objects within an image and store these associations in Elasticsearch along with a reference to the image.

How it works
- An image is imported into the workflow via the HTTP Request node.
- The image is then sent to Cloudflare's Workers AI API, where the service runs the image through the DETR-ResNet-50 object classification model.
- The API returns the object associations with their positions in the image, labels, and the confidence score of each classification. Confidence scores of less than 0.9 are discarded for brevity (see the filtering sketch below).
- The image's URL and its associations are then indexed in an Elasticsearch server, ready for searching.

Requirements
- A Cloudflare account with Workers AI enabled, to access the object classification model.
- An Elasticsearch instance to store the image URL and related associations.

Extending this workflow
- Further enrich your indexed data with additional attributes or metrics relevant to your users.
- Use a vector store to provide similarity search over the images.
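A minimal sketch of the confidence filter, assuming the Workers AI response is an array of detections with a score field (the exact response shape may differ):

```js
// Drop detections below the 0.9 confidence threshold before indexing.
const detections = $json.result ?? [];
const confident = detections.filter((d) => d.score >= 0.9);
return [{ json: { imageUrl: $json.imageUrl, associations: confident } }];
```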
by Yaron Been
🚀 Automated Job Hunter: Upwork Opportunity Aggregator & AI-Powered Notifier!

Workflow Overview
This cutting-edge n8n automation is a sophisticated job discovery and notification tool designed to transform freelance job hunting into a seamless, intelligent process. By intelligently connecting Apify, OpenAI, Google Sheets, and Gmail, this workflow:

Discovers Job Opportunities:
- Automatically scrapes Upwork job listings
- Tracks recent freelance opportunities
- Eliminates manual job searching efforts

Intelligent Data Processing:
- Filters and extracts key job details
- Structures job information
- Ensures comprehensive opportunity tracking

AI-Powered Summarization:
- Generates concise job summaries
- Creates human-readable job digests
- Provides quick, actionable insights

Seamless Notification:
- Automatically logs jobs to Google Sheets
- Sends personalized email digests
- Enables rapid opportunity assessment

Key Benefits
🤖 Full Automation: zero-touch job discovery
💡 Smart Filtering: targeted job opportunities
📊 Comprehensive Tracking: detailed job market insights
🌐 Multi-Platform Synchronization: seamless data flow

Workflow Architecture

🔹 Stage 1: Job Discovery
- **Scheduled Trigger**: daily job scanning
- **Apify Integration**: Upwork job scraping
- **Intelligent Filtering**: recent job postings, specific keywords, relevant opportunities

🔹 Stage 2: Data Extraction
- **Comprehensive job metadata parsing**
- **Key information retrieval**
- **Structured data preparation**

🔹 Stage 3: AI Summarization
- **OpenAI GPT processing**
- **Professional summary generation**
- **Contextual job insight creation**

🔹 Stage 4: Multi-Platform Distribution
- **Google Sheets logging**
- **Gmail integration**
- **Automated job digest delivery**

Potential Use Cases
- **Freelancers**: opportunity tracking
- **Job Seekers**: automated job discovery
- **Recruitment Agencies**: market intelligence
- **Skill Development Professionals**: trend monitoring
- **Career Coaches**: client opportunity identification

Setup Requirements
- Apify: Upwork scraping actor, API token, configured scraping parameters
- OpenAI API: GPT model access, summarization configuration, API key management
- Google Sheets: connected Google account, prepared job tracking spreadsheet, appropriate sharing settings
- Gmail Account: connected email, job digest configuration, appropriate sending permissions
- n8n Installation: cloud or self-hosted instance, workflow configuration, API credential management

Future Enhancement Suggestions
🤖 Advanced job matching algorithms
📊 Multi-platform job aggregation
🔔 Customizable alert mechanisms
🌐 Expanded job category tracking
🧠 Machine learning job recommendations

Technical Considerations
- Implement robust error handling
- Use secure API authentication
- Maintain flexible data processing
- Ensure compliance with platform guidelines

Ethical Guidelines
- Respect job poster privacy
- Use data for legitimate job searching
- Maintain transparent information gathering
- Provide proper attribution

Hashtag Performance Boost 🚀
#FreelanceJobHunting #CareerAutomation #JobDiscovery #AIJobSearch #WorkflowAutomation #FreelanceTech #CareerIntelligence #JobMarketInsights #ProfessionalNetworking #TechJobSearch

Workflow Visualization
[Daily Trigger]
⬇️
[Fetch Upwork Jobs]
⬇️
[Format Job Fields]
⬇️
[Log to Google Sheets]
⬇️
[AI Summarization]
⬇️
[Send Email Digest]

Connect With Me
Ready to revolutionize your job hunting strategy?
📧 Email: Yaron@nofluff.online
🎥 YouTube: @YaronBeen
💼 LinkedIn: Yaron Been

Transform your job search with intelligent, automated workflows!