by M Sayed
Stop guessing currency rates! Get a quick and clean exchange rate summary sent right to your phone. 📲 This workflow automatically checks the latest rates and builds a simple report for you.

What it does:
- 🤑 Fetches the very latest exchange rates from an API.
- 🌍 Shows you what major currencies (like USD & EUR) are worth in your chosen local currency.
- ✍️ Creates a simple, easy-to-read report.
- 🚀 Delivers it straight to your Telegram!

Setup is easy: all you need to do is set your base currency (e.g., 'EGP') and your Telegram Chat ID. Done! ✅
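The report step can be a single Code node. Below is a minimal sketch, assuming the rates API returns a payload shaped like `{ base, rates }` with the base currency mapped to each foreign currency; adjust the field names to whatever your rates API actually returns.

```javascript
// n8n Code node — build the Telegram report text from the HTTP Request output.
// Assumed response shape: { base: 'EGP', rates: { USD: 0.0207, EUR: 0.0190, ... } }
const { base, rates } = $input.first().json;

const majors = ['USD', 'EUR', 'GBP'];
const lines = majors
  .filter((code) => rates[code] !== undefined)
  // rates[code] is "foreign per 1 base", so invert to get base per 1 foreign
  .map((code) => `1 ${code} = ${(1 / rates[code]).toFixed(2)} ${base}`);

const report = [
  `💱 Exchange rates for ${base}`,
  new Date().toISOString().slice(0, 10),
  '',
  ...lines,
].join('\n');

return [{ json: { report } }];
```

The Telegram node then simply sends `{{ $json.report }}` to your Chat ID.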
by Laura Piraux
Use case
This automation is for teams working in Notion. When you have a lot of back and forth in the comment section, it's easy to lose track of what is going on in the conversation. This automation relies on AI to generate a summary of the comment section.

How it works
Every hour (the trigger can be adapted to your needs and use case), the automation checks whether new comments have been added to the pages of your Notion database. If there are new comments, they are sent to an AI model to write a summary. The summary is then added to a predefined page property. The automation also updates a "Last execution" property, which prevents regenerating the AI summary when no new comments have been received.

Setup
- Define your Notion variables: the Notion database, the property that will hold the AI summary, and the property that will hold the last execution date of the automation.
- Set up your Notion credentials.
- Set up your AI model credentials (API key).

How to adjust it to your needs
- Use the LLM of your choice. In this template, I used Gemini, but you can easily replace it with ChatGPT, Claude, etc.
- Adapt the prompt to your use case to get better summaries: specify the maximum number of characters, give an example, etc.
- Adapt the trigger to your needs. You could use Notion webhooks as the trigger so the automation runs only when a new comment is added (this setup is advised if you're on the n8n Cloud version).
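A minimal sketch of the "only summarize when something is new" check, as a Code node: compare each comment's creation time against the stored "Last execution" property. The node name `Get Page` and the property paths are placeholders for your own setup.

```javascript
// n8n Code node — keep only comments created after the last run, so the AI
// summary is regenerated only when new comments arrived.
// 'Get Page' is a placeholder name for the node that read the Notion page.
const lastRun = new Date(
  $('Get Page').first().json.properties['Last execution']?.date?.start ?? 0
);

// Notion comment objects carry a created_time field.
const fresh = $input.all().filter(
  (item) => new Date(item.json.created_time) > lastRun
);

// An empty output ends the branch: no new comments, no LLM call.
return fresh;
```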
by Yaron Been
Automated pipeline that extracts job listings from Upwork and exports them to Google Sheets for better organization, analysis, and team collaboration.

🚀 What It Does
- Fetches job postings based on saved searches
- Extracts key job details (title, budget, description)
- Organizes data in Google Sheets
- Updates in real time
- Supports multiple search criteria

🎯 Perfect For
- Freelancers tracking opportunities
- Teams managing multiple projects
- Agencies monitoring client needs
- Market researchers
- Business analysts

⚙️ Key Benefits
✅ Centralized job board
✅ Easy sharing with team members
✅ Advanced filtering and sorting
✅ Historical data tracking
✅ Customizable data points

🔧 What You Need
- Upwork account
- Google account
- n8n instance
- Google Sheets setup

📊 Data Exported
- Job title and description
- Budget and hourly rate
- Client information
- Posted date
- Required skills
- Job URL

🛠️ Setup & Support
Quick setup: get started in 15 minutes with our step-by-step guide.
📺 Watch Tutorial
💼 Get Expert Support
📧 Direct Help

Streamline your job search and opportunity tracking with automated data collection and organization.
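For reference, the shaping step between the Upwork fetch and the Google Sheets append can be sketched as a Code node like the one below. The source field names (`title`, `budget`, `skills`, ...) are assumptions; map them to whatever your Upwork data source actually returns.

```javascript
// n8n Code node — map fetched Upwork jobs onto the sheet's column names.
return $input.all().map((item) => ({
  json: {
    Title: item.json.title,
    Budget: item.json.budget ?? item.json.hourly_rate,
    Description: String(item.json.description ?? '').slice(0, 500),
    Posted: item.json.posted_on,
    Skills: (item.json.skills ?? []).join(', '),
    URL: item.json.url,
  },
}));
```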
by PollupAI
LinkedIn Profile Enrichment Workflow

Who is this for?
This workflow is ideal for recruiters, sales professionals, and marketing teams who need to enrich LinkedIn profiles with additional data for lead generation, talent sourcing, or market research.

What problem is this workflow solving?
Manually gathering detailed LinkedIn profile information can be time-consuming and prone to errors. This workflow automates the process of enriching profile data from LinkedIn, saving time and ensuring accuracy.

What this workflow does
- Input: Reads LinkedIn profile URLs from a Google Sheet.
- Validation: Filters out already enriched profiles to avoid redundant processing.
- Data Enrichment: Uses RapidAPI's Fresh LinkedIn Profile Data API to retrieve detailed profile information.
- Output: Updates the Google Sheet with enriched profile data, appending new information efficiently.

Setup
- Google Sheet: Create a sheet with a column named linkedin_url and populate it with the profile URLs to enrich.
- RapidAPI Account: Sign up at RapidAPI and subscribe to the Fresh LinkedIn Profile Data API.
- API Integration: Replace the x-rapidapi-key and x-rapidapi-host values with your credentials from RapidAPI (see the sketch below).
- Run the Workflow: Trigger the workflow and monitor the updates to your Google Sheet.

How to customize this workflow
- Filter Criteria: Modify the filter step to include additional conditions for processing profiles.
- API Configuration: Adjust API parameters to retrieve specific fields or extend usage.
- Output Format: Customize how the enriched data is appended to the Google Sheet (e.g., format, column mappings).
- Error Handling: Add steps to handle API rate limits or missing data for smoother automation.

This workflow streamlines LinkedIn profile enrichment, making it faster and more effective for data-driven decision-making.
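The enrichment call itself is an HTTP Request node in the template; the sketch below shows the same call from a Code node so the RapidAPI headers are explicit. The host and path are illustrative; copy the exact values from the API's playground on RapidAPI.

```javascript
// n8n Code node — illustrative version of the Fresh LinkedIn Profile Data call.
const url = $json.linkedin_url; // the column from your Google Sheet

const response = await this.helpers.httpRequest({
  method: 'GET',
  url: 'https://fresh-linkedin-profile-data.p.rapidapi.com/get-linkedin-profile',
  qs: { linkedin_url: url },
  headers: {
    'x-rapidapi-key': 'YOUR_RAPIDAPI_KEY',
    'x-rapidapi-host': 'fresh-linkedin-profile-data.p.rapidapi.com',
  },
  json: true,
});

// Many RapidAPI endpoints wrap the result in a data field; adjust if yours differs.
return [{ json: { linkedin_url: url, ...(response.data ?? response) } }];
```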
by Hubschrauber
A single workflow with two flows/paths that combine to handle the backup sequence for Zigbee device configuration from Home Assistant / zigbee2mqtt. This provides a way to automate a periodic capture of the Zigbee coordinator and device pairings to speed up the recovery process when/if the Home Assistant instance needs to be rebuilt. Setting up similar automation without n8n (e.g. shell scripts and system timers) is considerably more challenging. n8n makes it easy, and this template should remove any other excuse not to do it.

Flow 1
- Triggered by Cron/Timer, set to whatever interval you want between backups
- Sends an MQTT message to request a zigbee2mqtt backup (which arrives via a separate message)

Flow 2
- Triggered by the zigbee2mqtt backup message
- Extracts the zip file from the message and stores it, with a date stamp in the filename, via SFTP (see the sketch below)

Setup
- Create an MQTT connection named "MQTT Account" with the appropriate protocol (mqtt), host, port (1883), username, and password.
- Create an SFTP connection named "SFTP Zigbee Backups" with the appropriate host, port (22), username, and password or key.

Reference
This article describes the MQTT parts.
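Between the trigger and the SFTP upload, Flow 2 needs to turn the MQTT payload into a date-stamped binary file. Here is a sketch, assuming zigbee2mqtt's documented response on `zigbee2mqtt/bridge/response/backup`, whose payload carries the archive as base64 under `data.zip`:

```javascript
// n8n Code node — decode the zigbee2mqtt backup and date-stamp the filename.
// The MQTT Trigger may deliver the payload as a JSON string; parse defensively.
const raw = $json.message ?? $json;
const payload = typeof raw === 'string' ? JSON.parse(raw) : raw;

const zip = Buffer.from(payload.data.zip, 'base64');
const stamp = new Date().toISOString().slice(0, 10); // e.g. 2024-06-01
const fileName = `zigbee2mqtt-backup-${stamp}.zip`;

return [{
  json: { fileName },
  binary: {
    data: await this.helpers.prepareBinaryData(zip, fileName, 'application/zip'),
  },
}];
```

The SFTP node then uploads the `data` binary property, using `{{ $json.fileName }}` as the remote filename.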
by Thomas Janssen
Build an MCP Server which has access to a semantic database to perform Retrieval Augmented Generation (RAG).

Tutorial
Click here to watch the full tutorial on YouTube

How it works
This MCP Server has access to a local semantic database (Qdrant) and answers questions asked through the MCP Client.

AI Agent Template
Click here to navigate to the AI Agent n8n workflow which uses this MCP server

Warning
This flow only runs locally and cannot be executed on the n8n cloud platform because of the MCP Client community node.

Installation
- Install n8n + Ollama + Qdrant using the Self-hosted AI Starter Kit
- Make sure to install Llama 3.2 and mxbai-embed-large as the embedding model
- Activate the n8n flow
- Run the "RAG Ingestion Pipeline" and upload some PDF documents

How to use it
Run the MCP Client workflow and ask a question. It will be answered either by using the semantic database or by the search engine API.

More detailed instructions
Missed a step? Find more detailed instructions here: https://brightdata.com/blog/ai/news-feed-n8n-openai-bright-data
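The retrieval step behind the MCP tool boils down to two local HTTP calls: embed the question with Ollama, then search Qdrant. A minimal sketch follows; the collection name `documents` is an assumption, so use whatever the RAG Ingestion Pipeline created.

```javascript
// n8n Code node — embed the question, then query Qdrant for the closest chunks.
const question = $json.question;

// Ollama's embeddings endpoint returns { embedding: [...] }.
const { embedding } = await this.helpers.httpRequest({
  method: 'POST',
  url: 'http://localhost:11434/api/embeddings',
  body: { model: 'mxbai-embed-large', prompt: question },
  json: true,
});

// Qdrant's search endpoint returns { result: [{ id, score, payload }, ...] }.
const search = await this.helpers.httpRequest({
  method: 'POST',
  url: 'http://localhost:6333/collections/documents/points/search',
  body: { vector: embedding, limit: 5, with_payload: true },
  json: true,
});

// Each hit carries the stored document chunk in its payload.
return search.result.map((hit) => ({ json: { score: hit.score, ...hit.payload } }));
```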
by Yaron Been
Workflow Overview
This cutting-edge n8n automation is a sophisticated market research and intelligence gathering tool designed to transform web content discovery into actionable insights. By intelligently combining web crawling, AI-powered filtering, and smart summarization, this workflow:

Discovers Relevant Content:
- Automatically crawls target websites
- Identifies trending topics
- Extracts comprehensive article details

Intelligent Content Filtering:
- Applies custom keyword matching
- Filters for the most relevant articles
- Ensures high-quality information capture

AI-Powered Summarization:
- Generates concise, meaningful summaries
- Extracts key insights
- Provides quick, digestible information

Seamless Delivery:
- Sends summaries directly to Slack
- Enables instant team communication
- Facilitates rapid information sharing

Key Benefits
🤖 Full Automation: Continuous market intelligence
💡 Smart Filtering: Precision content discovery
📊 AI-Powered Insights: Intelligent summarization
🚀 Instant Delivery: Real-time team updates

Workflow Architecture
🔹 Stage 1: Content Discovery
- Scheduled Trigger: Daily market research
- FireCrawl Integration: Web content crawling
- Comprehensive Site Scanning: extracts article metadata, captures full article content, identifies key information sources

🔹 Stage 2: Intelligent Filtering
- Keyword-Based Matching (see the sketch below)
- Relevance Assessment
- Custom Domain Optimization: AI and technology focus, startup and innovation tracking

🔹 Stage 3: AI Summarization
- OpenAI GPT Integration
- Contextual Understanding
- Concise Insight Generation: 3-point summary format, captures essential information

🔹 Stage 4: Team Notification
- Slack Integration
- Instant Information Sharing
- Formatted Insight Delivery

Potential Use Cases
- Market Research Teams: Trend tracking
- Innovation Departments: Technology monitoring
- Startup Ecosystems: Competitive intelligence
- Product Management: Industry insights
- Strategic Planning: Rapid information gathering

Setup Requirements
FireCrawl API
- Web crawling credentials
- Configured crawling parameters

OpenAI API
- GPT model access
- Summarization configuration
- API key management

Slack Workspace
- Channel for insights delivery
- Appropriate app permissions
- Webhook configuration

n8n Installation
- Cloud or self-hosted instance
- Workflow configuration
- API credential management

Future Enhancement Suggestions
🤖 Multi-source crawling
📊 Advanced sentiment analysis
🔔 Customizable alert mechanisms
🌐 Expanded topic tracking
🧠 Machine learning refinement

Technical Considerations
- Implement robust error handling
- Use exponential backoff for API calls
- Maintain flexible crawling strategies
- Ensure compliance with website terms of service

Ethical Guidelines
- Respect content creator rights
- Use data for legitimate research
- Maintain transparent information gathering
- Provide proper attribution

Workflow Visualization
[Daily Trigger] ⬇️ [Web Crawling] ⬇️ [Content Filtering] ⬇️ [AI Summarization] ⬇️ [Slack Delivery]

Connect With Me
Ready to revolutionize your market research?
📧 Email: Yaron@nofluff.online
🎥 YouTube: @YaronBeen
💼 LinkedIn: Yaron Been

Transform your information gathering with intelligent, automated workflows!

#AIResearch #MarketIntelligence #AutomatedInsights #TechTrends #WebCrawling #AIMarketing #InnovationTracking #BusinessIntelligence #DataAutomation #TechNews
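The Stage 2 keyword filter can be as small as the Code-node sketch below. The keyword list and the field names (`title`, `content`) are placeholders for whatever your FireCrawl extraction actually returns.

```javascript
// n8n Code node — keep only articles whose title or body mentions a keyword.
const keywords = ['ai', 'startup', 'machine learning', 'funding'];

return $input.all().filter((item) => {
  const text = `${item.json.title ?? ''} ${item.json.content ?? ''}`.toLowerCase();
  return keywords.some((kw) => text.includes(kw));
});
```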
by n8n Team
This workflow adds a new product in Stripe whenever a new product has been added to Pipedrive.

Prerequisites
- Stripe account and Stripe credentials
- Pipedrive account and Pipedrive credentials

How it works
- The Pipedrive Trigger node starts the workflow when a new product is added.
- An HTTP Request node creates a new product in Stripe using the previous input.
- The Merge node combines the data of both the Pipedrive and Stripe inputs. The output contains the Pipedrive data merged with the Stripe data; the merge occurs based on the index of the items.
- The Item Lists node splits prices into separate items.
- An HTTP Request node creates the price records in Stripe.
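For clarity, here is what the two HTTP Request nodes send to Stripe, sketched as a single Code node. In the template these are separate nodes using a Stripe credential, and the Pipedrive field names (`name`, `price`, `currency`) are assumptions.

```javascript
// Sketch of the Stripe calls. Stripe's API is form-encoded, and amounts are
// integers in the smallest currency unit (cents).
const headers = {
  Authorization: 'Bearer sk_test_...', // your Stripe secret key
  'Content-Type': 'application/x-www-form-urlencoded',
};

// 1. Create the product from the Pipedrive product name.
const product = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://api.stripe.com/v1/products',
  headers,
  body: new URLSearchParams({ name: $json.name }).toString(),
});

// 2. Create one price record per item produced by the Item Lists split.
const price = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://api.stripe.com/v1/prices',
  headers,
  body: new URLSearchParams({
    product: product.id,
    unit_amount: String(Math.round($json.price * 100)),
    currency: String($json.currency).toLowerCase(),
  }).toString(),
});

return [{ json: price }];
```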
by Parnain
What This Workflow Does:
This n8n workflow automatically generates an AI-powered summary and relevant tags whenever a new row is added to your Notion database. Simply save any URL to your Notion database using the Notion Web Clipper Chrome extension or Save to Notion, on both desktop and mobile. This keeps all your saved content organized in one place instead of scattered across different platforms.

How it works:
1. The workflow is triggered when a new row is added to your Notion database (it checks for updates every minute).
2. It retrieves the content from the saved URL.
3. An AI agent analyzes the content to generate a summary and relevant tags.
4. The AI output is then formatted properly (see the sketch below).
5. Finally, the formatted summary and tags are saved into the appropriate columns in your Notion database.

Notes:
Make sure your Notion database includes the following columns:
- URL – stores the content URL you want to summarize.
- AI Summary – where the AI-generated summary will be added.
- Tags – where the AI-generated tags will be saved.
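The formatting step (step 4 above) has to reshape the agent's free-form output into Notion property payloads. A sketch, assuming the agent returns `{ summary, tags }` and that AI Summary is a rich-text property while Tags is a multi-select:

```javascript
// n8n Code node — shape AI output for the Notion page-update node.
const { summary, tags } = $input.first().json;
const tagList = Array.isArray(tags) ? tags : String(tags).split(',');

return [{
  json: {
    properties: {
      'AI Summary': {
        // Notion caps a single rich-text block at 2000 characters.
        rich_text: [{ text: { content: String(summary).slice(0, 2000) } }],
      },
      Tags: {
        multi_select: tagList.map((t) => ({ name: t.trim() })).filter((t) => t.name),
      },
    },
  },
}];
```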
by phil
This workflow automates web scraping of Amazon search result pages by retrieving raw HTML, cleaning it to retain only the relevant product elements, and then using an LLM to extract structured product data (name, description, rating, reviews, and price), before saving the results back to Google Sheets. It integrates Google Sheets to supply and collect URLs, BrightData to fetch page HTML, a custom n8n Function node to sanitize the HTML, LangChain (OpenRouter GPT-4) to parse product details, and Google Sheets again to store the output.

Who Needs Amazon Search Result Scraping?
This scraping workflow is ideal for teams and businesses that need to monitor Amazon product listings at scale:
- E-commerce Analysts – Track competitor pricing, ratings, and inventory trends.
- Market Researchers – Collect data on product popularity and reviews for market analysis.
- Data Teams – Automate ingestion of product metadata into BI pipelines or data lakes.
- Affiliate Marketers – Keep affiliate catalogs up to date with the latest product details and prices.

If you need reliable, structured data from Amazon search results delivered directly into your spreadsheets, this workflow saves you hours of manual copy-and-paste.

Why Use This Workflow?
- End-to-End Automation – From URL list to clean JSON output in Sheets.
- Robust HTML Cleaning – Strips scripts, styles, unwanted tags, and noise.
- Accurate Structured Parsing – Leverages GPT-4 via LangChain for reliable extraction.
- Scalable & Repeatable – Processes thousands of URLs in batches.

Step-by-Step: How This Workflow Scrapes Amazon
1. Get URLs from Google Sheets – Reads a list of search result URLs.
2. Loop Over Items – Iterates through each URL in controlled batches.
3. Fetch Raw HTML – Uses BrightData's Web Unlocker proxy to retrieve the page.
4. Clean HTML – A Function node removes the doctype, scripts, styles, head, comments, classes, and non-whitelisted tags, collapsing extra whitespace (see the sketch after this section).
5. Extract with LLM – Passes the cleaned HTML into LangChain → GPT-4 to output JSON for each product: name, description, rating, reviews, price.
6. Save Results – Appends the JSON fields as columns back into a "results" sheet in Google Sheets.

Customization: Tailor to Your Needs
- Adaptable Sites – This workflow can be adapted to any e-commerce or other website, for example Walmart or eBay.
- Whitelist Tags – Modify the allowedTags array in the Code node to keep additional HTML elements.
- Schema Changes – Update the Structured Output Parser schema to include more fields (e.g., availability, SKU).
- Alternate Data Sink – Instead of Sheets, route output to a database, CSV file, or webhook.

🔑 Prerequisites
- Google Sheets Credentials – OAuth credentials configured in n8n.
- BrightData API token – Stored in n8n credentials as BRIGHTDATA_TOKEN.
- OpenRouter API Key – Configured for the LangChain node to call GPT-4.
- n8n Instance – Self-hosted or cloud, with sufficient quota for HTTP requests and LLM calls.

🚀 Installation & Setup
1. Configure Credentials
   - In n8n, set up Google Sheets OAuth under "Credentials."
   - Add the BrightData token as a new HTTP Request credential.
   - Create an OpenRouter API key credential for the LangChain node.
2. Import the Workflow
   - Copy the JSON workflow into n8n's "Import" dialog.
   - Map your Google Sheet IDs and GIDs to the {{WEB_SHEET_ID}}, {{TRACK_SHEET_GID}}, and {{RESULTS_SHEET_GID}} placeholders.
   - Ensure the BRIGHTDATA_TOKEN credential is selected on the HTTP Request node.
3. Test & Run
   - Add a few Amazon search URLs to your "track" sheet.
   - Execute the workflow and verify product data appears in your "results" sheet.
   - Tweak the batch size or parser schema as needed.

⚠ Important
- API Rate Limits – Monitor your BrightData and OpenRouter usage to avoid throttling.
- Amazon's Terms – Ensure your scraping complies with Amazon's policies and legal requirements.

Summary
This workflow delivers a fully automated, scalable solution to extract structured product data from Amazon search pages directly into Google Sheets, streamlining your competitive analysis and data collection. 🚀

Phil | Inforeole
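For reference, the Clean HTML step (step 4) can be sketched as the Code node below. It mirrors the behavior described above; the input field name `data` is an assumption, and `allowedTags` is the array you would extend for other sites.

```javascript
// n8n Code node — strip everything the LLM doesn't need from the raw HTML.
const allowedTags = ['div', 'span', 'a', 'p', 'ul', 'li', 'h1', 'h2', 'h3'];

let html = $json.data; // raw HTML from the BrightData request

html = html
  .replace(/<!doctype[^>]*>/gi, '')
  .replace(/<head[\s\S]*?<\/head>/gi, '')
  .replace(/<script[\s\S]*?<\/script>/gi, '')
  .replace(/<style[\s\S]*?<\/style>/gi, '')
  .replace(/<!--[\s\S]*?-->/g, '')
  .replace(/\sclass="[^"]*"/gi, '');

// Drop any tag not in the whitelist, keeping its inner text.
html = html.replace(/<\/?([a-z][a-z0-9]*)\b[^>]*>/gi, (match, tag) =>
  allowedTags.includes(tag.toLowerCase()) ? match : ''
);

// Collapse the leftover whitespace.
html = html.replace(/\s+/g, ' ').trim();

return [{ json: { html } }];
```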
by Yaron Been
🚀 Automated Investor Intelligence: CrunchBase to Google Sheets Data Harvester!

Workflow Overview
This cutting-edge n8n automation is a sophisticated investor intelligence tool designed to transform market research into actionable insights. By intelligently connecting CrunchBase, data processing, and Google Sheets, this workflow:

Discovers Investor Insights:
- Automatically retrieves the latest investor data
- Tracks key investment organizations
- Eliminates manual market research effort

Intelligent Data Processing:
- Filters investor-specific organizations (see the sketch below)
- Extracts critical investment metrics
- Ensures comprehensive market intelligence

Seamless Data Logging:
- Automatically updates Google Sheets
- Creates a real-time investor database
- Enables rapid market trend analysis

Scheduled Intelligence Gathering:
- Daily automated tracking
- Consistent investor insight updates
- Zero manual intervention required

Key Benefits
🤖 Full Automation: Zero-touch investor research
💡 Smart Filtering: Targeted investment insights
📊 Comprehensive Tracking: Detailed investor intelligence
🌐 Multi-Source Synchronization: Seamless data flow

Workflow Architecture
🔹 Stage 1: Investor Discovery
- Scheduled Trigger: Daily market scanning
- CrunchBase API Integration
- Intelligent Filtering: investor-specific organizations, key investment metrics, most recent data

🔹 Stage 2: Data Extraction
- Comprehensive Metadata Parsing
- Key Information Retrieval
- Structured Data Preparation

🔹 Stage 3: Data Logging
- Google Sheets Integration
- Automatic Row Appending
- Real-Time Database Updates

Potential Use Cases
- Venture Capitalists: Investment ecosystem mapping
- Startup Scouts: Investor trend analysis
- Market Researchers: Comprehensive investment insights
- Business Development: Strategic partnership identification
- Investment Analysts: Market intelligence gathering

Setup Requirements
CrunchBase API
- API credentials
- Configured access permissions
- Investor organization tracking setup

Google Sheets
- Connected Google account
- Prepared tracking spreadsheet
- Appropriate sharing settings

n8n Installation
- Cloud or self-hosted instance
- Workflow configuration
- API credential management

Future Enhancement Suggestions
🤖 Advanced investment trend analysis
📊 Multi-source investor aggregation
🔔 Customizable alert mechanisms
🌐 Expanded investment stage tracking
🧠 Machine learning insights generation

Technical Considerations
- Implement robust error handling
- Use secure API authentication
- Maintain flexible data processing
- Ensure compliance with API usage guidelines

Ethical Guidelines
- Respect business privacy
- Use data for legitimate research
- Maintain transparent information gathering
- Provide proper attribution

Hashtag Performance Boost 🚀
#InvestorIntelligence #VentureCapital #MarketResearch #AIWorkflow #DataAutomation #StartupEcosystem #InvestmentTracking #BusinessIntelligence #TechInnovation #StartupFunding

Workflow Visualization
[Daily Trigger] ⬇️ [Fetch Investor Data] ⬇️ [Extract Investor Fields] ⬇️ [Log to Google Sheets]

Connect With Me
Ready to revolutionize your investor research?
📧 Email: Yaron@nofluff.online
🎥 YouTube: @YaronBeen
💼 LinkedIn: Yaron Been

Transform your market intelligence with intelligent, automated workflows!
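The Stage 1 filter can be sketched as a Code node. CrunchBase v4 entities expose a `facet_ids` list (e.g. `["company", "investor"]`); treat the exact field path as an assumption and verify it against your actual API response.

```javascript
// n8n Code node — keep only organizations flagged as investors.
return $input.all().filter((item) => {
  const org = item.json.properties ?? item.json;
  const facets = org.facet_ids ?? [];
  return Array.isArray(facets) && facets.includes('investor');
});
```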
by Oneclick AI Squad
This n8n workflow automatically creates friendly, personalized travel itineraries based on messages received via email or WhatsApp. When a user says "I want to go to Dubai with friends for 5 days" or something similar, the AI agent understands the request and generates a detailed daily plan with suggested activities, transport tips, and hotel ideas — all in a warm, human tone. It saves time, adds value for travelers, and delivers ready-to-send itineraries without any manual effort.

Good to know
- The AI agent uses advanced language processing to understand natural travel requests in multiple formats.
- Itineraries are generated with personalized recommendations based on travel preferences, group size, and duration.
- The workflow supports both email and WhatsApp communication channels for maximum accessibility.
- All responses maintain a warm, friendly tone to enhance user experience.

How it works
1. The Get Query from Email node captures travel requests sent via email, parsing the message content for trip details.
2. The Get Query from WhatsApp node simultaneously monitors WhatsApp messages for travel planning requests.
3. Both inputs feed into the Itinerary Creator Agent node, which uses AI to analyze the request and generate comprehensive travel plans including activities, accommodations, and transportation suggestions.
4. The Check Proper Data node validates the generated itinerary to ensure all essential information is included and properly formatted.
5. The Check where to send Answer node determines the appropriate response channel (email or WhatsApp) based on the original request source (see the sketch after this description).
6. If the request came via email, the Sending Itinerary from Email node sends the personalized itinerary back to the user's email address.
7. If the request came via WhatsApp, the Send Itinerary from message node delivers the travel plan through WhatsApp messaging.

How to use
1. Import the workflow into n8n and configure the nodes with your email service credentials and WhatsApp API access.
2. Set up the AI agent with your preferred travel data sources and recommendation algorithms.
3. Test the workflow by sending sample travel requests through both email and WhatsApp channels.
4. Monitor the generated itineraries to ensure quality and adjust the AI agent parameters as needed.

Requirements
- Email service API credentials (SMTP or email provider API)
- WhatsApp Business API access or WhatsApp integration service
- AI/LLM service for the Itinerary Creator Agent (OpenAI, Anthropic, or similar)
- Access to travel data sources for recommendations (optional but recommended)

Customising this workflow
- Modify the Itinerary Creator Agent node to include specific travel preferences, local recommendations, or branded content.
- Adjust the data validation rules in the Check Proper Data node to match your quality standards.
- Customize response templates in both sending nodes to align with your brand voice and style.
- Add additional input channels or integrate with other messaging platforms as needed.
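One way to implement the Check where to send Answer step is to tag each request with its source channel on the way in and branch on that flag. In the template this is typically a Switch or IF node; it is sketched here as a Code node with illustrative field names.

```javascript
// n8n Code node — route the finished itinerary back to its source channel.
// channel / email / whatsapp_id are illustrative fields set by the two
// "Get Query" trigger branches.
const item = $input.first().json;
const viaWhatsApp = item.channel === 'whatsapp';

return [{
  json: {
    ...item,
    sendVia: viaWhatsApp ? 'whatsapp' : 'email',
    replyTo: viaWhatsApp ? item.whatsapp_id : item.email,
  },
}];
```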