by Marth
How it works
This workflow runs on a daily schedule:
1. It starts by scraping real-estate-related queries from Google using Apify.
2. The organic search results are parsed and summarized into a single text block.
3. That text is sent to an AI model (GPT-4o), which extracts the top 3 pain points faced by real estate agents based on current online sentiment.
4. The workflow compares today's insights with yesterday's data stored in Airtable to detect recurring or new pain points.
5. Finally, it sends a summary notification via Telegram and stores the current day's insights in Airtable for trend tracking.

How to set up
1. Clone or import the workflow into your n8n instance.
2. Get an Apify API token and insert it into the HTTP Request node.
3. Create an Airtable base with a table containing two fields: "Date" (text) and "Summary" (long text). Copy the Base ID and Table ID into the Airtable nodes.
4. Connect your Telegram bot and replace the chat ID in the Telegram node.
5. Set up OpenAI credentials with GPT-4o or GPT-4o-mini for the LLM node.
6. Run the workflow once manually to test it, then activate the schedule trigger to run daily.
7. (Optional) Extend the flow to generate cold outreach emails based on the pain points, or sync the results to Notion or your CRM.
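For reference, the body of the Apify HTTP Request might look like the sketch below. This assumes Apify's Google Search Results scraper (apify/google-search-scraper) called via the run-sync-get-dataset-items endpoint; the actor, input fields, and queries are illustrative, so adapt them to whichever actor you actually use:

```json
{
  "queries": "biggest problems real estate agents face\nreal estate agent lead generation struggles",
  "resultsPerPage": 10,
  "maxPagesPerQuery": 1,
  "languageCode": "en"
}
```

The returned organic results are what the next node flattens into the single text block described above.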
by Yaron Been
Scrape Indeed Job Listings for Hiring Signals Using Bright Data and LLMs

How the flow runs
1. Fill in the form with the job position you're hunting for.
2. Bright Data's scraper scrapes Indeed based on your requirements.
3. The workflow waits for the snapshot.
4. The data returns as JSON.
5. Jobs are appended to Google Sheets.
6. Each row goes to an LLM that analyzes whether you're a good fit for the job (based on your prompts).
7. The LLM writes YES or NO next to each job opportunity, helping you find the posts that are relevant to you.

What you need
- Google Sheets with our template.
- A Bright Data dataset and API key.
- An OpenAI key for GPT‑4o mini (or any other LLM).
- n8n with the required nodes.

Form fields to fill
- **Job Location** – city or region.
- **Keyword** – role or skills.
- **Country** – two‑letter code.

Setup steps
1. Copy the sheet template link.
2. Import the JSON workflow.
3. Add your credentials in the nodes.
4. Test the form manually.
5. Add a schedule if desired.

Bright Data filter example

```json
[
  {
    "country": "US",
    "domain": "indeed.com",
    "keyword_search": "Growth Marketer",
    "location": "Miami",
    "date_posted": "Last 24 hours"
  }
]
```

Tips
- Choose "Last 24 hours" often.
- Increase the wait time for big snapshots.
- Narrow your keywords to save credits.

**Need help?** Email me anytime: Yaron@nofluff.online
YouTube: @YaronBeen
LinkedIn: https://www.linkedin.com/in/yaronbeen/
Bright Data Docs: https://docs.brightdata.com/introduction
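If you are wiring the scraper call yourself, the request that kicks off a snapshot might look roughly like this. The endpoint and query parameters are a sketch based on Bright Data's dataset trigger API, so confirm them against the Bright Data docs linked above:

```json
{
  "method": "POST",
  "url": "https://api.brightdata.com/datasets/v3/trigger?dataset_id=YOUR_DATASET_ID&type=discover_new&discover_by=keyword",
  "headers": {
    "Authorization": "Bearer YOUR_BRIGHT_DATA_API_KEY",
    "Content-Type": "application/json"
  },
  "body": [
    {
      "country": "US",
      "domain": "indeed.com",
      "keyword_search": "Growth Marketer",
      "location": "Miami",
      "date_posted": "Last 24 hours"
    }
  ]
}
```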
by Jimleuk
This n8n workflow demonstrates a simple approach to improving chat UX by staggering an AI agent's reply for users who send a sequence of partial messages in short bursts.

How it works
1. A Twilio webhook receives the user's messages, which are recorded in a message stack powered by Redis.
2. The execution pauses for 5 seconds, then checks the message stack again for the latest message. This check lets us know whether the user is still sending more messages or is waiting for a reply.
3. The execution is aborted if the latest message on the stack differs from the incoming message, and continues if they are the same. In the latter case, the agent receives all messages buffered up to that point and can respond to them in a single reply.

Requirements
- A Twilio account and an SMS-enabled phone number to receive messages.
- A Redis instance for the message stack.
- An OpenAI account for the language model.

Customising the workflow
- This approach should work for other common messaging platforms such as WhatsApp and Telegram.
- 5 seconds too long or too short? Adjust the wait threshold to suit your customers.
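To make the abort/continue rule concrete, here are the two cases the post-wait check distinguishes (the field names are illustrative, not the workflow's actual node output):

```json
[
  {
    "incoming_message": "can you check my order",
    "stack_latest_after_wait": "can you check my order",
    "decision": "continue: no newer message arrived, reply to all buffered messages at once"
  },
  {
    "incoming_message": "can you check my order",
    "stack_latest_after_wait": "order #1234 I mean",
    "decision": "abort: a newer message arrived during the wait; its own execution will handle the reply"
  }
]
```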
by Leonardo Grigorio
Youtube Video

This n8n workflow is designed to help YouTube content creators identify trending topics within a specific niche. By leveraging YouTube's search and data APIs, it gathers and analyzes video performance metrics from the past two days to provide insights into what content is gaining traction. Here's how the workflow operates:

1. Trigger Setup: The workflow begins when a user sends a query through the chat_message_received node. If no niche is provided, the AI prompts the user to select or input one.
2. AI Agent (Language Model): The central node uses a GPT-based AI agent to:
- Understand the user's niche or content preferences.
- Generate tailored search terms related to the niche.
- Process YouTube API responses and summarize trends using insights such as common themes, tags, and audience engagement metrics (views, likes, and comments).
3. YouTube Search: The youtube_search node runs a secondary workflow that queries YouTube for relevant videos published within the last two days. It retrieves basic video data such as video IDs, relevance scores, and publication dates.
4. Video Details Retrieval: The workflow fetches additional details for each video:
- Video snippet: metadata like title, description, and tags.
- Video statistics: metrics such as views, likes, and comments.
- Content details: video duration, ensuring only content longer than 3 minutes and 30 seconds is analyzed.
5. Data Processing: Video metadata is cleaned, sanitized, and stored in memory. Tags, titles, and descriptions are analyzed to identify patterns and trends across multiple videos.
6. Output: The workflow compiles the insights and presents them to the user, highlighting:
- The most common themes or patterns within the niche.
- URLs to trending videos and their respective channels.
- Engagement statistics, helping the user understand the popularity of the content.

Key Notes for Setup:
- **API Keys**: Ensure valid YouTube API credentials are configured in the get_videos, find_video_snippet, find_video_statistics, and find_video_data nodes.
- **Memory Buffer**: The window_buffer_memory node ensures the AI agent retains context during analysis, improving the quality of the generated insights.
- **Search Term Customization**: The AI agent dynamically creates search terms based on the user's niche to improve search precision.

Use Case:
This workflow is ideal for YouTubers or marketers seeking data-driven inspiration for creating content that aligns with current trends, maximizing the potential to engage their audience.

Example Output:
For the niche "digital marketing":
- Trending Topic: videos about "mental triggers" and "psychological marketing."
- Tags: "SEO," "Conversion Rates," "Social Proof."
- Engagement: videos with over 200K views and high like/comment ratios are leading trends.
- Video links:
  https://www.youtube.com/watch?v=video_id1
  https://www.youtube.com/watch?v=video_id2
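Under the hood these are standard YouTube Data API v3 calls. A sketch of the two requests behind the get_videos and find_video_* nodes (the query, date, and video IDs are placeholders):

```json
{
  "search": "https://www.googleapis.com/youtube/v3/search?part=snippet&type=video&q=digital+marketing&publishedAfter=2025-01-01T00:00:00Z&maxResults=25&key=YOUR_API_KEY",
  "details": "https://www.googleapis.com/youtube/v3/videos?part=snippet,statistics,contentDetails&id=video_id1,video_id2&key=YOUR_API_KEY"
}
```

The contentDetails.duration field (ISO 8601, e.g. PT5M12S) is what makes the 3-minute-30-second filter possible.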
by Aditya Gaur
Who is this template for?
This template is for any automator who wants to create a work item (incident, user story, or bug) in Azure DevOps whenever an alert is raised by a monitoring system.

How it works
- Each time an alert is raised in a system (for example, Elastic raises an alert for a missing host or domain), the workflow reads the alert and creates a work item in Azure DevOps.
- The workflow can be customized to send along any required information to Azure DevOps.

Setup Instructions
- **Azure DevOps Organization and Project:** Make sure you have access to an Azure DevOps organization and a project where the work item will be created.
- **Personal Access Token (PAT):** You need a Personal Access Token with permissions to create work items. You can generate a PAT from the Azure DevOps user settings.
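For orientation, the work item creation call is Azure DevOps' standard REST endpoint, which takes a JSON Patch body. A minimal sketch (the organization, project, work item type, and field values are placeholders; authenticate with your PAT via Basic auth):

```json
{
  "method": "POST",
  "url": "https://dev.azure.com/{organization}/{project}/_apis/wit/workitems/$Bug?api-version=7.1",
  "headers": { "Content-Type": "application/json-patch+json" },
  "body": [
    { "op": "add", "path": "/fields/System.Title", "value": "Elastic alert: host web-01 missing" },
    { "op": "add", "path": "/fields/System.Description", "value": "Alert payload or summary from the monitoring system" }
  ]
}
```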
by simonscrapes
What this workflow does:
This flow uses an AI node to generate seed keywords to focus your SEO efforts on, based on your ideal customer profile (ICP). You can use these keywords to form part of your SEO strategy.

Outputs:
- A list of 20 seed keywords

Setup
1. Fill in the Set Ideal Customer Profile (ICP) node
2. Connect your credentials
3. Replace the Connect to your own database node with your own database

Pre-requisites / Dependencies
- You know your ideal customer profile (ICP)
- An AI API account (OpenAI or Anthropic recommended)

More templates and n8n workflows >>> @simonscrapes
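As a sense of the output, a hypothetical run for an ICP of "independent coffee shop owners" might return something like (truncated; the workflow produces 20):

```json
{
  "seed_keywords": [
    "coffee shop marketing ideas",
    "how to increase coffee shop foot traffic",
    "local SEO for cafes",
    "coffee shop loyalty program"
  ]
}
```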
by Niklas Hatje
Use case
When collecting leads via an online form, you often need to manually add those new leads to your Pipedrive CRM. This not only takes a lot of time but is also error-prone. This workflow automates this tedious work for you.

What this workflow does
1. The workflow is triggered each time a form is submitted in n8n.
2. It validates the email address using Hunter.io.
3. If the email is valid, the workflow checks for an existing person with that email in Pipedrive.
4. If no existing person is found, it uses Clearbit to enrich the person's information.
5. It then verifies whether the person's organization already exists in Pipedrive, creating a new organization if necessary.
6. The workflow then registers the person in Pipedrive.
7. Lastly, it creates a lead in Pipedrive using the information from the person and organization.

Setup
This workflow is very quick to set up:
1. Add your Hunter.io, Clearbit and Pipedrive credentials
2. Click the test workflow button
3. Activate the workflow and use the form trigger's production URL to collect your leads in a smart way

How to adjust it to your needs
- Exchange the n8n form trigger with your form of choice (Typeform, Google Forms, SurveyMonkey...)
- Add filter criteria to only add new leads if they match certain requirements
- Remove the Hunter.io email check if you don't own this tool and expect new form submissions to have a correct email anyway
- Add ways to handle invalid emails or existing persons
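The Hunter.io step boils down to a single email-verifier call. A rough sketch (the response is abbreviated and illustrative; see Hunter's docs for the full shape):

```json
{
  "request": "GET https://api.hunter.io/v2/email-verifier?email=jane@example.com&api_key=YOUR_HUNTER_API_KEY",
  "response_excerpt": {
    "data": {
      "status": "valid",
      "score": 92,
      "email": "jane@example.com"
    }
  }
}
```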
by Federico De Ponte
🔁 Loop & Optimize Meta Tags with Google Gemini

This workflow automates the shortening of meta titles and descriptions for SEO—directly from your Google Sheet, row by row, using Google Gemini.

✅ What it does
- Reads rows from a Google Sheet (meta_title, meta_description, row_index)
- Loops through each row and checks if content exists
- Sends the data to Google Gemini for length-optimized output
- Cleans and parses the response
- Updates the original sheet with the shortened results

🛠️ Setup Requirements
- Google Sheets (OAuth2 credentials connected in n8n)
- Google Gemini API key (configured in n8n credentials)
- The sheet must contain: row_index, meta_title, meta_description
- Output will be written into: meta_titleFixed, meta_descriptionFixed
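To illustrate the input and output columns, here is a hypothetical row before and after optimization (typical SEO targets are roughly 60 characters for titles and 160 for descriptions):

```json
{
  "row_index": 2,
  "meta_title": "The Complete and Definitive Guide to Everything About Home Espresso Machines in 2024",
  "meta_description": "In this very long article we cover absolutely everything you could ever want to know about buying, using and maintaining a home espresso machine...",
  "meta_titleFixed": "Home Espresso Machines: 2024 Buying Guide",
  "meta_descriptionFixed": "How to choose, use, and maintain a home espresso machine, in a concise 2024 buying guide."
}
```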
by David Olusola
🤖 AI-Powered Lead Enrichment with Explorium MCP & Telegram

Who it's for
Sales reps, agencies, and growth teams who want to turn basic company info into qualified leads with automated research. Perfect for B2B prospecting.

What it does
This workflow lets you send a company name or domain via Telegram and instantly returns:
- ✅ An enriched company profile (industry, size, tech, pain points)
- ✅ A clean, structured JSON — ready for your CRM or sales tools

How it works
1. 💬 Send company info to your Telegram bot
2. 🔎 The workflow pulls data from Explorium MCP + Tavily
3. 🧠 AI analyzes the company's business model, tools, pain points & goals
4. 📤 The JSON response is sent back via Telegram or logged to your database

Requirements
- 🔐 OpenAI API (GPT-4)
- 🧠 Explorium MCP API
- 🌐 Tavily Web Search API
- 🤖 Telegram Bot API
- 🗃️ PostgreSQL (for memory/logging)

How to set up
1. Add your API keys in n8n
2. Connect your Telegram bot to the webhook
3. Set up PostgreSQL for memory persistence
4. Customize the prompts (tone, niche, etc.)
5. Test by sending a company name via Telegram

Customization Options
- 🎯 Focus enrichment on specific industries or keywords
- 💬 Adjust the email sequence structure & style
- 🧩 Add extra data sources (e.g. Clearbit, Crunchbase)
- 🧾 Format the JSON to match your CRM schema
- ⚙️ Add an approval step before sending emails

Highlights
- ✅ Uses multi-source enrichment
- ✅ Works 100% from Telegram
- ✅ Integrates into any sales pipeline
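The exact fields depend on your prompt, but the structured JSON replied via Telegram might look something like this hypothetical output:

```json
{
  "company": "Acme Robotics",
  "domain": "acmerobotics.com",
  "industry": "Industrial Automation",
  "employee_count": "51-200",
  "tech_stack": ["HubSpot", "AWS", "React"],
  "pain_points": ["manual QA processes", "long hardware lead times"],
  "suggested_angle": "Position automation tooling around their QA bottleneck"
}
```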
by Matt Chong
Who is this for?
This workflow is ideal for freelancers, business owners, and finance teams who receive receipts via Gmail and want expenses logged automatically for tax, bookkeeping, and year-end audits.

What problem is this workflow solving?
When tax season hits, missing receipts create panic. This workflow keeps everything in one place. It uses AI to extract details from Gmail attachments, logs them in a Google Sheet, and stores the PDFs in Google Drive. No digging. No copying. Just everything where it should be.

How it works
1. Apply the label receipt to any incoming Gmail email. Do not mark it as read.
2. On a schedule (e.g. daily at 8:00 AM), the workflow triggers.
3. It searches for unread emails with the label receipt.
4. For each matching email, it downloads the attached receipt file.
5. It extracts the text content from the receipt file.
6. It uploads the original receipt file to a specified folder in Google Drive.
7. It merges the extracted text with the email metadata.
8. It sends this combined data to OpenAI, which extracts structured fields: date, merchant, category, description, subtotal, tax, total.
9. The extracted data is appended as a new row to the specified Google Sheet.
10. Finally, the email is marked as read to prevent it from being processed again.

How to set up?
1. Connect these services in your n8n credentials: Gmail (OAuth2), Google Drive, Google Sheets, OpenAI.
2. Configure the Google Drive upload: in the "Upload File" node, select the target folder where you want receipt PDFs stored.
3. Set your execution schedule: open the "Schedule Trigger" node and choose when it should run (the default is once daily at 8:00 AM).
4. Choose your Google Sheet and tab: in the "Append to Google Sheet" node, select your document and tab. Ensure the sheet contains these columns: Date, Merchant, Category, Description, Subtotal, Tax, Total.

How to customize this workflow to your needs?
- **Change the Gmail label or search filter** to match your needs.
- **Modify the OpenAI schema** to extract additional fields like currency, project, or notes.
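For a hypothetical receipt, the structured fields OpenAI returns (and the row appended to the sheet) would carry values like these (illustrative data; note subtotal + tax = total):

```json
{
  "date": "2024-03-14",
  "merchant": "Office Depot",
  "category": "Office Supplies",
  "description": "Printer paper and toner",
  "subtotal": 42.50,
  "tax": 3.61,
  "total": 46.11
}
```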
by explorium
Salesforce Lead Enrichment with Explorium Template

Download the following JSON file and import it into a new n8n workflow: salesforce_Workflow.json

Overview
This n8n workflow monitors your Salesforce instance for new leads and automatically enriches them with missing contact information. When a lead is created, the workflow:
1. Detects the new lead via the Salesforce trigger
2. Matches the lead against Explorium's database using name and company
3. Enriches the lead with professional email addresses and phone numbers
4. Updates the Salesforce lead record with the discovered contact information

This automation ensures your sales team always has the most up-to-date contact information for new leads, improving reach rates and accelerating the sales process.

Key Features
- **Real-time Processing**: Triggers automatically when new leads are created in Salesforce
- **Intelligent Matching**: Uses lead name and company to find the correct person in Explorium's database
- **Contact Enrichment**: Adds professional emails, mobile phones, and office phone numbers
- **Batch Processing**: Efficiently handles multiple leads to optimize API usage
- **Error Handling**: Continues processing other leads even if some fail to match
- **Selective Updates**: Only updates leads that successfully match in Explorium

Prerequisites
Before setting up this workflow, ensure you have:
- An n8n instance (self-hosted or cloud)
- A Salesforce account with:
  - OAuth2 API access enabled
  - Lead object permissions (read/write)
  - API usage limits available
- Explorium API credentials (Bearer token) – Get explorium api key
- A basic understanding of Salesforce lead management

Salesforce Requirements

Required Lead Fields
The workflow expects these standard Salesforce lead fields:
- FirstName – Lead's first name
- LastName – Lead's last name
- Company – Company name
- Email – Will be populated/updated by the workflow
- Phone – Will be populated/updated by the workflow
- MobilePhone – Will be populated/updated by the workflow

API Permissions
Your Salesforce integration user needs:
- Read access to the Lead object
- Write access to Lead object fields (Email, Phone, MobilePhone)
- API enabled on the user profile
- Sufficient API calls remaining in your org limits

Installation & Setup

Step 1: Import the Workflow
1. Copy the workflow JSON from the template
2. In n8n: navigate to Workflows → Add Workflow → Import from File
3. Paste the JSON and click Import

Step 2: Configure Salesforce OAuth2 Credentials
1. Click on the Salesforce Trigger node
2. Under Credentials, click Create New
3. Follow the OAuth2 flow:
   - Client ID: from your Salesforce Connected App
   - Client Secret: from your Salesforce Connected App
   - Callback URL: copy from n8n and add it to your Connected App
4. Authorize the connection
5. Save the credentials as "Salesforce account connection"

Note: Use the same credentials for all Salesforce nodes in the workflow.
Step 3: Configure Explorium API Credentials
1. Click on the Match_prospect node
2. Under Credentials, click Create New (HTTP Header Auth)
3. Configure the header:
   - Name: Authorization
   - Value: Bearer YOUR_EXPLORIUM_API_TOKEN
4. Save as "Header Auth account"
5. Apply the same credentials to the Explorium Enrich Contacts Information node

Step 4: Verify Node Settings
- Salesforce Trigger: Trigger On: Lead Created; Poll Time: Every minute (adjust based on your needs)
- Salesforce Get Leads: Operation: Get All; Condition: CreatedDate = TODAY (fetches today's leads); Limit: 20 (adjust based on volume)
- Loop Over Items: Batch Size: 6 (optimal for API rate limits)

Step 5: Activate the Workflow
1. Save the workflow
2. Toggle the Active switch to ON
3. The workflow will now monitor for new leads every minute

Detailed Node Descriptions
- Salesforce Trigger: Polls Salesforce every minute for new leads
- Get Today's Leads: Retrieves all leads created today to ensure none are missed
- Loop Over Items: Processes leads in batches of 6 for efficiency
- Match Prospect: Searches Explorium for a matching person using name + company
- Filter: Checks whether a valid match was found
- Extract Prospect IDs: Collects all matched prospect IDs
- Enrich Contacts: Fetches detailed contact information from Explorium
- Merge: Combines original lead data with enrichment results
- Split Out: Separates individual enriched records
- Update Lead: Updates Salesforce with the new contact information

Data Mapping
The workflow maps Explorium data to Salesforce fields as follows (a sketch of these fallbacks as n8n expressions appears after the troubleshooting notes below):

| Explorium Field     | Salesforce Field | Fallback Logic                    |
| ------------------- | ---------------- | --------------------------------- |
| emails[0].address   | Email            | Falls back to professions_email   |
| mobile_phone        | MobilePhone      | Falls back to phone_numbers[1]    |
| phone_numbers[0]    | Phone            | Falls back to mobile_phone        |

Usage & Monitoring

Automatic Operation
Once activated, the workflow runs automatically:
- Checks for new leads every minute
- Processes any leads created since the last check
- Updates leads with discovered contact information
- Continues running until deactivated

Manual Testing
To test the workflow manually:
1. Create a test lead in Salesforce
2. Click "Execute Workflow" in n8n
3. Monitor the execution to see each step
4. Verify the lead was updated in Salesforce

Monitoring Executions
Track workflow performance:
1. Go to Executions in n8n
2. Filter by this workflow
3. Review successful and failed executions
4. Check the logs for any errors or issues

Troubleshooting

Common Issues

No leads are being processed
- Verify the workflow is activated
- Check that Salesforce API limits haven't been exceeded
- Ensure new leads have FirstName, LastName, and Company populated
- Confirm the OAuth connection is still valid

Leads not matching in Explorium
- Verify company names are accurate (not abbreviations)
- Check that first and last names are properly formatted
- Some individuals may not be in Explorium's database
- Try testing with known companies/contacts

Contact information not updating
- Check Salesforce field-level security
- Verify the integration user has edit permissions
- Ensure the Email, Phone, and MobilePhone fields are writeable
- Check for validation rules blocking updates

Authentication errors
- Salesforce: re-authorize the OAuth connection
- Explorium: verify the Bearer token is valid and not expired
- Check that API quotas haven't been exceeded

Error Handling
The workflow includes built-in error handling:
- Failed matches don't stop other leads from processing
- Each batch is processed independently
- Failed executions are logged for review
- Partial successes are possible (some leads updated, others skipped)
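Returning to the data mapping table above: inside the Update Lead node, the fallback logic can be written as n8n field expressions. The sketch below assumes Explorium returns fields with exactly the names shown in the table; verify the real field paths in an execution log before relying on them:

```json
{
  "Email": "={{ $json.emails?.[0]?.address || $json.professions_email }}",
  "MobilePhone": "={{ $json.mobile_phone || $json.phone_numbers?.[1] }}",
  "Phone": "={{ $json.phone_numbers?.[0] || $json.mobile_phone }}"
}
```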
Best Practices

Data Quality
- Ensure complete lead data: FirstName, LastName, and Company should be populated
- Use full company names: "Microsoft Corporation" matches better than "MSFT"
- Standardize data entry: consistent formatting improves match rates

Performance Optimization
- Adjust the batch size: lower if hitting API limits, higher for efficiency
- Modify the polling frequency: every minute for high volume, less frequent for lower volume
- Set appropriate limits: balance processing speed against API usage

Compliance & Privacy
- Data permissions: ensure you have the rights to enrich lead data
- GDPR compliance: consider the privacy regulations in your region
- Data retention: follow your organization's data policies
- Audit trail: monitor who has access to enriched data

Customization Options

Extend the Enrichment
Add more Explorium enrichment by:
- Adding firmographic data (company size, revenue)
- Including technographic information
- Appending social media profiles
- Adding job title and department verification

Modify Trigger Conditions
Change when enrichment occurs:
- Trigger on lead updates (not just creation)
- Add specific lead source filters
- Process only leads from certain campaigns
- Include lead score thresholds

Add Notifications
Enhance with alerts:
- Email sales reps when leads are enriched
- Send Slack notifications for high-value matches
- Create tasks for leads that couldn't be enriched
- Log enrichment metrics to dashboards

API Considerations

Salesforce Limits
- API calls: each execution uses ~4 Salesforce API calls
- Polling frequency: consider your daily API limit
- Batch processing: reduces API usage vs. individual processing

Explorium Limits
- Match API: one call per batch of leads
- Enrichment API: one call per batch of matched prospects
- Rate limits: respect your plan's requests per minute

Integration Architecture
This workflow can be part of a larger lead management system:
- Lead Capture → This Workflow → Lead Scoring → Assignment
- Can trigger additional workflows based on enrichment results
- Compatible with existing Salesforce automation (Process Builder, Flows)
- Works alongside other enrichment tools

Security Considerations
- **Credentials**: stored securely in n8n's credential system
- **Data transmission**: uses HTTPS for all API calls
- **Access control**: limit who can modify the workflow
- **Audit logging**: all executions are logged with details

Support Resources
For assistance with:
- **n8n issues**: consult the n8n documentation or community forum
- **Salesforce integration**: reference the Salesforce API documentation
- **Explorium API**: contact Explorium support for API questions
- **Workflow logic**: review execution logs for debugging
by Cyril Nicko Gaspar
Amazon Price Monitoring Workflow

This workflow enables you to monitor the prices of Amazon product listings directly from a Google Sheet, using data provided by Bright Data's Amazon Scraper API. It automates the retrieval of price data for specified products and is ideal for market research, competitor analysis, or personal price tracking.

✅ Requirements
Before using this template, ensure you have the following:
- A Bright Data account and access to the Amazon Scraper API
- An active API key from Bright Data
- A Google Sheet set up with the required columns
- An n8n account (self-hosted or cloud version)

⸻

⚙️ Setup
1. Create a Google Sheet with the following columns:
- Product URL
- ZIP Code (used for regional price variations)
- ASIN (Amazon Standard Identification Number)
2. Extract the ASIN automatically using the following formula in the ASIN column (replace A2 with the appropriate cell reference):
=REGEXEXTRACT(A2, "/(?:dp|gp/product|product)/([A-Z0-9]{10})")
3. Obtain an API key:
- Sign in to your Bright Data account.
- Go to the API section to generate an API key.
- Create a Bearer Authentication credential using this key in your automation tool.
4. Configure the workflow:
- Use a node (e.g., "Google Sheets") to read data from your sheet.
- Use an HTTP Request node to send a query to Bright Data's Amazon API with the ASIN and ZIP code.
- Parse the returned JSON response to extract the product price and other relevant data.
- Optionally write the output (e.g., current price, timestamp) back into the sheet or another data store.

⸻

Workflow Functionality
The workflow is triggered periodically (or manually) and reads product details from your Google Sheet. For each row, it extracts the Product URL and ZIP code and sends a request to the Bright Data API. The API returns product price information, which is then logged or updated back into the sheet, matched by ASIN. You can also match results back by product URL, but make sure the URL has no appended parameters; if it does, refer to the input field from the Bright Data snapshot result instead. A sample snapshot record is sketched below.

⸻

💡 Use Cases
- E-commerce sellers monitoring competitors' prices
- Consumers tracking price drops on wishlist items
- Market researchers collecting pricing data across ZIP codes
- Affiliate marketers ensuring accurate product pricing on their platforms

⸻

🛠️ Customization
- Add columns for additional product data such as rating, seller, or stock availability
- Schedule the workflow to run hourly, daily, or weekly depending on your needs
- Implement email or Slack alerts for significant price changes
- Filter by product category or brand to narrow your tracking focus
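For orientation, a parsed snapshot record might look roughly like this; the field names are illustrative, so check an actual Bright Data snapshot for the real schema:

```json
[
  {
    "url": "https://www.amazon.com/dp/B0EXAMPLE1",
    "asin": "B0EXAMPLE1",
    "zip_code": "10001",
    "final_price": 27.99,
    "currency": "USD",
    "timestamp": "2024-05-01T08:00:00Z"
  }
]
```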