by scrapeless official
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Prerequisites

- An n8n account (free trial available)
- A Scrapeless account and API key
- A Google account to access Google Sheets

🛠️ Step-by-Step Setup

1. Create a New Workflow in n8n
Start by creating a new workflow in n8n and add a Manual Trigger node to begin.

2. Add the Scrapeless Node
- Add the Scrapeless node and choose the Scrape operation
- Paste in your API key
- Set your target website URL
- Execute the node to fetch data and verify the results

3. Clean Up the Data
Add a Code node to clean and format the scraped data, focusing on key fields like Title, Description, and URL.

4. Set Up Google Sheets
- Create a new spreadsheet in Google Sheets
- Name the sheet (e.g., Business Leads)
- Add columns like Title, Description, and URL

5. Connect Google Sheets in n8n
- Add the Google Sheets node
- Choose the Append or update row operation
- Select the spreadsheet and worksheet
- Manually map each column to the cleaned data fields

6. Run and Test the Workflow
- Click "Execute Workflow" in n8n
- Check your Google Sheet to confirm the data was inserted correctly

Results

With this automated workflow, you can continuously extract business lead data, clean it, and push it directly into a spreadsheet — perfect for outbound sales, lead lists, or internal analytics.

How to Use

- ⚙️ Open the Variables node and plug in your Scrapeless credentials.
- 📄 Confirm the Google Sheets node points to your desired spreadsheet.
- ▶️ Run the workflow manually from the Manual Trigger node.

Perfect For

- Sales teams doing outbound prospecting
- Marketers building lead lists
- Agencies running data aggregation tasks
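A minimal sketch of the cleanup logic for the Code node in step 3. The input field names (`title`, `description`, `url`) are assumptions — inspect the actual Scrapeless output and adjust the paths; in n8n you would apply this over `$input.all()`:

```javascript
// Illustrative cleanup for the Code node. Field names are assumed —
// adjust to match the real Scrapeless response shape.
function cleanItem(raw) {
  return {
    title: (raw.title || "").trim(),
    // collapse stray whitespace/newlines from scraped text
    description: (raw.description || "").replace(/\s+/g, " ").trim(),
    url: raw.url || "",
  };
}

// Example with a made-up scraped record:
const cleaned = cleanItem({
  title: "  Acme Corp  ",
  description: "Widgets\n and   gadgets",
  url: "https://example.com",
});
```

Each cleaned object maps directly onto the Title, Description, and URL columns in the sheet.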
by Floyd Mahou
How it works

• Allows users to manage their Google Calendar via WhatsApp using natural language
• Handles event creation, updates, deletions, availability checks, and agenda overviews
• An AI agent interprets the user's message and triggers the appropriate calendar action
• Responses are sent back to the user via WhatsApp with a confirmation or schedule info

Set up steps

• Set up a WhatsApp Business Cloud account and configure your webhook
• Connect your Google Calendar using n8n credentials
• Add your OpenAI API key for natural language understanding
• Link each calendar action (create, update, delete, search) to the TimePilot agent
• Customize confirmation messages and automate reply formatting

Note: More detailed configuration and custom logic are described in sticky notes within the workflow.
by Extruct AI
Who's it for

Sales teams, marketers, and analysts who need to quickly access all the social media and public profile links for any company.

How it works / What it does

When you enter a company into the form, this workflow automatically searches for and collects all available links to the company's social media accounts, review sites, and public profiles from sources like Crunchbase and ZoomInfo. All discovered URLs are added directly to your Google Sheet.

How to set up

1. Create an Extruct account at www.extruct.ai/.
2. Open the Extruct table template, find the table ID in your browser's address bar, and copy it.
3. Make a copy of the provided Google Sheets template to your own Google Drive.
4. In n8n, paste the table ID into the variables node of your flow.
5. Set up Bearer authentication in every HTTP Request node using your Extruct API token (found on the API page in Extruct).
6. In the Google Sheets node, paste the link to your copied template and connect your Google account.
7. Run the flow once to load the fields, then map the output fields to the correct columns in your sheet.
8. Activate the flow and start adding companies via the form.

Requirements

- Extruct account and API token
- Extruct table template
- Google account with Google Sheets

How to customize the workflow

You can add your own columns to the Extruct table and your Google Sheet. Just add the new column in both places and map it in the Google Sheets node in n8n.
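The Bearer authentication configured in the HTTP Request nodes boils down to one header. A small sketch of what those nodes send (the token value and any endpoint paths are placeholders, not real credentials):

```javascript
// Sketch of the auth header every HTTP Request node sends to Extruct.
// EXTRUCT_API_TOKEN is a placeholder — use the token from the API page
// in your Extruct account.
const token = process.env.EXTRUCT_API_TOKEN || "YOUR_EXTRUCT_API_TOKEN";

const headers = {
  Authorization: `Bearer ${token}`,
  "Content-Type": "application/json",
};
```

In n8n this is done once as an HTTP Header Auth credential (name `Authorization`, value `Bearer <token>`) and reused by every node.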
by Extruct AI
Who's it for

Sales and business development professionals who want to monitor company news, hiring trends, and business signals for their leads.

How it works / What it does

Add a company to the form, and the workflow will automatically search for the latest news, recent hires, company stage, and LinkedIn activity. The results are sent straight to your Google Sheet, helping you stay up to date with your leads and prospects.

How to set up

1. Register for Extruct at www.extruct.ai/.
2. Open the Extruct table template and copy the table ID from the browser's address bar.
3. Make a copy of the Google Sheets template to your Drive.
4. Enter the table ID into the variables node in your n8n flow.
5. Set up Bearer authentication in all HTTP Request nodes using your Extruct API token.
6. In the Google Sheets node, paste your template link and connect your Google account.
7. Run the flow once to load the mapping fields, then match each output to the correct column.
8. Activate the flow and start adding companies through the form.

Requirements

- Extruct account and API token
- Extruct table template
- Google account with Google Sheets

How to customize the workflow

To track more business development signals, add new columns in both the Extruct table and your Google Sheet, then map them in the Google Sheets node.
by Arthur Braghetto
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Clean Web Content Extraction with Anti-Bot Fallback

Extract clean and structured text from any webpage, with an optional fallback to an anti-bot scraping service. Ideal for AI tools and content workflows.

🧠 How it Works

This sub-workflow enables reliable and clean scraping of any public webpage by simply passing a url parameter. It is designed to be embedded into other workflows or used as a tool for AI agents. It supports two output modes:

- fulltext: true — returns { title, text } with the full page content
- fulltext: false — returns { title, url, content } with a short excerpt

💡 If the site is protected by anti-bot systems (like Cloudflare), it will automatically fall back to Scrape.do, a scraping API with a generous free plan.

🧩 This template requires the n8n-nodes-webpage-content-extractor community node, so it only works in self-hosted n8n environments.

🚀 Use Cases

- As a reusable sub-workflow, via the Execute Sub-workflow node.
- As a tool for an AI Agent, compatible with the Call n8n Workflow Tool.
- Perfect for chatbots, summarization workflows, or RSS/feed enrichment.
- Empowers your AI Agent with the ability to browse and extract readable content from websites automatically.

🔖 Parameters

- url (string): the webpage URL to scrape
- fulltext (boolean): set true for full page content, false for summarized output

⚙️ Setup

1. Install the n8n-nodes-webpage-content-extractor community node in your self-hosted n8n instance.
2. Create a free account at Scrape.do and obtain your API token.
3. In the workflow, locate the Scrape.do HTTP Request node and configure the credentials using your API token.

Detailed step-by-step instructions are available in the workflow notes. The Scrape.do API is only used as a fallback when conventional scraping fails, helping you preserve your API credits.
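The two output modes can be sketched as a small shaping function (the excerpt length of 200 characters is an arbitrary assumption; the field names come from the shapes described above):

```javascript
// Sketch of the sub-workflow's two output modes.
// `page` is the extracted page; the 200-char excerpt length is assumed.
function shapeOutput(page, fulltext) {
  if (fulltext) {
    // fulltext: true → { title, text } with full page content
    return { title: page.title, text: page.text };
  }
  // fulltext: false → { title, url, content } with a short excerpt
  return {
    title: page.title,
    url: page.url,
    content: page.text.slice(0, 200),
  };
}

const full = shapeOutput({ title: "T", url: "https://example.com", text: "body text" }, true);
const brief = shapeOutput({ title: "T", url: "https://example.com", text: "body text" }, false);
```

An AI agent calling this as a tool would typically pass `fulltext: false` first and only re-fetch with `fulltext: true` when the excerpt looks relevant.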
by Federico De Ponte
🔁 Loop & Optimize Meta Tags with Google Gemini

This workflow automates the shortening of meta titles and descriptions for SEO — directly from your Google Sheet, row by row, using Google Gemini.

✅ What it does

- Reads rows from a Google Sheet (meta_title, meta_description, row_index)
- Loops through each row and checks whether content exists
- Sends the data to Google Gemini for length-optimized output
- Cleans and parses the response
- Updates the original sheet with the shortened results

🛠️ Setup Requirements

- Google Sheets (OAuth2 credentials connected in n8n)
- Google Gemini API key (configured in n8n credentials)
- The sheet must contain: row_index, meta_title, meta_description
- Output will be written into: meta_titleFixed, meta_descriptionFixed
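The "cleans and parses the response" step is typically needed because Gemini often wraps JSON replies in a markdown code fence. A hedged sketch of that cleanup, assuming the prompt asks the model to return a JSON object with the two output fields:

```javascript
// Illustrative cleanup of a Gemini reply before writing back to the
// sheet. Assumes the model was asked for JSON; strips an optional
// markdown fence, then parses.
function parseGeminiJson(reply) {
  const stripped = reply
    .replace(/^```(?:json)?\s*/i, "")  // leading fence, if any
    .replace(/```\s*$/, "")            // trailing fence, if any
    .trim();
  return JSON.parse(stripped);
}

const out = parseGeminiJson(
  '```json\n{"meta_titleFixed":"Short title","meta_descriptionFixed":"Short description"}\n```'
);
```

The parsed `meta_titleFixed` and `meta_descriptionFixed` values are then mapped to the matching row via `row_index`.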
by David Olusola
🤖 AI-Powered Lead Enrichment with Explorium MCP & Telegram

Who it's for

Sales reps, agencies, and growth teams who want to turn basic company info into qualified leads with automated research. Perfect for B2B prospecting.

What it does

This workflow lets you send a company name or domain via Telegram and instantly returns:

✅ An enriched company profile (industry, size, tech, pain points)
✅ Clean, structured JSON — ready for your CRM or sales tools

How it works

💬 Send company info to your Telegram bot
🔎 The workflow pulls data from Explorium MCP + Tavily
🧠 AI analyzes the business model, tools, pain points & goals
📤 The JSON response is sent back via Telegram or logged to your database

Requirements

🔐 OpenAI API (GPT-4)
🧠 Explorium MCP API
🌐 Tavily Web Search API
🤖 Telegram Bot API
🗃️ PostgreSQL (for memory/logging)

How to set up

1. Add API keys in n8n
2. Connect your Telegram bot to the webhook
3. Set up PostgreSQL for memory persistence
4. Customize prompts (tone, niche, etc.)
5. Test by sending a company name via Telegram

Customization Options

🎯 Focus enrichment on specific industries or keywords
💬 Adjust the email sequence structure & style
🧩 Add extra data sources (e.g. Clearbit, Crunchbase)
🧾 Format the JSON to match your CRM schema
⚙️ Add an approval step before sending emails

Highlights

✅ Uses multi-source enrichment
✅ Works 100% from Telegram
✅ Integrates into any sales pipeline
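To make "clean, structured JSON" concrete, here is a hypothetical shape for the enriched-lead payload sent back over Telegram. The field names are illustrative only, not the workflow's exact schema — you would align them with your CRM's fields:

```javascript
// Hypothetical enriched-lead payload; field names are assumptions.
const enrichedLead = {
  company: "Acme Corp",
  domain: "acme.com",
  industry: "Manufacturing",
  size: "51-200",
  tech: ["Salesforce", "AWS"],
  painPoints: ["manual reporting"],
};

// Pretty-printed for the Telegram reply / database log.
const message = JSON.stringify(enrichedLead, null, 2);
```

Keeping the reply as strict JSON (rather than free text) is what makes it droppable into a CRM import or a PostgreSQL jsonb column without further parsing.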
by Matt Chong
Who is this for?

Freelancers, business owners, and finance teams who receive receipts via Gmail and want expenses logged automatically for tax, bookkeeping, and year-end audits.

What problem is this workflow solving?

When tax season hits, missing receipts create panic. This workflow keeps everything in one place. It uses AI to extract details from Gmail attachments, logs them in a Google Sheet, and stores the PDFs in Google Drive. No digging. No copying. Just everything where it should be.

How it works

1. Apply the label receipt to any incoming Gmail email. Do not mark it as read.
2. On a schedule (e.g. daily at 8:00 AM), the workflow triggers.
3. It searches for unread emails with the label receipt.
4. For each matching email, it downloads the attached receipt file.
5. It extracts the text content from the receipt file.
6. It uploads the original receipt file to a specified folder in Google Drive.
7. It merges the extracted text with the email metadata.
8. It sends this combined data to OpenAI, which extracts structured fields: date, merchant, category, description, subtotal, tax, total.
9. The extracted data is appended as a new row in the specified Google Sheet.
10. Finally, the email is marked as read to prevent it from being processed again.

How to set up

1. Connect these services in your n8n credentials: Gmail (OAuth2), Google Drive, Google Sheets, OpenAI.
2. Configure the Google Drive upload: in the "Upload File" node, select the target folder where you want receipt PDFs stored.
3. Set your execution schedule: open the "Schedule Trigger" node and choose when it should run (default is once daily at 8:00 AM).
4. Choose your Google Sheet and tab: in the "Append to Google Sheet" node, select your document and tab. Ensure the sheet contains these columns: Date, Merchant, Category, Description, Subtotal, Tax, Total.

How to customize this workflow to your needs?

- **Change the Gmail label or search filter** to match your needs.
- **Modify the OpenAI schema** to extract additional fields like currency, project, or notes.
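A sketch of how the model's structured output maps onto the sheet row. The receipt values are a made-up example; the field names match the columns the workflow expects:

```javascript
// Example structured output from OpenAI (values are illustrative).
const receipt = {
  date: "2024-03-14",
  merchant: "Office Depot",
  category: "Supplies",
  description: "Printer paper",
  subtotal: 18.99,
  tax: 1.52,
  total: 20.51,
};

// Row in the same order as the sheet columns:
// Date, Merchant, Category, Description, Subtotal, Tax, Total.
const row = [
  receipt.date,
  receipt.merchant,
  receipt.category,
  receipt.description,
  receipt.subtotal,
  receipt.tax,
  receipt.total,
];
```

A cheap sanity check before appending — subtotal plus tax should equal total — catches most extraction mistakes early.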
by explorium
Salesforce Lead Enrichment with Explorium Template

Download the following JSON file and import it into a new n8n workflow: salesforce_Workflow.json

Overview

This n8n workflow monitors your Salesforce instance for new leads and automatically enriches them with missing contact information. When a lead is created, the workflow:

1. Detects the new lead via the Salesforce trigger
2. Matches the lead against Explorium's database using name and company
3. Enriches the lead with professional email addresses and phone numbers
4. Updates the Salesforce lead record with the discovered contact information

This automation ensures your sales team always has the most up-to-date contact information for new leads, improving reach rates and accelerating the sales process.

Key Features

- **Real-time Processing**: Triggers automatically when new leads are created in Salesforce
- **Intelligent Matching**: Uses lead name and company to find the correct person in Explorium's database
- **Contact Enrichment**: Adds professional emails, mobile phones, and office phone numbers
- **Batch Processing**: Efficiently handles multiple leads to optimize API usage
- **Error Handling**: Continues processing other leads even if some fail to match
- **Selective Updates**: Only updates leads that successfully match in Explorium

Prerequisites

Before setting up this workflow, ensure you have:

- An n8n instance (self-hosted or cloud)
- A Salesforce account with:
  - OAuth2 API access enabled
  - Lead object permissions (read/write)
  - API usage limits available
- Explorium API credentials (Bearer token) - Get explorium api key
- A basic understanding of Salesforce lead management

Salesforce Requirements

Required Lead Fields

The workflow expects these standard Salesforce lead fields:

- FirstName - Lead's first name
- LastName - Lead's last name
- Company - Company name
- Email - Will be populated/updated by the workflow
- Phone - Will be populated/updated by the workflow
- MobilePhone - Will be populated/updated by the workflow

API Permissions

Your Salesforce integration user needs:

- Read access to the Lead object
- Write access to Lead object fields (Email, Phone, MobilePhone)
- API enabled on the user profile
- Sufficient API calls remaining in your org limits

Installation & Setup

Step 1: Import the Workflow

1. Copy the workflow JSON from the template
2. In n8n: navigate to Workflows → Add Workflow → Import from File
3. Paste the JSON and click Import

Step 2: Configure Salesforce OAuth2 Credentials

1. Click on the Salesforce Trigger node
2. Under Credentials, click Create New
3. Follow the OAuth2 flow:
   - Client ID: from your Salesforce Connected App
   - Client Secret: from your Salesforce Connected App
   - Callback URL: copy from n8n and add to your Connected App
4. Authorize the connection
5. Save the credentials as "Salesforce account connection"

Note: Use the same credentials for all Salesforce nodes in the workflow.

Step 3: Configure Explorium API Credentials

1. Click on the Match_prospect node
2. Under Credentials, click Create New (HTTP Header Auth)
3. Configure the header:
   - Name: Authorization
   - Value: Bearer YOUR_EXPLORIUM_API_TOKEN
4. Save as "Header Auth account"
5. Apply the same credentials to the Explorium Enrich Contacts Information node

Step 4: Verify Node Settings

- Salesforce Trigger — Trigger On: Lead Created; Poll Time: every minute (adjust based on your needs)
- Salesforce Get Leads — Operation: Get All; Condition: CreatedDate = TODAY (fetches today's leads); Limit: 20 (adjust based on volume)
- Loop Over Items — Batch Size: 6 (optimal for API rate limits)

Step 5: Activate the Workflow

1. Save the workflow
2. Toggle the Active switch to ON
3. The workflow will now monitor for new leads every minute

Detailed Node Descriptions

- Salesforce Trigger: Polls Salesforce every minute for new leads
- Get Today's Leads: Retrieves all leads created today to ensure none are missed
- Loop Over Items: Processes leads in batches of 6 for efficiency
- Match Prospect: Searches Explorium for a matching person using name + company
- Filter: Checks whether a valid match was found
- Extract Prospect IDs: Collects all matched prospect IDs
- Enrich Contacts: Fetches detailed contact information from Explorium
- Merge: Combines the original lead data with the enrichment results
- Split Out: Separates individual enriched records
- Update Lead: Updates Salesforce with the new contact information

Data Mapping

The workflow maps Explorium data to Salesforce fields as follows:

| Explorium Field   | Salesforce Field | Fallback Logic                  |
| ----------------- | ---------------- | ------------------------------- |
| emails[0].address | Email            | Falls back to professions_email |
| mobile_phone      | MobilePhone      | Falls back to phone_numbers[1]  |
| phone_numbers[0]  | Phone            | Falls back to mobile_phone      |

Usage & Monitoring

Automatic Operation

Once activated, the workflow runs automatically:

- Checks for new leads every minute
- Processes any leads created since the last check
- Updates leads with discovered contact information
- Continues running until deactivated

Manual Testing

To test the workflow manually:

1. Create a test lead in Salesforce
2. Click "Execute Workflow" in n8n
3. Monitor the execution to see each step
4. Verify the lead was updated in Salesforce

Monitoring Executions

Track workflow performance:

1. Go to Executions in n8n
2. Filter by this workflow
3. Review successful and failed executions
4. Check the logs for any errors or issues

Troubleshooting

Common Issues

No leads are being processed

- Verify the workflow is activated
- Check that Salesforce API limits haven't been exceeded
- Ensure new leads have FirstName, LastName, and Company populated
- Confirm the OAuth connection is still valid

Leads not matching in Explorium

- Verify company names are accurate (not abbreviations)
- Check that first and last names are properly formatted
- Some individuals may not be in Explorium's database
- Try testing with known companies/contacts

Contact information not updating

- Check Salesforce field-level security
- Verify the integration user has edit permissions
- Ensure the Email, Phone, and MobilePhone fields are writeable
- Check for validation rules blocking updates

Authentication errors

- Salesforce: re-authorize the OAuth connection
- Explorium: verify the Bearer token is valid and not expired; check that API quotas haven't been exceeded

Error Handling

The workflow includes built-in error handling:

- Failed matches don't stop other leads from processing
- Each batch is processed independently
- Failed executions are logged for review
- Partial successes are possible (some leads updated, others skipped)

Best Practices

Data Quality

- Ensure complete lead data: FirstName, LastName, and Company should be populated
- Use full company names: "Microsoft Corporation" matches better than "MSFT"
- Standardize data entry: consistent formatting improves match rates

Performance Optimization

- Adjust the batch size: lower if hitting API limits, higher for efficiency
- Modify the polling frequency: every minute for high volume, less frequent for lower volume
- Set appropriate limits: balance processing speed against API usage

Compliance & Privacy

- Data permissions: ensure you have the rights to enrich lead data
- GDPR compliance: consider the privacy regulations in your region
- Data retention: follow your organization's data policies
- Audit trail: monitor who has access to enriched data

Customization Options

Extend the Enrichment

Add more Explorium enrichment by:

- Adding firmographic data (company size, revenue)
- Including technographic information
- Appending social media profiles
- Adding job title and department verification

Modify Trigger Conditions

Change when enrichment occurs:

- Trigger on lead updates (not just creation)
- Add specific lead source filters
- Process only leads from certain campaigns
- Include lead score thresholds

Add Notifications

Enhance with alerts:

- Email sales reps when leads are enriched
- Send Slack notifications for high-value matches
- Create tasks for leads that couldn't be enriched
- Log enrichment metrics to dashboards

API Considerations

Salesforce Limits

- API calls: each execution uses ~4 Salesforce API calls
- Polling frequency: consider your daily API limit
- Batch processing: reduces API usage versus individual processing

Explorium Limits

- Match API: one call per batch of leads
- Enrichment API: one call per batch of matched prospects
- Rate limits: respect your plan's requests per minute

Integration Architecture

This workflow can be part of a larger lead management system:

- Lead Capture → This Workflow → Lead Scoring → Assignment
- Can trigger additional workflows based on enrichment results
- Compatible with existing Salesforce automation (Process Builder, Flows)
- Works alongside other enrichment tools

Security Considerations

- **Credentials**: stored securely in n8n's credential system
- **Data transmission**: uses HTTPS for all API calls
- **Access control**: limit who can modify the workflow
- **Audit logging**: all executions are logged with details

Support Resources

For assistance with:

- **n8n issues**: consult the n8n documentation or community forum
- **Salesforce integration**: reference the Salesforce API documentation
- **Explorium API**: contact Explorium support for API questions
- **Workflow logic**: review execution logs for debugging
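The fallback logic from the data-mapping table can be sketched as a single mapping function. The Explorium response shape is assumed from the field names the table shows:

```javascript
// Sketch of the Explorium → Salesforce field mapping with fallbacks,
// per the data-mapping table. `p` is an assumed Explorium prospect record.
function mapToSalesforce(p) {
  return {
    Email:
      (p.emails && p.emails[0] && p.emails[0].address) ||
      p.professions_email || null,
    MobilePhone:
      p.mobile_phone ||
      (p.phone_numbers && p.phone_numbers[1]) || null,
    Phone:
      (p.phone_numbers && p.phone_numbers[0]) ||
      p.mobile_phone || null,
  };
}

// Primary path: direct fields present.
const primary = mapToSalesforce({
  emails: [{ address: "x@y.com" }],
  mobile_phone: "555",
  phone_numbers: ["111", "222"],
});

// Fallback path: only secondary fields present.
const fallback = mapToSalesforce({
  professions_email: "a@b.com",
  phone_numbers: ["111", "222"],
});
```

Returning `null` for unresolved fields lets the Update Lead step skip them rather than overwrite existing Salesforce values with blanks.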
by Cyril Nicko Gaspar
Amazon Price Monitoring Workflow

This workflow enables you to monitor the prices of Amazon product listings directly from a Google Sheet, using data provided by Bright Data's Amazon Scraper API. It automates the retrieval of price data for specified products and is ideal for market research, competitor analysis, or personal price tracking.

✅ Requirements

Before using this template, ensure you have the following:

- A Bright Data account and access to the Amazon Scraper API
- An active API key from Bright Data
- A Google Sheet set up with the required columns
- An n8n account (self-hosted or cloud version)

⚙️ Setup

1. Create a Google Sheet with the following columns:
   - Product URL
   - ZIP Code (used for regional price variations)
   - ASIN (Amazon Standard Identification Number)
2. Extract the ASIN automatically using the following formula in the ASIN column (replace A2 with the appropriate cell reference):
   =REGEXEXTRACT(A2, "/(?:dp|gp/product|product)/([A-Z0-9]{10})")
3. Obtain an API key:
   - Sign in to your Bright Data account.
   - Go to the API section to generate an API key.
   - Create a Bearer Authentication credential using this key in your automation tool.
4. Configure the workflow:
   - Use a node (e.g., "Google Sheets") to read data from your sheet.
   - Use an HTTP Request node to send a query to Bright Data's Amazon API with the ASIN and ZIP code.
   - Parse the returned JSON response to extract the product price and other relevant data.
   - Optionally write the output (e.g., current price, timestamp) back into the sheet or another data store.

Workflow Functionality

The workflow is triggered periodically (or manually) and reads product details from your Google Sheet. For each row, it extracts the Product URL and ZIP code and sends a request to the Bright Data API. The API returns product price information, which is then logged or updated back into the sheet, keyed by ASIN. You can also map by the product URL, but ensure that the URL has no appended parameters; if it does, refer to the input field from the Bright Data snapshot result instead.

💡 Use Cases

- E-commerce sellers monitoring competitors' prices
- Consumers tracking price drops on wishlist items
- Market researchers collecting pricing data across ZIP codes
- Affiliate marketers ensuring accurate product pricing on their platforms

🛠️ Customization

- Add columns for additional product data such as rating, seller, or stock availability.
- Schedule the workflow to run hourly, daily, or weekly depending on your needs.
- Implement email or Slack alerts for significant price changes.
- Filter by product category or brand to narrow your tracking focus.
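If you prefer to derive the ASIN inside n8n rather than in the sheet, the same pattern used by the REGEXEXTRACT formula above works as a Code-node helper:

```javascript
// Same ASIN extraction as the sheet formula, as a JS helper.
// Returns the 10-character ASIN, or null if the URL doesn't match.
function extractAsin(url) {
  const m = url.match(/\/(?:dp|gp\/product|product)\/([A-Z0-9]{10})/);
  return m ? m[1] : null;
}

const asin = extractAsin("https://www.amazon.com/dp/B08N5WRWNW?th=1");
```

Because the pattern ignores anything after the ASIN, it also sidesteps the appended-parameters issue noted above.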
by Jean-Marie Rizkallah
🧩 Jamf Patch Summary to Slack

Stay on top of software patch compliance by automatically posting Jamf patch summaries to Slack. This helps IT and security teams quickly identify outdated installs and take action—without logging into Jamf.

✅ Prerequisites

• A Jamf Pro API key with permissions to read software titles and patch summaries
• A Slack app or incoming webhook URL with permission to post messages to your desired channel

🔍 How it works

• Trigger the flow manually or add a webhook
• Fetch the list of software titles from Jamf Pro
• Filter to select the software you're tracking (e.g. Chrome, Edge)
• Retrieve the patch summary for that software (latest version, up-to-date and out-of-date counts)
• Format the summary into Slack Block Kit
• Post the formatted summary to a Slack channel

⚙️ Set up steps

• Takes ~5–10 minutes to configure
• Set your server BaseURL variable in the Set node
• Add your Jamf Pro API credentials in the HTTP Request nodes (Get & Retrieve)
• Set the target software ID in the Filter node
• Add your Slack webhook URL or token in the final HTTP node
• Optional: adjust the Slack formatting inside the Function node
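An illustrative sketch of the Block Kit payload the Function node might build. The `summary` field names are assumptions based on the counts described above, not Jamf's exact response schema:

```javascript
// Build a Slack Block Kit payload for a patch summary.
// `summary` shape is assumed; map it from the real Jamf response.
function buildSlackBlocks(summary) {
  return {
    blocks: [
      {
        type: "header",
        text: { type: "plain_text", text: `Patch summary: ${summary.title}` },
      },
      {
        type: "section",
        text: {
          type: "mrkdwn",
          text:
            `*Latest version:* ${summary.latestVersion}\n` +
            `*Up to date:* ${summary.upToDate}\n` +
            `*Out of date:* ${summary.outOfDate}`,
        },
      },
    ],
  };
}

const payload = buildSlackBlocks({
  title: "Chrome",
  latestVersion: "126.0",
  upToDate: 40,
  outOfDate: 5,
});
```

The resulting object is posted as the JSON body to the Slack incoming-webhook URL.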
by Sira Ekabut
This workflow automates AI-based image generation using the Fal.ai Flux API. Define custom prompts and image parameters, then effortlessly generate, monitor, and save the output directly to Google Drive. Streamline your creative automation with ease and precision.

Who is this for?

This template is for content creators, developers, automation experts, and creative professionals looking to integrate AI-based image generation into their workflows. It's ideal for generating custom visuals with the Fal.ai Flux API and automating storage in Google Drive.

What problem is this workflow solving?

Manually generating AI-based images, checking their status, and saving the results can be tedious. This workflow automates the entire process — from requesting image generation, monitoring its progress, and downloading the result, to saving it directly to a Google Drive folder.

What this workflow does

1. Sets custom image parameters: lets you define the prompt, resolution, guidance scale, and steps for AI image generation.
2. Sends a request to Fal.ai: initiates the image generation process using the Fal.ai Flux API.
3. Monitors the image status: checks for completion and waits if needed.
4. Downloads the generated image: fetches the completed image once it is ready.
5. Saves to Google Drive: automatically uploads the generated image to a specified Google Drive folder.

Setup

1. Prerequisites:
   • Fal.ai API key: obtain it from the Fal.ai platform and set it as the Authorization header in HTTP Header Auth credentials.
   • Google Drive OAuth credentials: connect your Google Drive account in n8n.
2. Configuration — update the "Edit Fields" node with your desired image parameters:
   • Prompt: describe the image (e.g., "Thai young woman net idol 25 yrs old, walking on the street").
   • Width/Height: define the image resolution (default: 1024x768).
   • Steps: number of inference steps (e.g., 30).
   • Guidance Scale: controls how closely the image adheres to the prompt (e.g., 3.5).
   Then set your Google Drive folder ID in the "Google Drive" node to save the image.
3. Run the workflow:
   • Trigger the workflow manually to generate the image.
   • The workflow waits, checks the status, and saves the final output seamlessly.

Customization

• Modify image parameters: adjust the prompt, resolution, steps, and guidance scale in the "Edit Fields" node.
• Change the storage location: update the Google Drive node with a different folder ID.
• Add notifications: integrate an email or messaging node to alert you when the image is ready.
• Additional outputs: expand the workflow to send the generated image to Slack, Dropbox, or other platforms.

This workflow streamlines AI-based image generation and storage, offering flexibility and customization for creative automation.
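A sketch of the request body the "Edit Fields" parameters feed into. The exact property names the Flux endpoint expects are assumptions — verify them against the Fal.ai documentation before use:

```javascript
// Assumed request body for the Fal.ai Flux generation call, built from
// the "Edit Fields" parameters. Property names are illustrative —
// confirm against the Fal.ai API docs.
const body = {
  prompt: "Thai young woman net idol 25 yrs old, walking on the street",
  image_size: { width: 1024, height: 768 },   // default resolution above
  num_inference_steps: 30,                    // "Steps"
  guidance_scale: 3.5,                        // "Guidance Scale"
};

const json = JSON.stringify(body);
```

In the workflow this body is sent by the HTTP Request node with the `Authorization` header from your HTTP Header Auth credential; the response then carries a status URL that the monitoring step polls until the image is ready.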