by Malik Hashir
**Overview**

The n8n Telegram Gmail Assistant is an intelligent workflow that lets you search and retrieve specific Gmail emails simply by messaging a Telegram bot. Powered by advanced language models, it turns plain-language requests into precise Gmail searches, delivering results directly to your Telegram chat. This no-code automation is perfect for users who want instant, conversational access to their inbox—no Gmail tab required.

**Key Features**
- **Conversational Email Search:** Just message the Telegram bot with requests like “Get me all emails from Amazon” or “Show unread emails after 6 June 2025.” The assistant understands sender names, keywords, and date filters—even if you only provide part of the information.
- **AI-Powered Query Parsing:** Uses a language model (LLM) to intelligently extract sender, keywords, and date range from your message, then builds an accurate Gmail search query.
- **Flexible Filtering:** Supports sender, keywords, ‘after’ and ‘before’ dates, or any combination. Handles both specific and broad queries.
- **Instant Telegram Delivery:** Each matching email is formatted with date, sender, subject, and a snippet, and sent as a separate Telegram message for easy reading.
- **Customizable & Extendable:** Swap the AI model (Google Gemini or OpenAI), adjust output formatting, and set email limits or read status as needed.

**How It Works**
1. **User Sends a Telegram Message:** For example, “Get unread emails from Amazon about invoices after 1 June 2025.”
2. **AI Interprets the Request:** The workflow’s LLM agent extracts sender, keywords, and date filters, converting them into a Gmail search query using Gmail’s syntax (e.g., from:amazon AND (invoice OR invoices) AND after:2025/06/01). A sketch of this step follows this section.
3. **Gmail Search:** The workflow fetches all matching emails from your connected Gmail account.
4. **Message Formatting:** Each email is summarized into a concise, emoji-rich Telegram message (date, sender, subject, snippet).
5. **Telegram Delivery:** Results are sent to your Telegram chat, one message per email.

**Setup Instructions**
1. **Create a Telegram Bot:** Use @BotFather on Telegram to create a bot and obtain the API token.
2. **Connect Telegram to n8n:** Add your bot’s API token as a credential in n8n.
3. **Connect Gmail Account:** Authorize your Gmail account in n8n, set email limits, and choose read/unread status preferences.
4. **Configure AI Model:** Use your own Google Gemini or OpenAI API key, or select a preferred LLM node in the workflow.
5. **Deploy the Workflow:** Activate the workflow and start messaging your Telegram bot to retrieve emails instantly.

**Value Proposition**
- **Save Time:** No need to open Gmail or remember search operators—just ask in plain language.
- **Stay Organized:** Instantly filter and retrieve important emails, even on the go.
- **User-Friendly:** No coding required, with clear setup steps and customizable options.
- **Cost-Effective:** Available with just an n8n subscription—no extra costs or hidden fees.

Enjoy the workflow, free forever within your n8n plan.
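To make the query-building step more concrete, here is a minimal n8n Code-node-style sketch of how the LLM's parsed output could be turned into a Gmail search string. The field names (`sender`, `keywords`, `after`, `before`) are illustrative assumptions, not the template's actual schema.

```javascript
// Hypothetical Code-node sketch: assemble a Gmail search query from parsed fields.
// Field names below are illustrative assumptions, not the template's actual schema.
const { sender, keywords = [], after, before } = $input.first().json;

const parts = [];
if (sender) parts.push(`from:${sender}`);
if (keywords.length) parts.push(`(${keywords.join(" OR ")})`);
if (after) parts.push(`after:${after}`);   // Gmail expects YYYY/MM/DD
if (before) parts.push(`before:${before}`);

// e.g. "from:amazon AND (invoice OR invoices) AND after:2025/06/01"
return [{ json: { query: parts.join(" AND ") } }];
```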
by Nukeador
**Who is this for?**
BlueSky users looking to automate the publication of new posts based on new items from an RSS feed.

**What this workflow does**
This will create a BlueSky post for each new RSS feed item, including the feed title, post image, link and content (up to 200 characters).

**Setup**
- You'll need to generate a BlueSky app password
- Configure your feed URL in the first node
- Configure your credentials in the second node

**How to customize this workflow to your needs**
You can modify the message posted in the `Create post` node, changing the `JSON text` value, in case you want to include only the feed item title instead of the content. If your RSS feed doesn't provide an image, you can define a static one in the `Download image` node. A short sketch of composing the post text follows below.
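As a small illustration of the customization described above, this hedged Code-node-style sketch builds the post text from an RSS item and truncates the content to 200 characters; the property names (`title`, `content`, `link`) are assumptions about the RSS node output rather than the template's exact fields.

```javascript
// Sketch only: compose the BlueSky post text from one RSS item.
// Property names (title, content, link) are assumed RSS-node output fields.
const { title, content = "", link } = $input.first().json;

// Keep the content within the 200-character budget mentioned above.
const body = content.length > 200 ? content.slice(0, 197) + "..." : content;

return [{ json: { text: `${title}\n\n${body}\n${link}` } }];
```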
by Thomas Janssen
Build a 100% local RAG with n8n, Ollama and Qdrant. This agent uses a semantic database (Qdrant) to answer questions about PDF files.

**Tutorial**
Click here to view the YouTube Tutorial

**How it works**
Build a chatbot that answers based on documents you provide it (Retrieval-Augmented Generation). You can upload as many PDF files as you want to the Qdrant database. The chatbot will use its retrieval tool to fetch the relevant chunks and use them to answer questions.

**Installation**
- Install n8n + Ollama + Qdrant using the Self-hosted AI starter kit
- Make sure to install Llama 3.2 and mxbai-embed-large as the embedding model.

**How to use it**
1. First run the "Data Ingestion" part and upload as many PDF files as you want
2. Run the Chatbot and start asking questions about the documents you uploaded
by PixelMakers
This n8n template automates the process of capturing leads from Webflow form submissions and syncing them with your Pipedrive CRM. It ensures that each submission is accurately associated with the correct organization and contact in Pipedrive, streamlining lead management and minimizing duplicates.

Use cases include: sales teams that want to automate CRM data entry, marketing teams capturing qualified leads from landing pages, or any business looking to improve their Webflow-to-CRM integration workflow.

**Good to know**
- The workflow assumes that Webflow form submissions include the lead’s email address.
- The domain is extracted from the email to match or create the organization in Pipedrive.
- This template does not handle lead scoring or enrichment, but can be extended for such use cases.

**How it works**
1. **Extract website from email:** The email is split to extract the domain (e.g., jane@company.com → company.com), which is used to search for existing organizations (see the sketch after this section).
2. **Check if the organization exists:** The Pipedrive API is queried using the domain. If the organization exists, we proceed. If not, a new organization is created.
3. **Check if the person exists:**
   - If the person already exists in Pipedrive, a note is added to their activities to log the form submission.
   - If the person does not exist, a new person is created, a note is added to the person, and a new lead is created.
4. **(Optional) Add your own actions:** You can extend this workflow to trigger Slack notifications, email follow-ups, or internal dashboards.

**How to use**
Start with the manual trigger node, or replace it with a webhook to connect directly to Webflow form submissions in real time.

**Requirements**
- Webflow form integration (via webhook or other method)
- Pipedrive account and API key

**Customising this workflow**
You can add enrichment services to auto-fill job titles or LinkedIn profiles. Perfect for growing sales pipelines without manual CRM input.
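For clarity, here is a minimal Code-node-style sketch of the domain-extraction step. It assumes the Webflow submission arrives with an `email` field, which may differ in your form payload.

```javascript
// Minimal sketch of the "extract website from email" step.
// Assumes the incoming item has an `email` field from the Webflow submission.
const email = ($input.first().json.email || "").trim();
const domain = email.split("@")[1]?.toLowerCase() || "";

// e.g. "jane@company.com" -> "company.com", used to search organizations in Pipedrive.
return [{ json: { email, domain } }];
```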
by Shiv Gupta
**🎵 TikTok Post Scraper via Keywords | Bright Data + Sheets Integration**

**📝 Workflow Description**
Automatically scrapes TikTok posts based on keyword search using the Bright Data API and stores comprehensive data in Google Sheets for analysis and monitoring.

**🔄 How It Works**
This workflow operates through a simple, automated process:
1. **Keyword Input:** User submits search keywords through a web form
2. **Data Scraping:** The Bright Data API searches TikTok for posts matching the keywords
3. **Processing Loop:** Monitors scraping progress and waits for completion (a sketch of this pattern appears at the end of this guide)
4. **Data Storage:** Automatically saves all extracted data to Google Sheets
5. **Result Delivery:** Provides comprehensive post data including metrics, user info, and media URLs

**⏱️ Setup Information**
Estimated setup time: 10-15 minutes. This includes importing the workflow, configuring credentials, and testing the integration. Most of the process is automated once properly configured.

**✨ Key Features**
- 📝 **Keyword-Based Search:** Search TikTok posts using specific keywords
- 📊 **Comprehensive Data Extraction:** Captures post metrics, user profiles, and media URLs
- 📋 **Google Sheets Integration:** Automatically organizes data in spreadsheets
- 🔄 **Automated Processing:** Handles scraping progress monitoring
- 🛡️ **Reliable Scraping:** Uses Bright Data's professional infrastructure
- ⚡ **Real-time Updates:** Live status monitoring and data processing

**📊 Data Extracted**

| Field | Description | Example |
|-------|-------------|---------|
| url | TikTok post URL | https://www.tiktok.com/@user/video/123456 |
| post_id | Unique post identifier | 7234567890123456789 |
| description | Post caption/description | Check out this amazing content! #viral |
| digg_count | Number of likes | 15400 |
| share_count | Number of shares | 892 |
| comment_count | Number of comments | 1250 |
| play_count | Number of views | 125000 |
| profile_username | Creator's username | @creativity_master |
| profile_followers | Creator's follower count | 50000 |
| hashtags | Post hashtags | #viral #trending #fyp |
| create_time | Post creation timestamp | 2025-01-15T10:30:00Z |
| video_url | Direct video URL | https://video.tiktok.com/tos/... |

**🚀 Setup Instructions**

Step 1: Prerequisites
- n8n instance (self-hosted or cloud)
- Bright Data account with TikTok scraping dataset access
- Google account with Sheets access
- Basic understanding of n8n workflows

Step 2: Import Workflow
1. Copy the provided JSON workflow code
2. In n8n: Go to Workflows → + Add workflow → Import from JSON
3. Paste the JSON code and click Import
4. The workflow will appear in your n8n interface

Step 3: Configure Bright Data
1. In n8n: Navigate to Credentials → + Add credential → Bright Data API
2. Enter your Bright Data API credentials
3. Test the connection to ensure it's working
4. Update the workflow nodes with your dataset ID: gd_lu702nij2f790tmv9h
5. Replace BRIGHT_DATA_API_KEY with your actual API key

Step 4: Configure Google Sheets
1. Create a new Google Sheet or use an existing one
2. Copy the Sheet ID from the URL
3. In n8n: Credentials → + Add credential → Google Sheets OAuth2 API
4. Complete OAuth setup and test the connection
5. Update the Google Sheets node with your Sheet ID
6. Ensure the sheet has a tab named "Tiktok by keyword"

Step 5: Test the Workflow
1. Activate the workflow using the toggle switch
2. Access the form trigger URL to submit a test keyword
3. Monitor the workflow execution in n8n
4. Verify data appears in your Google Sheet
5. Check that all fields are populated correctly

**⚙️ Configuration Details**

Bright Data API Settings
- **Dataset ID:** gd_lu702nij2f790tmv9h
- **Discovery Type:** discover_new
- **Search Method:** keyword
- **Results per Input:** 2 posts per keyword
- **Include Errors:** true

Workflow Parameters
- **Wait Time:** 1 minute between status checks
- **Status Check:** Monitors until scraping is complete
- **Data Format:** JSON response from Bright Data
- **Error Handling:** Automatic retry on incomplete scraping

**📋 Usage Guide**

Running the Workflow
1. Access the form trigger URL provided by n8n
2. Enter your desired keyword (e.g., "viral dance", "cooking tips")
3. Submit the form to start the scraping process
4. Wait for the workflow to complete (typically 2-5 minutes)
5. Check your Google Sheet for the extracted data

Best Practices
- Use specific, relevant keywords for better results
- Monitor your Bright Data usage to stay within limits
- Regularly back up your Google Sheets data
- Test with simple keywords before complex searches
- Review extracted data for accuracy and completeness

**🔧 Troubleshooting**

🚨 Scraping Not Starting
- Verify Bright Data API credentials are correct
- Check that the dataset ID matches your account
- Ensure sufficient credits in your Bright Data account

🚨 No Data in Google Sheets
- Confirm Google Sheets credentials are authenticated
- Verify the Sheet ID is correct
- Check that the "Tiktok by keyword" tab exists

🚨 Workflow Timeout
- Increase the wait time if scraping takes longer
- Check the Bright Data dashboard for scraping status
- Verify the keyword produces available results

**📈 Use Cases**
- **Content Research:** Research trending content and hashtags in your niche to inform your content strategy.
- **Competitor Analysis:** Monitor competitor posts and engagement metrics to understand market trends.
- **Influencer Discovery:** Find influencers and creators in specific topics or industries.
- **Market Intelligence:** Gather data on trending topics, hashtags, and user engagement patterns.

**🔒 Security Notes**
- Keep your Bright Data API credentials secure
- Use appropriate Google Sheets sharing permissions
- Monitor API usage to prevent unexpected charges
- Regularly rotate API keys for better security
- Comply with TikTok's terms of service and data usage policies

**🎉 Ready to Use!**
Your TikTok scraper is now configured and ready to extract valuable data. Start with simple keywords and gradually expand your research as you become familiar with the workflow.

Need help? Visit the n8n community forum or check the Bright Data documentation for additional support and advanced configuration options. For any questions or support, please contact: Email or fill out this form
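To make the trigger-and-poll pattern more concrete, here is a plain-JavaScript sketch of what the HTTP Request and Wait nodes do together. The endpoint paths and request fields are assumptions based on Bright Data's dataset API and should be checked against your Bright Data dashboard and documentation before use.

```javascript
// Rough sketch of the trigger-then-poll pattern (not the template's exact nodes).
// Endpoint paths and body fields are assumptions; confirm them in the Bright Data docs.
const API_KEY = "BRIGHT_DATA_API_KEY"; // use an n8n credential in practice
const DATASET_ID = "gd_lu702nij2f790tmv9h";
const headers = { Authorization: `Bearer ${API_KEY}`, "Content-Type": "application/json" };

async function scrapeKeyword(keyword) {
  // 1. Trigger a keyword-discovery run (2 posts per keyword, as configured above).
  const trigger = await fetch(
    `https://api.brightdata.com/datasets/v3/trigger?dataset_id=${DATASET_ID}&type=discover_new&discover_by=keyword`,
    { method: "POST", headers, body: JSON.stringify([{ search_keyword: keyword, num_of_posts: 2 }]) }
  ).then((r) => r.json());

  // 2. Poll the progress endpoint, waiting ~1 minute between checks (the Wait node's job).
  let status = "running";
  while (status !== "ready") {
    await new Promise((res) => setTimeout(res, 60_000));
    const progress = await fetch(
      `https://api.brightdata.com/datasets/v3/progress/${trigger.snapshot_id}`,
      { headers }
    ).then((r) => r.json());
    status = progress.status;
  }

  // 3. Download the finished snapshot as JSON and hand it to the Google Sheets step.
  return fetch(
    `https://api.brightdata.com/datasets/v3/snapshot/${trigger.snapshot_id}?format=json`,
    { headers }
  ).then((r) => r.json());
}
```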
by Automate With Marc
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**🧠 Perplexity-Powered Daily AI News Digest (via Telegram)**

This ready-to-deploy n8n workflow automates the entire process of collecting, filtering, formatting, and distributing daily AI industry news summaries directly to your Telegram group or channel. Powered by Perplexity and OpenAI, it fetches only high-signal AI updates from trusted sources (e.g. OpenAI, DeepMind, HuggingFace, MIT Tech Review), filters out duplicates based on a Google Sheet archive, and delivers beautifully formatted news directly to your team — every morning at 10 AM.

For more such builds and step-by-step tutorials, check out: https://www.youtube.com/@Automatewithmarc

**🚀 Key Features**
- **Perplexity AI Integration:** Automatically fetches the most relevant AI developments from the last 24 hours.
- **AI Formatter Agent:** Cleans the raw feed, removes duplicates, adds summaries, and ensures human-friendly formatting.
- **Google Sheets Log:** Tracks previously reported news items to avoid repetition (a short sketch follows this section).
- **Telegram Delivery:** Sends a polished daily digest straight to your chat, ready for immediate team consumption.
- **Customizable Scheduling:** Preconfigured for daily use, but can be modified to fit your team's preferred cadence.

**💼 Ideal For**
Anyone who wants to stay ahead of fast-moving AI trends with zero manual effort.

**🛠️ Tech Stack**
- Perplexity AI
- OpenAI (GPT-4 or equivalent)
- Google Sheets
- Telegram API

**✅ Setup Notes**
- You’ll need to connect your own OpenAI, Perplexity, Google Sheets, and Telegram credentials.
- Replace the Google Sheet ID and Telegram channel settings with your own.
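The de-duplication step can be pictured with a short Code-node-style sketch like the one below. The referenced node name (`Google Sheets`) and the `url` column are assumptions about how the archive is structured, so adapt them to your sheet.

```javascript
// Sketch of filtering out news items already logged in the Google Sheet archive.
// The node name "Google Sheets" and the `url` column are assumptions.
const seen = new Set(
  $("Google Sheets").all().map((row) => row.json.url)
);

// Keep only items whose URL has not been reported before.
return $input.all().filter((item) => !seen.has(item.json.url));
```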
by Franz
**🧠 Sentiment Analyzer**
Google Sheets → OpenAI GPT-4o → QuickChart → Gmail

**🚀 What this workflow does**
1. Fetches customer reviews from a Google Sheet.
2. Classifies each review as Positive, Neutral or Negative with GPT-4o-mini.
3. Writes the sentiment back to your sheet.
4. Builds a doughnut chart summarising the totals.
5. Emails the chart to your chosen recipient so the whole team stays in the loop.

Perfect for support teams, product managers or anyone who wants a zero-code mood ring for their users’ feedback.

**🗺️ Node-by-node tour**

| 🔩 Node | 💡 Purpose |
| --- | --- |
| Manual Trigger | Lets you test the workflow on demand. |
| Select Google Sheet | Points to the spreadsheet that holds your reviews. |
| Loop Over Items | Feeds each row through the analysis routine. |
| Sentiment Analysis (LangChain) | Calls GPT-4o-mini and returns only the sentiment category. |
| Update Google Sheet | Writes the new Sentiment value into column C. |
| Read Data from Google Sheet | Pulls the full sheet again to create a summary. |
| Extract Number of Answers per Sentiment (Code node) | Tallies up how many reviews fall into each category (sketched after this section). |
| Generate QuickChart | Creates a doughnut (or pie) chart as a PNG. |
| Send Gmail with Sentiment Chart | Fires the chart off to your inbox. |
| (Sticky Notes) | Friendly setup tips scattered around the canvas. |

**🛠️ Setup checklist**

| ✅ Step | Where |
| --- | --- |
| Connect Google Sheets → paste your Spreadsheet ID & choose the correct sheet. | All Google Sheets nodes |
| Add OpenAI credentials (sk-… key). | Sentiment Analysis node |
| Configure Gmail OAuth2 + recipient address. | Gmail node |
| Match your sheet columns → “Review title”, “Review text”, empty “Sentiment”. | Google Sheet itself |
| (Optional) Switch to gpt-4o for maximum accuracy. | Sentiment Analysis “Model” param |

**🏃‍♂️ How to run**
1. Drop a few sample reviews into the sheet.
2. Click “Test workflow” on the Manual Trigger.
3. Watch each row march through → sentiment appears in column C.
4. After all rows finish, check your inbox for a fresh chart. ✔️

**✨ Ideas for next level**
- **Schedule** the trigger (Cron) to auto-process new reviews daily.
- Feed the counts to Slack or Discord instead of email.
- Add a second GPT call to generate a short summary for each review.

Happy automating! 🎉
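Below is a hedged sketch of what the tally-and-chart steps could look like. The `Sentiment` column name comes from the setup checklist; everything else is illustrative, and the chart config simply follows the Chart.js format that QuickChart accepts via its `c` query parameter.

```javascript
// Sketch of the "Extract Number of Answers per Sentiment" idea plus a QuickChart config.
// Only the "Sentiment" column name is taken from the checklist above; the rest is illustrative.
const counts = { Positive: 0, Neutral: 0, Negative: 0 };
for (const item of $input.all()) {
  const s = item.json.Sentiment;
  if (s in counts) counts[s]++;
}

// Chart.js-style config that QuickChart can render as a doughnut PNG.
const chart = {
  type: "doughnut",
  data: {
    labels: Object.keys(counts),
    datasets: [{ data: Object.values(counts) }],
  },
};

const chartUrl =
  "https://quickchart.io/chart?c=" + encodeURIComponent(JSON.stringify(chart));

return [{ json: { counts, chartUrl } }];
```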
by Allan Daemon
This creates a git backup of the workflows and credentials. It uses the n8n export command together with git diff, so you can run it as many times as you want; a commit is only created when there are changes.

**Setup**
You need some access to the server.

1. Create a repository in some remote place to host your project, like GitHub, GitLab, or your favorite private repo.
2. Clone the repository on the server in a place that n8n has access to. In the example, the path is `.` and the repository name is `repo`. Change it in the commands and in the workflow commands (you can set it as a variable in the workflow). Check out another branch if you won't use the master one.

```
cd .
git clone repository
```

Or you could `git init` and then add the remote (`git remote add origin YOUR_REPO_URL`), whatever pleases you more.

3. On the server, check that everything is OK for being able to commit. Very likely you'll need to set up the user email and name. Try to create a commit and push it to upstream; everything you need (like configuring a user to commit) will come up along the way. I strongly suggest also testing the export commands to guarantee they will work.

```
cd ./repo
git commit -m "Initial commit" --allow-empty
# -u is the same as --set-upstream
git push -u origin master
```

Testing the push to upstream with the first exported data:

```
npx n8n export:workflow --backup --output ./repo/workflows/
npx n8n export:credentials --backup --output repo/credentials/
cd ./repo
git add .
git commit -m "manual backup: first export"
git push
```

After that, if everything is OK, the workflow should work just fine.

**Adjustments**
Adjust the path used in the workflow. Note that `git -C PATH ...` is the same as `cd PATH; git ...`. Also, adjust the cron to run as often as you need. As said in the beginning, you can run it even every minute, but it will create commits only when there are changes.

**Credentials encryption**
By default, credentials are exported encrypted. You can add the flag `--decrypted` to the `n8n export:credentials` command if you need to save them in plain text. But as a general rule, it's better to save the encryption key (you only need to do that once) and export the credentials safely encrypted.
by Parth Pansuriya
**Automate Amazon searches to Telegram with AI-powered scraping**

This workflow connects Amazon product lookups to Telegram using AI-enhanced scraping and automation. It lets users send a product name to a Telegram bot and instantly receive pricing, discount, and product links — all pulled dynamically from Amazon.

**Who’s it for**
- Amazon affiliates
- Telegram shopping groups
- Product reviewers & resellers
- Deal-focused communities
- Anyone wanting fast price checks without browsing

**How it works**
1. Telegram Trigger receives messages from the user.
2. AI Classifier (via OpenRouter & LangChain) detects whether the user is asking for a product.
3. If yes, it sends the query to Apify's Amazon Scraper to fetch real product listings.
4. The scraped data (price, discount, rating, link) is formatted into a Telegram response (see the sketch after this section).
5. If not a product query, an AI fallback response is sent instead.

**Requirements**
- Telegram Bot Token
- Apify API Token
- OpenRouter API Key (or compatible LLM provider)
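As an illustration of step 4, here is a small Code-node-style sketch that formats one scraped product into the Telegram reply text. The field names (`title`, `price`, `listPrice`, `stars`, `url`) are assumptions about the Apify scraper's output and will likely need adjusting to the actual dataset fields.

```javascript
// Illustrative sketch: turn one scraped Apify item into the Telegram reply text.
// Field names (title, price, listPrice, stars, url) are assumptions about the scraper output.
const { title, price, listPrice, stars, url } = $input.first().json;

const discount =
  listPrice && price ? Math.round((1 - price / listPrice) * 100) + "% off" : "";

const text = [
  `🛒 ${title}`,
  `💰 ${price}${discount ? ` (${discount})` : ""}`,
  stars ? `⭐ ${stars}` : null,
  `🔗 ${url}`,
].filter(Boolean).join("\n");

return [{ json: { text } }];
```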
by Boriwat Chanruang
**Who is this for?**
- This workflow is for small business owners, personal assistants, or project managers who rely on multiple platforms for communication and scheduling.
- Ideal for users managing customer support, personal scheduling, or group event coordination via LINE, Google Calendar, and Gmail.

**What problem is this workflow solving?**
- Reduces the manual effort needed to manage conversations, schedule events, and handle email communications.
- Provides an intelligent system for replying to user messages and fetching relevant calendar or email information in real time.
- Bridges the gap between messaging platforms and productivity tools, improving efficiency.

**What this workflow does**
- **LINE Chatbot Automation**: Automatically processes and responds to messages received via LINE (a reply sketch follows this section).
- **Google Calendar Management**: Retrieves upcoming events or schedules new events dynamically based on user queries.
- **Email Retrieval**: Fetches recent emails using Gmail and filters them based on user instructions.
- **AI-Powered Replies**: Uses OpenAI GPT to interpret user queries and provide tailored responses.

**Setup**

Prerequisites:
- LINE Developer account and API access.
- Google Calendar and Gmail accounts with OAuth credentials.
- An n8n instance with access to environment variables.

Steps:
1. Set up environment variables (LINE_API_TOKEN and DYNAMIC_EMAIL).
2. Configure API credentials for Google Calendar and Gmail in n8n.
3. Test the workflow by sending a sample message via LINE.

Enhancements:
- Use sticky notes to provide inline instructions for each node.
- Include a video walkthrough or a step-by-step document for first-time users.

**How to customize this workflow to your needs**
- **Localization**: Modify responses in the AI Agent node to match the language and tone of your audience.
- **Integration**: Add more integrations like Slack or Microsoft Teams for additional notifications.
- **Advanced Filters**: Add specific conditions to Gmail or Google Calendar nodes to fetch only relevant data, such as events with specific keywords or emails from certain senders.

**Advanced Use Cases**
- **Customer Support**: Automatically schedule meetings with clients based on their messages in LINE.
- **Event Management**: Handle RSVP confirmations, event reminders, and email follow-ups for planned events.
- **Personalized Assistant**: Use the workflow to act as a personal virtual assistant that syncs your schedule, replies to messages, and summarizes emails.

**Tips for Optimization**
- **Edit Fields Node**: Add a centralized node to configure dynamic inputs (e.g., tokens, emails, or thresholds) for easy updates.
- **Fallback Responses**: Use a switch node to handle unrecognized input gracefully and provide clear feedback to users.
- **Logs and Monitoring**: Add nodes to log interactions and track message flows for debugging or analytics.
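The LINE reply itself is a single HTTP call. The sketch below shows it in plain JavaScript (in the workflow this would typically be an HTTP Request node); the endpoint and payload follow the public LINE Messaging API, while reading `LINE_API_TOKEN` from the environment mirrors the setup steps above and is otherwise an assumption about your configuration.

```javascript
// Plain-JavaScript illustration of the reply call an HTTP Request node would make.
// The endpoint and payload shape follow the LINE Messaging API; how you store the
// token (env var or n8n credential) depends on your setup.
const LINE_API_TOKEN = process.env.LINE_API_TOKEN;

async function replyToLine(webhookBody, text) {
  const replyToken = webhookBody.events[0].replyToken;
  const res = await fetch("https://api.line.me/v2/bot/message/reply", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${LINE_API_TOKEN}`,
    },
    body: JSON.stringify({
      replyToken,
      messages: [{ type: "text", text }],
    }),
  });
  return res.ok;
}
```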
by Roninimous
This workflow integrates iOS Shortcuts with n8n to create a simple, automatic location-based reminder system. When the user arrives at a specified location, an automation in the Shortcuts app sends a webhook trigger to n8n. If the trigger matches predefined date and time conditions, n8n sends a Telegram message reminder to the user. This is perfect for repetitive weekly tasks like taking out the bins, customized with conditions for day and time.

**Key Features**
- **Location-Based Trigger**: Uses iOS Shortcuts automation to start the workflow upon arrival at a specific location.
- **Time and Day Validation**: Logic in n8n checks the current weekday and time to ensure reminders are sent only when appropriate.
- **Telegram Integration**: Sends reminders directly to your Telegram account using your bot.
- **Minimal Setup**: Uses native iOS and a simple webhook setup in n8n.

**How It Works**
1. **iOS Shortcut Trigger**: When the user arrives at a designated location, the iOS shortcut sends a GET request to the n8n webhook.
2. **n8n Webhook Node**: Receives the request and triggers the workflow.
3. **Conditional Check**: An IF node checks if the current time is after 4:00 PM and it's a Wednesday (or any other configured condition). A sketch of this check follows this section.
4. **Telegram Node**: If the condition passes, n8n sends a message like "Don't forget to take the bins out." to your Telegram bot.

**Setup Instructions**
1. Create a Telegram Bot:
   - Use @BotFather to create a bot and obtain your bot token.
   - Add Telegram API credentials in n8n with your bot token.
2. Set up the iOS Shortcut:
   - Open the Shortcuts app on your iPhone.
   - Go to the Automation tab → Tap + → Create Personal Automation.
   - Choose Arrive → Select a location.
   - Add action: Get Contents of URL.
   - Method: GET, URL: your n8n Webhook URL (e.g. https://n8n.yourdomain.com/webhook/your-path).
   - Save the automation. (You can also test the automation by pressing the Play button.)
3. Import the Workflow into n8n:
   - Load the provided workflow JSON.
   - Set your webhook path and Telegram credentials.
   - Adjust the logic in the IF node to your use case. In my case, I check if today is Wednesday and it is after 4 PM until midnight.
4. Expose n8n Publicly:
   - Ensure your n8n instance is publicly accessible via HTTPS so the shortcut can reach it.

**Customization Guidance**
- **Change Reminder Message**: Modify the text inside the Telegram node to suit different reminders.
- **Add More Conditions**: Extend the logic to support more days, hours, or different trigger messages.
- **Add Multi-Channel Output**: Send reminders via email, SMS, or Slack in addition to Telegram.
- **Use More Triggers**: Expand to other types of shortcut triggers (e.g. NFC tag, leaving location, time of day).

**Security and Implementation**
- **Webhook Protection**: Avoid using easily guessable webhook URLs.
- **Secure Telegram Token**: Store your bot token securely in n8n credentials, not in plain workflow text.
- **Limit Shortcut Scope**: Only trigger the shortcut at trusted locations or with secure iCloud sync.
- **Automation Permissions**: Ensure your iPhone allows shortcut automations to run without confirmation.

**Benefits**
- Automates repetitive location-based reminders without user interaction.
- Provides a lightweight, native solution using iOS and n8n with no extra apps.
- Keeps you on track for routine tasks like garbage days, medicine reminders, or arrival-based tasks.
- Easily extendable for multiple locations or trigger conditions.
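The conditional check is simple enough to express in a few lines. This Code-node-style sketch mirrors the described condition (Wednesday, from 4 PM until midnight); it is an alternative illustration rather than the template's actual IF node, and it ignores time-zone handling.

```javascript
// Sketch of the day/time check the IF node performs, written as a Code-node alternative.
// Matches the description: Wednesday, from 4 PM until midnight. Adjust to your schedule.
const now = new Date();
const isWednesday = now.getDay() === 3;   // 0 = Sunday … 3 = Wednesday
const afterFourPm = now.getHours() >= 16; // 16:00–23:59

return [{ json: { sendReminder: isWednesday && afterFourPm } }];
```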
by Javier Hita
**Find LinkedIn Professionals with Google Search and Airtable**

**Who is this for?**
This workflow is perfect for sales professionals, recruiters, business development teams, and marketers who need to build targeted prospect lists from LinkedIn. Whether you're looking for specific job titles, industry professionals, or experts in particular locations, this template automates the tedious process of manual LinkedIn searching.

Follow me for more

**What problem is this workflow solving?**
Finding qualified prospects on LinkedIn manually is time-consuming and inefficient. Traditional methods involve:
- Manually searching LinkedIn with limited search capabilities
- Copy-pasting profile information one by one
- Struggling with LinkedIn's search limitations and restrictions
- Difficulty organizing and tracking prospect data
- No systematic way to avoid duplicate contacts

This workflow solves these challenges by leveraging Google's powerful search capabilities to find LinkedIn profiles at scale, automatically extracting key information, and organizing everything in a structured database.

**What this workflow does**
The workflow performs intelligent LinkedIn prospect discovery through these key steps:
1. **Keyword-Based Search**: Uses the Google Custom Search API to find LinkedIn profiles matching your specific criteria (job titles, industries, locations); see the API-call sketch at the end of this description.
2. **Smart Data Extraction**: Automatically parses profile titles, descriptions, URLs, and search snippets from Google results
3. **Structured Storage**: Saves all prospect data to Airtable with proper field mapping and automatic deduplication
4. **Pagination Handling**: Automatically processes multiple pages of search results to maximize prospect discovery
5. **Rate Limiting**: Includes built-in delays to respect API limits and ensure reliable operation

Key features:
- **Deduplication**: Prevents storing duplicate LinkedIn profiles
- **Batch Processing**: Handles large prospect lists efficiently
- **Customizable Search**: Easily modify keywords to target different professional segments
- **Clean Data Output**: Structured data ready for outreach campaigns

**Setup**

Prerequisites
You'll need accounts with the following services:
- **Google Cloud Console** (for the Custom Search API)
- **Airtable** (free tier works)
- **n8n** (cloud or self-hosted)

Step 1: Google Custom Search Setup
1. Go to Google Cloud Console
2. Create a new project or select an existing one
3. Enable the Custom Search API
4. Create credentials (API Key)
5. Set up a Custom Search Engine at Google CSE
6. Configure it to search the entire web
7. Copy your Search Engine ID (cx parameter)

Bonus: YouTube Set-up Guide

Step 2: Airtable Base Setup
Create a new Airtable base with a table named "LinkedIn Prospects" containing these fields:
- Title (Single line text) - LinkedIn profile headline
- linkedin_url (URL) - Direct link to LinkedIn profile
- Search (Single line text) - Original search terms used
- Description (Long text) - Profile description/summary
- Snippet (Long text) - Google search result snippet

Step 3: n8n Credentials Configuration
Set up these credentials in n8n:

Google Custom Search API:
- Type: HTTP Query Auth
- Name: Google Query Auth
- Query Parameter Name: key
- Value: Your Google API key

Airtable:
- Type: Airtable Personal Access Token
- Token: Your Airtable personal access token
- Configure the base and table IDs in the Airtable node

Step 4: Workflow Configuration
1. Import this workflow template
2. Update the "⚙️ CUSTOMIZE YOUR SEARCH KEYWORDS HERE" node with your target keywords
3. Configure the Airtable node with your base and table information
4. Test the workflow with a small keyword set first

**How to customize this workflow to your needs**

Targeting Different Industries
Modify the search keywords in the yellow configuration node:

```
// For technology professionals
"Software Engineer React"
"Product Manager SaaS"
"Data Scientist Machine Learning"

// For sales professionals
"Account Executive Enterprise"
"Sales Director B2B"
"Business Development Manager"

// For marketing professionals
"Digital Marketing Manager"
"Content Marketing Specialist"
"Growth Marketing Lead"
```

Geographic Targeting
Add location keywords to narrow your search:

```
"Marketing Manager London"
"Sales Director New York"
"Software Engineer Berlin"
```

Company Size Targeting
Include company type indicators:

```
"CFO Startup"
"VP Engineering Fortune 500"
"Marketing Director SMB"
```

Adjusting Search Volume
Modify the Maxresults parameter in the "Configure Search Settings" node:
- Set to 10 for quick tests
- Set to 50-100 for comprehensive searches
- Maximum recommended: 100 per search to respect API limits

Industry-Specific Customization

For Recruiters:
- Target specific job titles and seniority levels
- Add skills-based keywords ("Python Developer", "React Specialist")
- Include experience indicators ("Senior", "Lead", "Principal")

For Sales Teams:
- Focus on decision-maker titles ("Director", "VP", "C-Level")
- Target specific company sizes or industries
- Include location-based searches for territory management

For Marketers:
- Search for industry influencers and thought leaders
- Target specific professional communities
- Look for content creators and industry experts

Advanced Filtering
Add conditional logic after the search results to filter prospects based on:
- Profile description keywords
- Title patterns
- Company information (when available in snippets)

Integration Extensions
Connect additional tools to enhance your prospect research:
- **Email finder tools** (Hunter.io, Apollo) for contact discovery
- **CRM integration** (HubSpot, Salesforce) for automatic lead creation
- **Enrichment services** (Clearbit, ZoomInfo) for additional prospect data
- **Slack/Teams notifications** for real-time prospect alerts

Data Quality Improvements
Enhance the workflow with additional processing:
- **Duplicate detection** across multiple search terms
- **Profile validation** to ensure active LinkedIn profiles
- **Keyword scoring** to rank prospect relevance
- **Export formatting** for specific CRM requirements

This template provides a solid foundation that can be adapted for virtually any B2B prospect research need, making it an essential tool for modern sales and marketing teams.
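For reference, here is a plain-JavaScript sketch of the Google Custom Search call the workflow relies on, including `start`-based pagination and mapping results onto the Airtable fields listed above. The `site:linkedin.com/in` restriction is a common way to limit results to profile pages and is an assumption here, as are the environment-variable names.

```javascript
// Sketch of the Custom Search request the HTTP node makes; adjust the query to taste.
// GOOGLE_API_KEY, CSE_ID and the site: restriction are placeholders/assumptions.
const GOOGLE_API_KEY = process.env.GOOGLE_API_KEY;
const CSE_ID = process.env.CSE_ID;

async function searchLinkedInProfiles(keywords, page = 1) {
  const params = new URLSearchParams({
    key: GOOGLE_API_KEY,
    cx: CSE_ID,
    q: `site:linkedin.com/in "${keywords}"`,
    start: String((page - 1) * 10 + 1), // CSE returns 10 results per page
  });

  const res = await fetch(`https://www.googleapis.com/customsearch/v1?${params}`);
  const data = await res.json();

  // Keep the fields the Airtable table expects: Title, linkedin_url, Snippet, Search.
  return (data.items || []).map((item) => ({
    Title: item.title,
    linkedin_url: item.link,
    Snippet: item.snippet,
    Search: keywords,
  }));
}
```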