by Ranjan Dailata
## Who this is for
The Structured Data Extract & Data Mining workflow is crafted for researchers, content analysts, SEO strategists, and AI developers who need to transform semi-structured web data (like markdown content or scraped HTML) into actionable structured datasets. It is ideal for:
- **Content Analysts** - Organizing and mining large volumes of markdown or HTML content.
- **SEO & Trend Researchers** - Exploring topics by location and category.
- **AI Engineers & NLP Developers** - Looking to automate insight extraction from unstructured inputs.
- **Growth Marketers** - Tracking topic-level trends for strategic campaigns.
- **Automation Specialists** - Streamlining workflows from scrape to storage.

## What problem is this workflow solving?
Extracting insights from markdown or HTML documents typically requires manual review, formatting, and parsing. This becomes unscalable when dealing with large datasets or when real-time responses are needed. Additionally, trend and topic extraction usually involves external tools, custom scripts, and inconsistent formatting. This workflow solves:
- Automatic text extraction from markdown or structured content.
- Location- and category-based trend mining with semantic grouping.
- AI-driven topic extraction and summarization.
- Real-time notification via webhook with rich structured payloads.
- Persistent storage of mined data to disk for audits or further processing.

## What this workflow does
1. Receives input: sets the URL for the data extraction and analysis.
2. Uses Bright Data's Web Unlocker to extract content from relevant sites.
3. A Markdown/Text Extractor node parses the content into clean plaintext.
4. The cleaned data is passed to Google Gemini to:
   - Identify trends by location and category
   - Extract key topics and themes
   - Format the response into structured JSON
5. The structured insights are sent via Webhook Notification to external systems (e.g., Slack, web apps, Zapier).
6. The final output is saved to disk.

## Setup
1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure a Header Auth credential under Credentials (Generic Auth Type: Header Authentication). Set the Value field to `Bearer XXXXXXXXXXXXXX`, replacing `XXXXXXXXXXXXXX` with your Web Unlocker token.
4. Add a Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set URL and Bright Data Zone node with the brand content URL and the Bright Data zone name.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

## How to customize this workflow to your needs
- **Update Source**: Update the workflow input to read from Google Sheets or Airtable to dynamically track multiple brands or topics.
- **Gemini Prompt Customization**:
  - Extract trends within a custom category (e.g., e-commerce design patterns in the US)
  - Output topics with popularity metrics
  - Structure the output to match your database schema (e.g., `[{ topic, trend_score, location }]`)
- **Webhook Output**: Send notifications to:
  - Slack - with AI summaries in rich blocks
  - Internal APIs - for use in dashboards
  - Zapier/Make - for multi-step automation
- **Persistence**: Save output to:
  - Remote FTP or SFTP storage
  - Amazon S3, Google Cloud Storage, etc.
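For reference, here is a minimal sketch of the webhook notification step as an n8n Code node. The payload field names (`trends`, `topics`) and the endpoint URL are illustrative assumptions; shape them to match the JSON schema you give Gemini.

```javascript
// n8n Code node: a minimal sketch of the webhook notification step.
// The payload shape is an assumption for illustration only.
const insights = $json; // structured JSON produced by the Gemini node

const payload = {
  source_url: insights.url,
  mined_at: new Date().toISOString(),
  trends: insights.trends ?? [], // e.g. [{ topic, trend_score, location }]
  topics: insights.topics ?? [], // e.g. ["pricing", "sustainability"]
};

// Hypothetical endpoint; replace with your Slack/Zapier/internal webhook URL.
await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://example.com/webhooks/trend-mining',
  body: payload,
  json: true,
});

return [{ json: payload }];
```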
by Gavin
This template gives you the ability to monitor all uplinks in your Meraki Dashboard and then alert your team through whatever method you prefer. This example sends a Teams notification to our Dispatch channel. Setup will probably take around 30 minutes to 1 hour with the provided template. The most time-intensive steps are getting a Meraki API key (which I go over) and setting up the Teams node, which n8n has good documentation for.

Tutorial & explanation: https://www.youtube.com/watch?v=JvaN0dNwRNU
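As a rough illustration of the polling step, the sketch below (an n8n Code node) calls what I understand to be the Meraki Dashboard API v1 appliance uplink statuses endpoint and keeps only devices with a non-active uplink; verify the endpoint, auth header, and field names against the Meraki documentation before relying on it.

```javascript
// A minimal sketch, assuming the Meraki Dashboard API v1 uplink statuses
// endpoint. Org ID and API key are placeholders.
const orgId = 'YOUR_ORG_ID';

const statuses = await this.helpers.httpRequest({
  method: 'GET',
  url: `https://api.meraki.com/api/v1/organizations/${orgId}/appliance/uplink/statuses`,
  headers: { 'X-Cisco-Meraki-API-Key': 'YOUR_MERAKI_API_KEY' },
  json: true,
});

// Keep only devices with at least one uplink that is not active, so the
// Teams node downstream only alerts on problems.
const down = statuses.filter((device) =>
  (device.uplinks ?? []).some((u) => u.status !== 'active')
);

return down.map((device) => ({ json: device }));
```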
by phil
# AI-Powered SEO Keyword Research Workflow with n8n
> Automates comprehensive keyword research for content creation

## Table of Contents
- Introduction
- Workflow Architecture
- NocoDB Integration
- Data Flow
- Core Components
- Setup Requirements
- Possible Improvements

## Introduction
This n8n workflow automates SEO keyword research using AI and data-driven analytics. It combines OpenAI's language models with DataForSEO's analytics to generate comprehensive keyword strategies for content creation. The workflow is triggered by a webhook from NocoDB, processes the input data through multiple stages, and returns a detailed content brief with optimized keywords.

## Workflow Architecture
The workflow follows a structured process:
1. Input Collection: receives data via webhook from NocoDB
2. Topic Expansion: generates keywords using AI
3. Keyword Metrics Analysis: gathers search volume, CPC, and difficulty metrics
4. Competitor Analysis: analyzes competitor content for ranking keywords
5. Final Strategy Creation: combines all data to generate a comprehensive keyword strategy
6. Output Storage: saves results back to NocoDB and sends notifications

## NocoDB Integration
### Database Structure
The workflow integrates with two tables in NocoDB:

#### Input Table Schema
This table collects the input parameters for the keyword research:

| Field Name | Type | Description |
| --- | --- | --- |
| ID | Auto Number | Unique identifier |
| Primary Topic | Text | The main keyword/topic to research |
| Competitor URLs | Text | Comma-separated list of competitor websites |
| Target Audience | Single Select | Description of the target audience (Solopreneurs, Marketing Managers, etc.) |
| Content Type | Single Select | Type of content (Blog, Product page, etc.) |
| Location | Single Select | Target geographic location |
| Language | Single Select | Target language for keywords |
| Status | Single Select | Workflow status (Pending, Started, Done) |
| Start Research | Checkbox | Activates the workflow when set to true |

#### Output Table Schema
This table stores the generated keyword strategy:

| Field Name | Type | Description |
| --- | --- | --- |
| ID | Auto Number | Unique identifier |
| primary_topic_used | Text | The topic that was researched |
| report_content | Long Text | The complete keyword strategy in Markdown format |
| generatedAt | Datetime | Automatically generated by NocoDB |

### Webhook Settings
*NocoDB webhook settings*

## Data Flow
The workflow handles data in the following sequence:
1. Webhook Trigger: receives input from NocoDB when a new keyword research request is created
2. Field Extraction: extracts primary topic, competitor URLs, audience, and other parameters
3. AI Topic Expansion: uses OpenAI to generate related keywords, categorized by type and intent
4. Keyword Analysis: sends primary keywords to DataForSEO to get search volume, CPC, and difficulty
5. Competitor Research: analyzes competitor pages to identify their keyword rankings
6. Strategy Generation: combines all data to create a comprehensive keyword strategy
7. Storage & Notification: saves the strategy to NocoDB and sends a notification to Slack

## Core Components
### 1. Topic Expansion
This component uses OpenAI and a structured output parser to generate:
- 20 primary keywords
- 30 long-tail keywords with search intent
- 15 question-based keywords
- 10 related topics
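To make the Topic Expansion output concrete, here is a sketch of the kind of structure the output parser might enforce; all field names are illustrative assumptions and should match the JSON schema configured in your own parser node.

```javascript
// n8n Code node: an illustrative example of the Topic Expansion output
// shape. Field names are assumptions, not the template's exact schema.
const exampleTopicExpansion = {
  primary_keywords: ['ai automation', 'workflow automation software' /* ...20 total */],
  long_tail_keywords: [
    { keyword: 'ai automation for small business', intent: 'commercial' },
    // ...30 total, each with a search-intent label
  ],
  question_keywords: ['what is ai automation?' /* ...15 total */],
  related_topics: ['business process automation' /* ...10 total */],
};

return [{ json: exampleTopicExpansion }];
```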
### 2. DataForSEO Integration
Two API endpoints are used:
- **Search Volume & CPC**: gets monthly search volume and cost-per-click data
- **Keyword Difficulty**: evaluates how difficult it would be to rank for each keyword

(See the request sketch at the end of this entry.)

### 3. Competitor Analysis
This component:
- Analyzes competitor URLs to identify which keywords they rank for
- Identifies content gaps or opportunities
- Determines the search intent their content targets

### 4. Final Keyword Strategy
The AI-generated strategy includes:
- Top 10 primary keywords with metrics
- 15 long-tail opportunities with low competition
- 5 question-based keywords to address in content
- Content structure recommendations
- 3 potential content titles optimized for SEO

## Setup Requirements
To use this workflow, you'll need:
- n8n Instance: either cloud or self-hosted
- NocoDB Account: for data input and storage
- API Keys:
  - OpenAI API key
  - DataForSEO API credentials
  - Slack API token (for notifications)
- Database Setup: create the required tables in NocoDB as described above

## Possible Improvements
The workflow could be enhanced with the following improvements:

**Enhanced Keyword Strategy**
- Add topic clustering to group related keywords
- Enhance the final output with more specific content structure suggestions
- Include word count recommendations for each content section

**Additional Data Sources**
- Integrate Google Search Console data for existing content optimization
- Add Google Trends data to identify rising topics
- Include sentiment analysis for different keyword groups

**Improved Competitor Analysis**
- Analyze content length and structure from top-ranking pages
- Identify common backlink sources for competitor content
- Extract content headings to better understand content organization

**Automation Enhancements**
- Add scheduling capabilities to run updates on existing content
- Implement content performance tracking over time
- Create alert thresholds for changes in keyword difficulty or search volume

## Example Output
Here is an example output the workflow generated based on the following inputs.

Inputs:
- Primary Topic: AI Automation
- Competitor URLs: n8n.io, zapier.com, make.com
- Target Audience: Small Business Owners
- Content Type: Landing Page
- Location: United States
- Language: English

Output: Final Keyword Strategy

The workflow provides a powerful automation for content marketers and SEO specialists to develop data-driven keyword strategies with minimal manual effort.

> Original Workflow: AI-Powered SEO Keyword Research Automation - The vibe Marketer
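For the DataForSEO step described under Core Components, a request might look like the following n8n Code node sketch. The endpoint path, basic-auth scheme, and body shape follow DataForSEO's v3 live endpoints as I understand them; treat them as assumptions and confirm against the official DataForSEO docs.

```javascript
// A sketch, assuming DataForSEO's v3 search-volume live endpoint.
// Login/password placeholders come from your DataForSEO account.
const keywords = $json.primary_keywords ?? ['ai automation'];

const response = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://api.dataforseo.com/v3/keywords_data/google_ads/search_volume/live',
  auth: { username: 'DATAFORSEO_LOGIN', password: 'DATAFORSEO_PASSWORD' },
  body: [{ keywords, location_name: 'United States', language_name: 'English' }],
  json: true,
});

// Each result item should carry search volume and CPC for one keyword.
const metrics = response.tasks?.[0]?.result ?? [];
return metrics.map((m) => ({ json: m }));
```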
by Dr. Firas
👉 Build a Phone Agent to qualify outbound leads and schedule inbound calls

## Who is this for?
This workflow is designed for sales teams, call centers, and businesses handling both outbound and inbound lead calls who want to automate their qualification, follow-up, and call documentation process without manual intervention. It's ideal for teams using Google Sheets, RetellAI, OpenAI, and Gmail as part of their tech stack.

## Real-World Use Cases
- 🛍 E-commerce - instantly handle product FAQs and order status checks, 24/7.
- 🏬 Retail Stores - share store hours, directions, and return policies without lifting a finger.
- 🍽 Restaurants - take reservations or answer menu questions automatically.
- 💼 Service Providers - book appointments or consultations while you focus on your craft.
- 📞 Any Local Business - deliver friendly, consistent phone support, no live agent required.

## What problem is this workflow solving?
Managing lead calls at scale can be chaotic: scheduling outbound qualification calls, handling inbound appointment requests, and making sure every call is documented and followed up. This workflow automates the entire process, reducing human error and saving time by:
✅ Sending reminders to reps for outbound calls
✅ Automatically placing calls with RetellAI
✅ Handling inbound calls and checking caller details
✅ Generating and emailing call summaries automatically

## What this workflow does
This n8n template connects Google Sheets, RetellAI, OpenAI, and Gmail into a seamless workflow:

**Outbound Lead Qualification Workflow**
1. Triggers when a new lead is added to Google Sheets
2. Sends an SMS notification to remind the rep to call in 5 minutes
3. (Optional) Waits 5 minutes
4. Initiates an automated call to the lead via RetellAI

**Inbound Call Appointment Scheduler**
1. Receives inbound calls from RetellAI (via webhook)
2. Checks if the caller's number exists in Google Sheets
3. Responds to RetellAI with a success or error message (see the sketch at the end of this entry)

**Post-Call Workflow**
1. Receives post-call data from RetellAI
2. Filters only analyzed calls
3. Updates the lead's record in Google Sheets
4. Uses OpenAI to generate a call summary
5. Emails the summary to a team inbox or rep

## Setup
✅ You need an active RetellAI API key: sign up for RetellAI, create an agent, and set the webhook URLs (`n8n_call` for call events). Purchase a Twilio phone number and link it to the agent.
✅ Your Google Sheet must have a column for phone numbers (e.g., "Phone")
✅ Gmail account connected and authorized in n8n
✅ OpenAI API key added to your environment variables or credentials

1. Configure your Google Sheets node with the correct spreadsheet ID and range
2. Add your RetellAI API key to the HTTP Request nodes
3. Connect your Gmail account in the Gmail node
4. Add your OpenAI key in the OpenAI node

👉 See the full setup guide here: Notion Documentation

## How to customize this workflow to your needs
- **Change SMS content**: edit the text in the "Send SMS reminder" node to match your team's tone
- **Modify call wait time**: enable and adjust the "Wait 5 minutes" node to any delay you prefer
- **Add CRM integration**: replace or extend the Google Sheets node to update your CRM instead of a spreadsheet
- **Customize call summary prompts**: edit the prompt sent to OpenAI to change the summary style or add extra insights
- **Send email to different recipients**: change the recipient address in the Gmail node or make it dynamic from the lead record

Need help customizing? Contact me for consulting and support: LinkedIn
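The inbound caller check could be implemented along these lines in an n8n Code node. The node name `Webhook`, the incoming field `from_number`, and the `success`/`message` response keys are all hypothetical; match them to what your RetellAI agent actually sends and expects.

```javascript
// A minimal sketch, assuming the caller's number arrives in the RetellAI
// webhook body and leads were loaded from Google Sheets by the previous
// node. Response field names are hypothetical placeholders.
const caller = $('Webhook').first().json.body?.from_number;
const leads = $input.all().map((item) => item.json);

// Look the caller up against the "Phone" column described in Setup.
const match = leads.find((lead) => lead.Phone === caller);

return [{
  json: match
    ? { success: true, message: `Known lead: ${match.Name ?? caller}` }
    : { success: false, message: 'Caller not found in lead sheet' },
}];
```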
by Don Jayamaha Jr
📈 Get daily and on-demand Tesla (TSLA) trading signals via Telegram, powered by GPT-4.1 and real-time market data. This is the central AI supervisor that orchestrates seven sub-agents for technical analysis, price pattern recognition, and news sentiment. Reports are delivered in structured, Telegram-ready HTML, optimized for traders seeking fast, intelligent decision-making signals.

⚠️ This master agent requires 7 connected sub-workflows to function. One of them, the News & Sentiment Agent, also requires a DeepSeek Chat API key for language processing.

## 🔌 Required Sub-Workflows
You must download and publish the following workflows:
1. Tesla Financial Market Data Analyst Tool
2. Tesla News and Sentiment Analyst Tool (requires a DeepSeek Chat API key)
3. Tesla 15min Indicators Tool
4. Tesla 1hour Indicators Tool
5. Tesla 1day Indicators Tool
6. Tesla 1hour & 1day Klines Tool
7. Tesla Quant Technical Indicators Webhooks Tool (requires an Alpha Vantage Premium API key)

📍 See all tools at: 🔗 https://n8n.io/creators/don-the-gem-dealer/

## 🔍 What This Agent Does
- Listens to your trading query via Telegram
- Calls the Financial Analyst and News & Sentiment Analyst
- These agents aggregate:
  - RSI, MACD, BBANDS, SMA, EMA, ADX
  - Candlestick pattern + volume divergence analysis
  - News summaries and sentiment scoring via DeepSeek Chat
- GPT-4.1 composes the final structured TSLA trade report with:
  - Spot and leverage setups
  - Signal rationale
  - Confidence score
  - Sentiment tag
  - News summary

## 🧠 Output Example
TSLA Trading Report (Daily Summary)

Spot Trade
• Action: Buy
• Entry: 172.45
• TP: 182.00
• SL: 169.80
• Signal: RSI bounce + Bullish Engulfing
• Sentiment: Neutral

Leveraged Position
• Position: Long
• Leverage: 3x
• TP: 186
• SL: 170
• Confidence: High (83/100)

📰 Top News
• Tesla Model Y delivery surge - Electrek
• Options market pricing in upside - Bloomberg
• FSD delayed in Canada - TeslaNorth

## 🛠️ Setup Instructions
1. Import All 8 Workflows: ensure all sub-agents above are published in your n8n instance.
2. Create Your Telegram Bot: use @BotFather to generate the token and connect it to the trigger/send nodes.
3. Connect OpenAI GPT-4.1: add your OpenAI credentials for GPT-4.1 in the designated node.
4. Add DeepSeek Chat API Key: sign up at https://deepseek.com and insert your DeepSeek Chat credentials in the News Agent.
5. Add Alpha Vantage Premium API Key: sign up at https://www.alphavantage.co/premium/ and use HTTP Header Auth for the webhook-based indicator fetchers.
6. Replace Telegram ID: update the placeholder <<replace your ID here>> with your actual Telegram numeric ID in the auth node.

## 📌 Included Sticky Notes
✅ Telegram Bot Setup
✅ Agent Routing & Memory
✅ Financial vs. Sentiment Trigger Flow
✅ Report Formatting (HTML)
✅ API Requirements (GPT-4.1, DeepSeek, Alpha Vantage)
✅ Troubleshooting & Licensing

## 🧾 Licensing & Attribution
© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding permitted.
🔗 For support: LinkedIn - Don Jayamaha

🚀 Deploy the Tesla Quant Trading AI system with GPT-4.1, DeepSeek Chat, and Alpha Vantage Premium, right into Telegram. All 8 workflows are required.

🎥 Tesla Quant AI Agent - Live Demo: experience the power of the Tesla Quant Trading AI Agent in action.
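To illustrate the HTML delivery step described above, here is a minimal sketch that posts a report to the Telegram Bot API with `parse_mode: 'HTML'`. The `sendMessage` method and `parse_mode` parameter are standard Telegram Bot API; the report text is a shortened stand-in for the full GPT-4.1 output.

```javascript
// n8n Code node: a minimal sketch of sending a Telegram-ready HTML report.
// Telegram only allows a small set of HTML tags (<b>, <i>, <code>, <pre>, <a>).
const token = 'YOUR_BOT_TOKEN';
const report =
  '<b>TSLA Trading Report (Daily Summary)</b>\n' +
  '<b>Spot Trade</b>\n' +
  'Action: Buy | Entry: 172.45 | TP: 182.00 | SL: 169.80\n' +
  '<i>Signal: RSI bounce + Bullish Engulfing</i>';

await this.helpers.httpRequest({
  method: 'POST',
  url: `https://api.telegram.org/bot${token}/sendMessage`,
  body: {
    chat_id: 'YOUR_TELEGRAM_ID', // the numeric ID from the auth node
    text: report,
    parse_mode: 'HTML',
  },
  json: true,
});

return [{ json: { delivered: true } }];
```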
by Marian Tcaciuc
# Manage Calendar with Voice & Text Commands using GPT-4, Telegram & Google Calendar

This n8n workflow transforms your Telegram bot into a personal AI calendar assistant, capable of understanding both voice and text commands in Romanian, and managing your Google Calendar using the GPT-4 model via LangChain. Whether you want to create, update, fetch, or delete events, you can simply speak or write your request to your Telegram bot, and the assistant takes care of the rest.

## 🚀 Features
- Voice command support using Telegram voice messages (.ogg)
- Transcription using OpenAI Whisper
- Natural language understanding with GPT-4 via LangChain
- Google Calendar integration:
  - ✅ Create events
  - 🔁 Update events
  - ❌ Delete events
  - 📅 Fetch events
- Responses sent back via Telegram

## 🛠️ Step-by-Step Setup Instructions
1. Create a Telegram Bot
   - Go to @BotFather on Telegram.
   - Send /newbot and follow the instructions.
   - Save the bot token.
2. Configure the Telegram Trigger Node
   - Paste the Telegram token into the Telegram Trigger and Telegram nodes.
   - Set updates to ["message"].
3. Set up OpenAI Credentials
   - Get an OpenAI API key from https://platform.openai.com
   - Create a credential in n8n for OpenAI. This is used for both transcription and AI reasoning.
4. Set up Google Calendar
   - In Google Cloud Console: enable the Google Calendar API, set up OAuth2 credentials, and add your n8n redirect URI (usually https://yourdomain/rest/oauth2-credential/callback)
   - Create a credential in n8n using Google Calendar OAuth2
   - Grant access to your calendar (e.g., the "Family" calendar).

## ⚙️ Customization Options
- 🗣️ Change Language or Locale: the transcription node uses "en" for English. Change it to another locale if needed (a transcription sketch follows at the end of this entry).
- ✏️ Edit Prompt: modify the prompt in the AI Agent node to include your name, work schedule, or specific behavior expectations.
- 📆 Change Calendar Logic: adjust time ranges or filters in the Get Events node, and add custom logic before Create Event (e.g., validation, conflict checks).

## 📚 Helpful Tips
- Make sure n8n has HTTPS enabled to receive Telegram updates.
- Test the flow first using only text, then voice.
- Use AI memory or vector stores (like Supabase) if you want context-aware planning in the future.
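For orientation, this standalone Node.js (18+) sketch shows the transcription call the workflow's OpenAI node performs: posting the downloaded `.ogg` voice file to OpenAI's `/v1/audio/transcriptions` endpoint with the `whisper-1` model. The file path is a placeholder.

```javascript
// A minimal sketch of the Whisper transcription step, outside n8n.
import { readFile } from 'node:fs/promises';

const audio = await readFile('./voice_message.ogg'); // placeholder path

const form = new FormData();
form.append('file', new Blob([audio], { type: 'audio/ogg' }), 'voice_message.ogg');
form.append('model', 'whisper-1');
form.append('language', 'en'); // switch locale, e.g. 'ro' for Romanian

const res = await fetch('https://api.openai.com/v1/audio/transcriptions', {
  method: 'POST',
  headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
  body: form,
});

const { text } = await res.json();
console.log(text); // the transcript handed to the GPT-4 agent
```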
by Belgacem Dhiflaoui
## Description

### What Problem Does This Solve? 🛠️
This workflow automates the process of extracting key information from resumes received as email attachments and storing that data in a structured format within a Supabase database. It eliminates the manual effort of reviewing each resume, identifying relevant details, and entering them into a database. This streamlines the hiring process, making it faster and more efficient for recruiters and HR professionals.

Target audience: recruiters, HR departments, and talent acquisition teams.

### What Does It Do? 🌟
- Monitors a designated email inbox for new messages with resume attachments.
- Extracts key information such as name, contact details, education, work experience, and skills from the attached resumes.
- Cleans and formats the extracted data.
- Stores the processed data securely in a Supabase database.

### Key Features 📋
- Automatic email monitoring for resume attachments.
- Intelligent data extraction from various resume formats (e.g., PDF, DOC, DOCX).
- Customizable data fields to capture specific information.
- Seamless integration with Supabase for data storage.
- Uses OpenRouter to streamline API key management for services such as AI-powered parsing.

## Setup Instructions

### Prerequisites ⚙️
- **n8n Instance**: self-hosted or cloud instance of n8n.
- **Email Account**: Gmail account with Gmail API access for receiving resumes.
- **Supabase Account**: a Supabase project with a database/table ready to store extracted resume data. You'll need the Supabase URL and API key.
- **OpenRouter Account**: for managing AI model API keys centrally when using LLM-based resume parsing.

### Installation Steps 📦
1. Import the Workflow:
   - Copy the exported workflow JSON.
   - Import it into your n8n instance via "Import from File" or "Import from URL".
2. Configure Credentials. In n8n > Credentials, add credentials for:
   - Email account (Gmail API): provide the Client ID and Client Secret from the Google Cloud Platform.
   - Supabase: provide the Supabase URL and the anon public API key.
   - OpenRouter (optional): add your OpenRouter API key for use with any AI-powered resume parsing nodes.
   Assign these credentials to their respective nodes:
   - Gmail Trigger → Email credentials.
   - Supabase Insert → Supabase credentials.
   - AI Parsing Node → OpenRouter credentials.
3. Set Up the Supabase Table:
   - Create a table in Supabase with columns such as: name, email, phone, education, experience, skills, received_date, etc.
   - Make sure the field names align with the structure used in your workflow.
4. Customize Nodes:
   - **Parsing Node(s)**: optionally modify the workflow to use an OpenAI model directly for field extraction, replacing the **Basic LLM Chain** node that utilizes OpenRouter.
5. Test the Workflow:
   - Send a test email with a resume attachment.
   - Check n8n's execution log to confirm the workflow triggered, parsed the data, and inserted it into Supabase.
   - Verify data integrity in your Supabase table.

## How It Works

### High-Level Workflow 🔍
1. Email Monitoring: triggered when a new email with an attachment is received (via the Gmail API).
2. Attachment Check: verifies the email contains at least one attachment.
3. Prepare Data: extracts the attachment and prepares it for analysis.
4. Data Extraction: uses an OpenRouter-powered LLM (if configured) to extract structured information from the resume.
5. Data Storage: the structured information is saved into the Supabase database (see the insert sketch at the end of this entry).

### Node Names and Actions (Example)
- **Gmail Trigger**: triggers when a new email is received.
- **IF**: checks whether the received email includes any attachments.
- **Get Attachments**: retrieves attachments from the triggering email.
- **Prepare Data**: prepares the attachment content for processing.
- **Basic LLM Chain**: uses an AI model via OpenRouter to extract relevant resume data and returns it as structured fields.
- **Supabase-Insert**: inserts the structured resume data into your Supabase database.
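Here is a sketch of the Supabase insert, assuming a table named `resumes` whose columns mirror the extracted fields. The project URL, key, and table name are placeholders, and the call uses Supabase's standard PostgREST endpoint.

```javascript
// n8n Code node: a minimal sketch of the final storage step.
const resume = $json; // structured fields from the LLM extraction node

await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://YOUR_PROJECT.supabase.co/rest/v1/resumes', // placeholder
  headers: {
    apikey: 'SUPABASE_ANON_KEY',
    Authorization: 'Bearer SUPABASE_ANON_KEY',
    Prefer: 'return=minimal',
  },
  body: {
    name: resume.name,
    email: resume.email,
    phone: resume.phone,
    education: resume.education,
    experience: resume.experience,
    skills: resume.skills,
    received_date: new Date().toISOString(),
  },
  json: true,
});

return [{ json: { stored: true } }];
```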
by Andrew
## Who is this for?
This workflow is ideal for n8n self-hosted users, DevOps engineers, and automation developers who want to automatically back up their n8n workflows to GitHub on a regular basis.

## What problem is this workflow solving?
Manually backing up n8n workflows is time-consuming and prone to human error. This workflow automates the backup process, ensuring that all workflows are safely stored in a version-controlled GitHub repository every 24 hours.

## What this workflow does
This automation runs daily to back up all workflows from your n8n instance to a specified GitHub repository. Each workflow is saved as a .json file using its unique ID, organized into a folder path defined by repo_path. The workflow is designed to manage memory usage efficiently by recursively calling itself. Once the backup is complete, it optionally sends a Slack notification to confirm success. A sketch of the export step follows below.

## Setup
1. Configure the Config node in the subworkflow to set:
   - GitHub repo owner
   - GitHub repo name
   - Main folder path (repo_path)
2. Connect your GitHub and (optionally) Slack credentials.
3. Set the workflow to run on a daily cron schedule.
4. Test the workflow manually to confirm the GitHub integration works.

Sign up for a free consultation and find out how n8n can help you.
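The export step might look like this sketch, assuming the n8n public API is enabled on your instance (the `X-N8N-API-KEY` header and `/api/v1/workflows` endpoint are part of the documented n8n API). The file path layout is an assumption meant to mirror the repo_path convention described above.

```javascript
// n8n Code node: a minimal sketch of listing workflows for backup.
const base = 'https://your-n8n-instance.example.com'; // placeholder

const { data: workflows } = await this.helpers.httpRequest({
  method: 'GET',
  url: `${base}/api/v1/workflows`,
  headers: { 'X-N8N-API-KEY': 'YOUR_N8N_API_KEY' },
  json: true,
});

// One file per workflow, named by its unique ID, ready for the GitHub node.
return workflows.map((wf) => ({
  json: {
    path: `workflows/${wf.id}.json`, // repo_path + file name (assumed layout)
    content: JSON.stringify(wf, null, 2),
  },
}));
```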
by victor de coster
# Smartlead to HubSpot Performance Analytics
A streamlined workflow to analyze your Smartlead performance metrics by tracking lifecycle stages in HubSpot and generating automated reports.

## Who is this for?
(Outbound) automation agencies, and sales and marketing teams using Smartlead for outreach campaigns who want to track their performance metrics and lead progression in HubSpot.

## What problem does this workflow solve?
Manual tracking of lead performance across Smartlead and HubSpot is time-consuming and error-prone. This workflow automates performance reporting by connecting your Smartlead data with HubSpot lifecycle stages, providing clear insights into your outreach campaign effectiveness.

## What this workflow does
- Automatically pulls performance data from your Smartlead campaigns
- Cross-references contact status with HubSpot lifecycle stages
- Generates comprehensive performance reports in Google Sheets
- Provides customizable reporting schedules to match your team's needs

## Setup Requirements
1. PostgreSQL Database
   - Set up your PostgreSQL instance (includes $300 free GCP credits)
   - Find a step-by-step setup guide here
2. Google Account Integration
   - Connect your Google Account to n8n (find the guide here)
3. Smartlead Configuration
   - Connect your Smartlead instance: a detailed connection guide is included in the workflow

## How to customize this workflow
- Configure the Trigger node to adjust report frequency
- Modify the Google Sheets template to match your specific KPIs
- Customize the HubSpot lifecycle stage mapping in the Function node (see the sketch below)
- Adjust the PostgreSQL queries to track additional metrics

Need assistance or have suggestions? Let me know here
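As an example of the lifecycle-stage mapping, the Function node could contain something like the sketch below. The Smartlead status names and HubSpot stage values are illustrative assumptions to adapt to your own pipeline definitions.

```javascript
// A minimal sketch of mapping Smartlead statuses to HubSpot lifecycle
// stages. Both sides of the map are assumptions for illustration.
const stageMap = {
  replied: 'lead',
  interested: 'marketingqualifiedlead',
  meeting_booked: 'salesqualifiedlead',
  won: 'customer',
};

return $input.all().map((item) => {
  const status = (item.json.smartlead_status ?? '').toLowerCase();
  return {
    json: {
      ...item.json,
      hubspot_lifecycle_stage: stageMap[status] ?? 'subscriber',
    },
  };
});
```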
by Javier Hita
## Who is this for?
This workflow is perfect for sales teams, business development professionals, recruitment agencies, and fractional CFO service providers who need to identify and qualify companies actively hiring. Whether you're prospecting for new clients, building a database of potential customers, or researching market opportunities, this automated solution saves hours of manual research while delivering high-quality, AI-analyzed leads.

## What problem is this workflow solving?
Finding qualified prospects in the finance sector is time-consuming and often inefficient. Traditional methods involve:
- Manually browsing LinkedIn job postings for hours
- Difficulty distinguishing between genuine opportunities and recruitment spam
- Inconsistent lead categorization and qualification
- Risk of contacting the same companies multiple times
- Lack of structured data for sales team follow-up

This workflow automates the entire lead generation process, from data collection to AI-powered qualification, ensuring you focus only on the most promising opportunities.

## What this workflow does
This comprehensive lead generation system performs six key functions:
1. **Automated LinkedIn Job Scraping**: uses Apify's reliable LinkedIn Jobs Scraper to extract detailed job postings for finance positions, including company information, job descriptions, and contact details.
2. **Smart Data Processing**: removes duplicates, filters companies by size, and structures data for consistent analysis across all leads.
3. **Intelligent Lead Categorization**: compares new leads against your existing database to optimize processing and avoid duplicate work (see the dedup sketch after this list).
4. **AI-Powered Qualification**: leverages OpenAI's GPT-4 Mini to analyze each lead and determine:
   - Company Category: Consumer companies, Fractional CFO services, Recruiting agencies, or Other
   - Finance Role Validation: confirms the position is genuinely finance-related
   - Seniority Level: Entry, Mid, Senior, Director, or C-Level classification
   - Job Summary: a concise description for quick sales team review
5. **Automated Database Management**: stores qualified leads in Airtable with comprehensive profiles, preventing duplicates while maintaining data integrity.
6. **Lead Scoring & Routing**: prioritizes leads based on processing status and qualification results for efficient sales team follow-up.
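The dedup sketch referenced above might look like this in an n8n Code node. The node name "Get Existing Leads" and the Airtable field "Job Link" come from the workflow description, while the incoming field names (`jobLink`, `companyEmployees`) are illustrative assumptions.

```javascript
// A minimal sketch of dedup + size filtering against existing Airtable leads.
const existing = new Set(
  $('Get Existing Leads').all().map((i) => i.json['Job Link'])
);

const fresh = [];
for (const item of $input.all()) {
  const job = item.json;
  if (job.companyEmployees > 200) continue; // maxEmployees filter (default 200)
  if (existing.has(job.jobLink)) continue;  // skip already-processed leads
  fresh.push({ json: { ...job, processingStatus: 'new' } });
}

return fresh;
```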
## Setup

### Prerequisites
You'll need accounts for three services:
- **Airtable** (free tier supported) - for lead storage and management
- **Apify** (14-day free trial available) - for LinkedIn job scraping
- **OpenAI** (pay-per-use) - for AI-powered lead analysis

### Step 1: Create Required Credentials
**Apify API Credential**
1. Sign up for an Apify account at apify.com
2. Navigate to Settings → Integrations → API tokens
3. Create a new API token
4. In n8n, create a new Apify API credential with your token

**OpenAI API Credential**
1. Create an account at platform.openai.com
2. Generate an API key in the API section
3. In n8n, create a new OpenAI credential with your key

**Airtable Personal Access Token**
1. Go to airtable.com/create/tokens
2. Create a personal access token with the following scopes:
   - data.records:read
   - data.records:write
   - schema.bases:read
3. In n8n, create a new Airtable Personal Access Token credential

### Step 2: Set Up Airtable Base
Create a new Airtable base with the following structure:

Table name: Qualified Leads

Required fields:
- Company Name (Single line text)
- Job Title (Single line text)
- Is Finance Job (Checkbox)
- Seniority Level (Single select: Entry, Mid, Senior, Director, C-Level)
- Company Category (Single select: Consumer, Recruiting, Fractional CFO, Other)
- Job Summary (Long text)
- Company LinkedIn (URL)
- Job Link (URL)
- Posted Date (Date)
- Location (Single line text)
- Industry (Single line text)
- Company Employees (Number)

### Step 3: Configure the Workflow
1. Import the Workflow: copy the JSON and import it into your n8n instance
2. Update Credentials: replace placeholder credential IDs with your actual credential IDs in:
   - "Scrape LinkedIn Jobs" node (Apify credential)
   - "OpenAI GPT-4 Mini" node (OpenAI credential)
   - "Save to Airtable" and "Get Existing Leads" nodes (Airtable credential)
3. Configure the Airtable Connection: update the base ID and table ID in both Airtable nodes
4. Set Search Parameters: in the "Edit Variables" node, configure:
   - linkedinUrls: your target LinkedIn job search URLs
   - maxEmployees: maximum company size filter (default: 200)
   - batchSize: processing batch size for API efficiency (default: 5)

### Step 4: Test the Workflow
1. Start with a small test by setting count: 50 in the HTTP Request node
2. Use a specific LinkedIn job search URL (e.g., "CFO jobs in New York")
3. Execute the workflow manually and verify the results in your Airtable base
4. Review the AI categorization accuracy and adjust prompts if needed

## How to customize this workflow to your needs

### Targeting Different Roles
Modify the LinkedIn search URLs in the "Edit Variables" node to target different positions:
- "https://www.linkedin.com/jobs/search/?keywords=Controller"
- "https://www.linkedin.com/jobs/search/?keywords=Finance%20Director"
- "https://www.linkedin.com/jobs/search/?keywords=VP%20Finance"

### Adjusting Company Size Filters
Change the maxEmployees parameter to focus on different company segments:
- Startups: 1-50 employees
- SMBs: 51-500 employees
- Enterprise: 500+ employees

### Customizing AI Analysis
Enhance the GPT-4 prompt in the "AI Lead Analyzer" node to include:
- Industry-specific criteria
- Geographic preferences
- Technology stack requirements
- Company growth stage indicators

### Integration Options
Extend the workflow by adding:
- **Slack notifications** for new qualified leads
- **Email alerts** for high-priority prospects
- **CRM integration** (Salesforce, HubSpot, Pipedrive)
- **Lead enrichment** with additional data sources

### Scheduling Automation
Set up the workflow to run automatically:
- **Daily**: for active prospecting campaigns
- **Weekly**: for ongoing market research
- **Monthly**: for periodic database updates
## Performance & Cost Optimization
- **API Efficiency**: the workflow processes leads in batches to optimize API usage
- **Smart Deduplication**: avoids re-processing existing leads to reduce costs
- **Configurable Limits**: adjust batch sizes and employee count filters based on your needs
- **Expected Costs**: approximately $0.05-$0.20 per 100 analysed leads (OpenAI costs)

## Troubleshooting
Common issues:
- **Rate Limiting**: increase delays between API calls if you encounter rate limits
- **Data Quality**: review LinkedIn search URLs for relevance to your target market
- **AI Accuracy**: adjust prompts if categorisation doesn't match your criteria
- **Airtable Errors**: verify field names match exactly between the workflow and base structure

Support resources:
- Apify LinkedIn Scraper Documentation
- OpenAI API Documentation
- Airtable API Reference

Transform your lead generation process with this powerful, AI-driven workflow that delivers qualified prospects ready for immediate outreach.
by InfraNodus
This template can be used to find the content gaps in your competitors' discourse: identifying the topics they are not yet connecting and giving you an opportunity to fill in this gap with your content and product ideas. It will also generate research questions that will help bridge the gaps and generate new ideas.

The template showcases the use of multiple n8n nodes and processes:
- enriching a Google Sheets file with new data
- data extraction
- content enhancement using the GraphRAG approach
- content gap / research question generation

This approach can be very useful for research, marketing, and SEO applications, as you can quickly get an overview of the main topics that are available online for a certain niche and understand what is missing.

## What are Content Gaps in Marketing and SEO?
In the context of SEO, content gaps are usually understood as the topics that your competitors rank for but you do not. However, it's hard to rank for these topics because there's very high competition. A much more effective way is to identify the gaps between the topics your competitors are talking about that are not yet bridged in their discourse. If you address these gaps in your content, you will increase the informational gain your content offers and provide a novel perspective, while still touching upon the topics that are relevant in your field.

For example, if we analyze the top websites for "body and physical practices, fitness, etc.", we will see that most of them talk about the health and fitness aspects, and another big topic is the community aspect. However, there is a gap between the two topics, which means that most of the websites (companies) in this niche don't mention the two in the same context. This might be an opportunity: bridging the gap between health and fitness while also emphasizing the community aspect that comes with a collective practice.

## How it works
This template consists of two stages:
1. Data enrichment of a Google Sheets file with a list of your competitors, using InfraNodus GraphRAG to generate topical summaries and graph summaries for every URL you're analyzing.
2. Insight generation, using InfraNodus to identify the main topical clusters and gaps in those summaries; these insights are then added to the Google Sheets file.

Additionally, it contains a sub-workflow that you can activate and launch to ask a Perplexity model to conduct market research, find the companies that operate in your field, and populate the original Google Sheets file.

Here's a description step by step:
- Step 0: Populate the Google Sheets file with the company data (either manually, using the sub-workflow provided, or via Manus AI / Deep Research)
- Steps 1-2: Trigger and launch the workflow, extracting the company URL from the Google Sheets row
- Step 3: Scrape the URL content from the companies' websites and clean the data
- Steps 5-7: Use the InfraNodus GraphRAG Content Enhancer to get a topical summary and graph summary
- Steps 8-10: Use InfraNodus AI to generate insight advice and research questions based on the content gaps

## How to use
You need an InfraNodus GraphRAG API account and key to use this workflow.
1. Create an InfraNodus account.
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes.
3. Create a separate knowledge graph for each expert (using the PDF / content import options) in InfraNodus.
4. For each graph, go to the workflow and paste the name of the graph into the body name field.
5. Keep the other settings intact, or learn more about them on the InfraNodus access points page.
6. Once you add one or more graphs as experts to your flow, add the LLM key to the OpenAI node and launch the workflow.

## Requirements
- An InfraNodus account and API key
- A Google Sheets account and an authorization key

Note: an OpenAI key is not required. But you might want to get a Perplexity AI key if you'd like to use the sub-workflow that populates the Google Sheet with your competitors' website addresses (if you don't have this list yet).

## Customizing this workflow
You can use this same workflow with a Telegram bot or Slack (to be notified of the summaries and ideas). You can also hook up automated social media content creation workflows at the end of this template, so you can generate posts that are relevant (covering the important topics in your niche) but also novel (because they connect them in a new way).

- Check out our n8n templates for ideas at https://n8n.io/creators/infranodus/
- Check out the complete guide at https://support.noduslabs.com/hc/en-us/articles/20234254556828-Find-Content-Gaps-in-Websites-Market-Research-and-SEO-n8n-Workflow
- Also check the full tutorial with a conceptual explanation at https://support.noduslabs.com/hc/en-us/articles/20454382597916-Beat-Your-Competition-Target-Their-Content-Gaps-with-this-n8n-Automation-Workflow
- Also check out the video tutorial with a demo.

For support and help with this workflow, please contact us at https://support.noduslabs.com
by PollupAI
## Who is this for?
This workflow is ideal for individuals focused on nutrition tracking, meal planning, or diet optimization, whether you're a health-conscious individual, a fitness coach, or a developer working on a healthtech app. It also fits anyone who wants to capture their meal data via voice or text without manually entering everything into a spreadsheet.

## What problem is this workflow solving?
Manually logging meals and breaking down their nutritional content is time-consuming and often skipped. This workflow automates that process using Telegram for input, OpenAI for natural language understanding, and Google Sheets for structured tracking. It enables users to record meals by typing or sending voice messages, which are transcribed, analyzed for nutrients, and automatically stored for tracking and review.

## What this workflow does
This n8n automation lets users send either a text or voice message to a Telegram bot describing their meal. The workflow then:
1. Receives the Telegram message
2. Checks if it's a voice message
   - If yes: downloads the audio file and transcribes it using OpenAI
   - If no: uses the text input directly
3. Sends the meal description to OpenAI to extract a structured list of ingredients and nutritional details (see the parsing sketch below)
4. Parses and stores the results in Google Sheets
5. Responds via Telegram with a personalized confirmation message

A testing interface also allows you to simulate prompts and view structured outputs for development or debugging.

## Setup
1. Create a Telegram bot via BotFather and note the API token.
2. Create an empty Google Sheet and store the sheet ID in the environment.
3. Set up your OpenAI credentials in the n8n credential manager.
4. Customize the "List of Ingredients and Nutrients" node with your prompt if needed.
5. (Optional) Use the "Testing" section to simulate messages and refine outputs before going live.

## How to customize this workflow to your needs
- Enhance the prompts in the OpenAI node to improve the structure and accuracy of responses.
- Add new fields in the Google Sheet and corresponding logic in the parser if you want more detail.
- Adjust the Telegram response to provide motivational feedback, dietary tips, or summaries.
- Upgrade to the "Pro" version mentioned in the contact section for USDA database integration and complete nutrient breakdowns.

This is a lightweight, AI-powered meal logging automation that transforms voice or text into actionable nutrition data, perfect for making healthy eating easier and more data-driven.

See my other workflows here
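Here is a minimal sketch of the parsing step between OpenAI and Google Sheets, assuming the model was prompted to return strict JSON with an `ingredients` array; all field names are illustrative and should match your prompt and sheet columns.

```javascript
// n8n Code node: a sketch of turning the model's JSON reply into one sheet
// row per ingredient. The input path and field names are assumptions.
const meal = JSON.parse($json.message?.content ?? $json.text);

return (meal.ingredients ?? []).map((ing) => ({
  json: {
    logged_at: new Date().toISOString(),
    ingredient: ing.name,
    quantity: ing.quantity,   // e.g. "150 g"
    calories: ing.calories,
    protein_g: ing.protein_g,
    carbs_g: ing.carbs_g,
    fat_g: ing.fat_g,
  },
}));
```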