by Gavin
This template monitors all uplinks on your Meraki Dashboard and then alerts your team through the method you prefer; this example posts a Teams notification to our Dispatch channel. Setup will probably take around 30 minutes to 1 hour with the provided template. The most time-intensive steps are getting a Meraki API key, which I go over, and setting up the Teams node, which n8n has good documentation for. Tutorial & explanation: https://www.youtube.com/watch?v=JvaN0dNwRNU
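For orientation, the uplink check at the heart of the template boils down to one call against the Meraki Dashboard API. A minimal sketch in TypeScript, assuming the v1 appliance uplink statuses endpoint and a placeholder organization ID (the template itself does this with an HTTP Request node):

```typescript
// Sketch: list Meraki MX uplinks that are not "active".
// ORG_ID is a hypothetical placeholder; the API key comes from the environment.
const MERAKI_KEY = process.env.MERAKI_API_KEY!;
const ORG_ID = "123456";

async function findDownUplinks(): Promise<void> {
  const res = await fetch(
    `https://api.meraki.com/api/v1/organizations/${ORG_ID}/appliance/uplink/statuses`,
    { headers: { "X-Cisco-Meraki-API-Key": MERAKI_KEY } },
  );
  const networks: Array<{
    networkId: string;
    uplinks: Array<{ interface: string; status: string }>;
  }> = await res.json();

  for (const net of networks) {
    for (const uplink of net.uplinks) {
      if (uplink.status !== "active") {
        // In the template, this is the point where the Teams notification fires.
        console.log(`Network ${net.networkId}: ${uplink.interface} is ${uplink.status}`);
      }
    }
  }
}

findDownUplinks().catch(console.error);
```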
by phil
# AI-Powered SEO Keyword Research Workflow with n8n

> Automates comprehensive keyword research for content creation

## Table of Contents

- Introduction
- Workflow Architecture
- NocoDB Integration
- Data Flow
- Core Components
- Setup Requirements
- Possible Improvements

## Introduction

This n8n workflow automates SEO keyword research using AI and data-driven analytics. It combines OpenAI's language models with DataForSEO's analytics to generate comprehensive keyword strategies for content creation. The workflow is triggered by a webhook from NocoDB, processes the input data through multiple stages, and returns a detailed content brief with optimized keywords.

## Workflow Architecture

The workflow follows a structured process:

1. **Input Collection**: Receives data via webhook from NocoDB
2. **Topic Expansion**: Generates keywords using AI
3. **Keyword Metrics Analysis**: Gathers search volume, CPC, and difficulty metrics
4. **Competitor Analysis**: Analyzes competitor content for ranking keywords
5. **Final Strategy Creation**: Combines all data to generate a comprehensive keyword strategy
6. **Output Storage**: Saves results back to NocoDB and sends notifications

## NocoDB Integration

### Database Structure

The workflow integrates with two tables in NocoDB.

#### Input Table Schema

This table collects the input parameters for the keyword research:

| Field Name | Type | Description |
| --------------- | ------------- | --------------------------------------------------------------------------- |
| ID | Auto Number | Unique identifier |
| Primary Topic | Text | The main keyword/topic to research |
| Competitor URLs | Text | Comma-separated list of competitor websites |
| Target Audience | Single Select | Description of the target audience (Solopreneurs, Marketing Managers, etc.) |
| Content Type | Single Select | Type of content (Blog, Product page, etc.) |
| Location | Single Select | Target geographic location |
| Language | Single Select | Target language for keywords |
| Status | Single Select | Workflow status (Pending, Started, Done) |
| Start Research | Checkbox | Activates the workflow when set to true |

#### Output Table Schema

This table stores the generated keyword strategy:

| Field Name | Type | Description |
| ------------------ | ----------- | ------------------------------------------------ |
| ID | Auto Number | Unique identifier |
| primary_topic_used | Text | The topic that was researched |
| report_content | Long Text | The complete keyword strategy in Markdown format |
| generatedAt | Datetime | Automatically generated by NocoDB |

### Webhook Settings

*NocoDB webhook settings (screenshot)*

## Data Flow

The workflow handles data in the following sequence:

1. **Webhook Trigger**: Receives input from NocoDB when a new keyword research request is created
2. **Field Extraction**: Extracts primary topic, competitor URLs, audience, and other parameters
3. **AI Topic Expansion**: Uses OpenAI to generate related keywords, categorized by type and intent
4. **Keyword Analysis**: Sends primary keywords to DataForSEO to get search volume, CPC, and difficulty
5. **Competitor Research**: Analyzes competitor pages to identify their keyword rankings
6. **Strategy Generation**: Combines all data to create a comprehensive keyword strategy
7. **Storage & Notification**: Saves the strategy to NocoDB and sends a notification to Slack

## Core Components

### 1. Topic Expansion

This component uses OpenAI and a structured output parser to generate:

- 20 primary keywords
- 30 long-tail keywords with search intent
- 15 question-based keywords
- 10 related topics
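As a rough sketch of what this component's OpenAI call could look like in plain code (the model name and prompt wording are illustrative assumptions, not the workflow's actual prompt):

```typescript
// Sketch: ask OpenAI for a structured keyword expansion. The category counts
// mirror the component description above; everything else is assumed.
const OPENAI_KEY = process.env.OPENAI_API_KEY!;

async function expandTopic(topic: string) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${OPENAI_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // illustrative model choice
      response_format: { type: "json_object" },
      messages: [
        {
          role: "user",
          content:
            `For the topic "${topic}", return JSON with keys ` +
            `primary_keywords (20), long_tail_keywords (30, each with search intent), ` +
            `question_keywords (15) and related_topics (10).`,
        },
      ],
    }),
  });
  const data = await res.json();
  return JSON.parse(data.choices[0].message.content);
}
```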
### 2. DataForSEO Integration

Two API endpoints are used:

- **Search Volume & CPC**: Gets monthly search volume and cost-per-click data
- **Keyword Difficulty**: Evaluates how difficult it would be to rank for each keyword

### 3. Competitor Analysis

This component:

- Analyzes competitor URLs to identify which keywords they rank for
- Identifies content gaps or opportunities
- Determines the search intent their content targets

### 4. Final Keyword Strategy

The AI-generated strategy includes:

- Top 10 primary keywords with metrics
- 15 long-tail opportunities with low competition
- 5 question-based keywords to address in content
- Content structure recommendations
- 3 potential content titles optimized for SEO

## Setup Requirements

To use this workflow, you'll need:

1. **n8n Instance**: Either cloud or self-hosted
2. **NocoDB Account**: For data input and storage
3. **API Keys**:
   - OpenAI API key
   - DataForSEO API credentials
   - Slack API token (for notifications)
4. **Database Setup**: Create the required tables in NocoDB as described above

## Possible Improvements

The workflow could be enhanced with the following improvements:

### Enhanced Keyword Strategy
- Add topic clustering to group related keywords
- Enhance the final output with more specific content structure suggestions
- Include word count recommendations for each content section

### Additional Data Sources
- Integrate Google Search Console data for existing content optimization
- Add Google Trends data to identify rising topics
- Include sentiment analysis for different keyword groups

### Improved Competitor Analysis
- Analyze content length and structure from top-ranking pages
- Identify common backlink sources for competitor content
- Extract content headings to better understand content organization

### Automation Enhancements
- Add scheduling capabilities to run updates on existing content
- Implement content performance tracking over time
- Create alert thresholds for changes in keyword difficulty or search volume

## Example Output

Here is an example output the workflow generated based on the following inputs.

Inputs:

- Primary Topic: AI Automation
- Competitor URLs: n8n.io, zapier.com, make.com
- Target Audience: Small Business Owners
- Content Type: Landing Page
- Location: United States
- Language: English

Output: Final Keyword Strategy

The workflow provides a powerful automation for content marketers and SEO specialists to develop data-driven keyword strategies with minimal manual effort.

> Original Workflow: AI-Powered SEO Keyword Research Automation - The Vibe Marketer
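For readers wiring up the DataForSEO step (section 2 above), here is a hedged sketch of the live search-volume call. The endpoint path and Basic auth follow DataForSEO's v3 conventions, but the task body shape is an assumption to verify against their docs:

```typescript
// Sketch: fetch search volume & CPC for a batch of keywords from DataForSEO.
// Credentials and the location/language values are placeholders.
const auth = Buffer.from(
  `${process.env.DFS_LOGIN}:${process.env.DFS_PASSWORD}`,
).toString("base64");

async function searchVolume(keywords: string[]) {
  const res = await fetch(
    "https://api.dataforseo.com/v3/keywords_data/google_ads/search_volume/live",
    {
      method: "POST",
      headers: {
        Authorization: `Basic ${auth}`,
        "Content-Type": "application/json",
      },
      // DataForSEO v3 expects an array of task objects.
      body: JSON.stringify([
        {
          keywords,
          location_name: "United States",
          language_name: "English",
        },
      ]),
    },
  );
  return res.json(); // tasks[0].result holds volume/CPC per keyword
}
```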
by victor de coster
# Smartlead to HubSpot Performance Analytics

*A streamlined workflow to analyze your Smartlead performance metrics by tracking lifecycle stages in HubSpot and generating automated reports.*

## Who is this for?

(Outbound) automation agencies, and sales and marketing teams using Smartlead for outreach campaigns who want to track their performance metrics and lead progression in HubSpot.

## What problem does this workflow solve?

Manual tracking of lead performance across Smartlead and HubSpot is time-consuming and error-prone. This workflow automates performance reporting by connecting your Smartlead data with HubSpot lifecycle stages, providing clear insights into your outreach campaign effectiveness.

## What this workflow does

- Automatically pulls performance data from your Smartlead campaigns
- Cross-references contact status with HubSpot lifecycle stages
- Generates comprehensive performance reports in Google Sheets
- Provides customizable reporting schedules to match your team's needs

## Setup Requirements

1. **PostgreSQL Database**: Set up your PostgreSQL instance (includes $300 free GCP credits). Find a step-by-step guide here.
2. **Google Account Integration**: Connect your Google Account to n8n. Find the guide here.
3. **Smartlead Configuration**: Connect your Smartlead instance. A detailed connection guide is included in the workflow.

## How to customize this workflow

- Configure the Trigger node to adjust report frequency
- Modify the Google Sheets template to match your specific KPIs
- Customize HubSpot lifecycle stage mapping in the Function node (see the sketch after this entry)
- Adjust PostgreSQL queries to track additional metrics

Need assistance or have suggestions? Let me know here.
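The lifecycle-stage mapping mentioned above could look something like this inside the Function node. HubSpot's default internal stage names are used on the right, but the Smartlead statuses on the left are assumptions; adjust both sides to your own pipeline:

```typescript
// Sketch: map a Smartlead lead status to a HubSpot lifecycle stage.
// The Smartlead status strings are hypothetical examples.
const stageMap: Record<string, string> = {
  contacted: "lead",
  replied: "marketingqualifiedlead",
  meeting_booked: "salesqualifiedlead",
  won: "customer",
};

export function toLifecycleStage(smartleadStatus: string): string {
  // Fall back to HubSpot's default entry stage for anything unmapped.
  return stageMap[smartleadStatus.toLowerCase()] ?? "subscriber";
}
```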
by Incrementors
# Google Play Review Intelligence with Bright Data & Telegram Alerts

## Overview

This n8n workflow automates the process of scraping Google Play Store reviews, analyzing app performance, and sending alerts for low-rated applications. It integrates with Bright Data for web scraping, Google Sheets for data storage, and Telegram for notifications.

## Workflow Components

### 1. Trigger Input Form
- **Type:** Form Trigger
- **Purpose:** Initiates the workflow with user input
- **Input Fields:**
  - URL (Google Play Store app URL)
  - Number of reviews to fetch
- **Function:** Captures user requirements to start the scraping process

### 2. Start Scraping Request
- **Type:** HTTP Request (POST)
- **Purpose:** Sends scraping request to Bright Data API
- **Endpoint:** https://api.brightdata.com/datasets/v3/trigger
- **Parameters:**
  - Dataset ID: gd_m6zagkt024uwvvwuyu
  - Include errors: true
  - Limit multiple results: 5
- **Custom Output Fields:** url, review_id, reviewer_name, review_date, review_rating, review, app_url, app_title, app_developer, app_images, app_rating, app_number_of_reviews, app_what_new, app_content_rating, app_country, num_of_reviews

### 3. Check Scrape Status
- **Type:** HTTP Request (GET)
- **Purpose:** Monitors the progress of the scraping job
- **Endpoint:** https://api.brightdata.com/datasets/v3/progress/{snapshot_id}
- **Function:** Checks if the dataset scraping is complete

### 4. Wait for Response (45 sec)
- **Type:** Wait Node
- **Purpose:** Implements the polling mechanism
- **Duration:** 45 seconds
- **Function:** Pauses the workflow before checking status again

### 5. Verify Completion
- **Type:** IF Condition
- **Purpose:** Evaluates scraping completion status
- **Condition:** status === "ready"
- **Logic:**
  - True: Proceeds to fetch data
  - False: Loops back to status check

### 6. Fetch Scraped Data
- **Type:** HTTP Request (GET)
- **Purpose:** Retrieves the final scraped data
- **Endpoint:** https://api.brightdata.com/datasets/v3/snapshot/{snapshot_id}
- **Format:** JSON
- **Function:** Downloads completed review and app data

### 7. Save to Google Sheet
- **Type:** Google Sheets Node
- **Purpose:** Stores scraped data for analysis
- **Operation:** Append rows
- **Target:** Specified Google Sheet document
- **Data Mapping:** URL, Review ID, Reviewer Name, Review Date, Review Rating, Review Text, App Rating, App Number of Reviews, App What's New, App Country

### 8. Check Low Ratings
- **Type:** IF Condition
- **Purpose:** Identifies poor-performing apps
- **Condition:** review_rating < 4
- **Logic:**
  - True: Triggers alert notification
  - False: No action taken
### 9. Send Alert to Telegram
- **Type:** Telegram Node
- **Purpose:** Sends performance alerts
- **Message Format:**
  - Low App Performance Alert
  - App: {app_title}
  - Developer: {app_developer}
  - Rating: {app_rating}
  - Reviews: {app_number_of_reviews}
  - View on Play Store

## Workflow Flow

Input Form → Start Scraping → Check Status → Wait 45s → Verify Completion → (loop back to Check Status until ready) → Fetch Data → Save to Sheet & Check Ratings → Send Telegram Alert

## Configuration Requirements

### API Keys & Credentials
- **Bright Data API Key:** Required for web scraping
- **Google Sheets OAuth2:** For data storage access
- **Telegram Bot Token:** For alert notifications

### Setup Parameters
- **Google Sheet ID:** Target spreadsheet identifier
- **Telegram Chat ID:** Destination for alerts
- **n8n Instance ID:** Workflow instance identifier

## Key Features

### Data Collection
- Comprehensive app metadata extraction
- Review content and rating analysis
- Developer and country information
- App store performance metrics

### Quality Monitoring
- Automated low-rating detection
- Real-time performance alerts
- Continuous data archiving

### Integration Capabilities
- Bright Data web scraping service
- Google Sheets data persistence
- Telegram instant notifications
- Polling-based status monitoring

## Use Cases

### App Performance Monitoring
- Track rating trends over time
- Identify user sentiment patterns
- Monitor competitor performance

### Quality Assurance
- Early warning for rating drops
- Customer feedback analysis
- Market reputation management

### Business Intelligence
- Review sentiment analysis
- Performance benchmarking
- Strategic decision support

## Technical Notes
- **Polling Interval:** 45-second status checks
- **Rating Threshold:** Alerts triggered for ratings < 4
- **Data Format:** JSON with structured field mapping
- **Error Handling:** Includes error tracking in dataset requests
- **Result Limiting:** Maximum 5 multiple results per request

For any questions or support, please contact info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
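Condensed into plain code, the trigger-and-poll pattern from components 2 through 6 looks roughly like this. The endpoints, dataset ID, and 45-second interval come from the component descriptions above; the request body shape is an assumption to verify against Bright Data's docs:

```typescript
// Sketch: trigger a Bright Data dataset scrape, poll until ready, fetch results.
const TOKEN = process.env.BRIGHTDATA_API_KEY!;
const headers = { Authorization: `Bearer ${TOKEN}`, "Content-Type": "application/json" };

async function scrapeReviews(appUrl: string) {
  // Component 2: start the scraping request
  const trigger = await fetch(
    "https://api.brightdata.com/datasets/v3/trigger?dataset_id=gd_m6zagkt024uwvvwuyu&include_errors=true",
    { method: "POST", headers, body: JSON.stringify([{ url: appUrl }]) },
  );
  const { snapshot_id } = await trigger.json();

  // Components 3-5: poll every 45 seconds until the snapshot is ready
  while (true) {
    const progress = await fetch(
      `https://api.brightdata.com/datasets/v3/progress/${snapshot_id}`,
      { headers },
    ).then((r) => r.json());
    if (progress.status === "ready") break;
    await new Promise((resolve) => setTimeout(resolve, 45_000));
  }

  // Component 6: fetch the scraped data
  return fetch(
    `https://api.brightdata.com/datasets/v3/snapshot/${snapshot_id}?format=json`,
    { headers },
  ).then((r) => r.json());
}
```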
by PollupAI
## Who is this for?

This workflow is ideal for anyone focused on nutrition tracking, meal planning, or diet optimization, whether you're a health-conscious individual, a fitness coach, or a developer working on a healthtech app. It also fits anyone who wants to capture meal data via voice or text without manually entering everything into a spreadsheet.

## What problem is this workflow solving?

Manually logging meals and breaking down their nutritional content is time-consuming and often skipped. This workflow automates that process using Telegram for input, OpenAI for natural language understanding, and Google Sheets for structured tracking. It enables users to record meals by typing or sending voice messages, which are transcribed, analyzed for nutrients, and automatically stored for tracking and review.

## What this workflow does

This n8n automation lets users send either a text or voice message to a Telegram bot describing their meal. The workflow then:

1. Receives the Telegram message
2. Checks if it's a voice message
   - If yes: downloads the audio file and transcribes it using OpenAI
   - If no: uses the text input directly
3. Sends the meal description to OpenAI to extract a structured list of ingredients and nutritional details
4. Parses and stores the results in Google Sheets (see the sketch after this entry)
5. Responds via Telegram with a personalized confirmation message

A testing interface also allows you to simulate prompts and view structured outputs for development or debugging.

## Setup

1. Create a Telegram bot via BotFather and note the API token.
2. Create an empty Google Sheet and store the sheet ID in the environment.
3. Set up your OpenAI credentials in the n8n credential manager.
4. Customize the "List of Ingredients and Nutrients" node with your prompt if needed.
5. (Optional) Use the "Testing" section to simulate messages and refine outputs before going live.

## How to customize this workflow to your needs

- Enhance prompts in the OpenAI node to improve the structure and accuracy of responses.
- Add new fields in the Google Sheet and corresponding logic in the parser if you want more detail.
- Adjust the Telegram response to provide motivational feedback, dietary tips, or summaries.
- Upgrade to the "Pro" version mentioned in the contact section for USDA database integration and complete nutrient breakdowns.

This is a lightweight, AI-powered meal logging automation that transforms voice or text into actionable nutrition data, perfect for making healthy eating easier and more data-driven. See my other workflows here.
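The parsing step (step 4 above) depends entirely on the JSON shape your OpenAI prompt asks for. Here is a sketch of one plausible shape and how a Code node might flatten it into Google Sheets rows; the field names are illustrative assumptions, not the template's actual schema:

```typescript
// Sketch: flatten a structured meal analysis into one sheet row per ingredient.
// The Ingredient shape is hypothetical; match it to your own OpenAI prompt.
interface Ingredient {
  name: string;
  quantity_g: number;
  calories: number;
  protein_g: number;
  carbs_g: number;
  fat_g: number;
}

function toSheetRows(meal: { ingredients: Ingredient[] }, date: string) {
  return meal.ingredients.map((i) => ({
    Date: date,
    Ingredient: i.name,
    "Quantity (g)": i.quantity_g,
    Calories: i.calories,
    "Protein (g)": i.protein_g,
    "Carbs (g)": i.carbs_g,
    "Fat (g)": i.fat_g,
  }));
}
```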
by Billy Christi
## Who is this for?

This workflow is perfect for:

- **HR professionals** seeking to automate employee and department management
- **Startups and SMBs** that want an AI-powered HR assistant on Telegram
- **Internal operations teams** that want to simplify onboarding and employee data tracking

## What problem is this workflow solving?

Managing employee databases manually is error-prone and inefficient, especially for growing teams. This workflow solves that by:

- Enabling natural language-based HR operations directly through Telegram
- Automating the creation, retrieval, and deletion of employee records in Airtable
- Dynamically managing related data such as departments and job titles
- Handling data consistency and linking across relational tables automatically
- Providing a conversational interface backed by OpenAI for smart decision-making

## What this workflow does

Using Telegram as the interface and Airtable as the backend database, this intelligent HR workflow allows users to:

- Chat in natural language (e.g. "Show me all employees" or "Create employee: Sarah, Marketing...")
- Interpret and route requests via an AI Agent that acts as the orchestrator
- Query employee, department, and job title data from Airtable
- Create or update records as needed:
  - Add new departments and job titles automatically if they don't exist
  - Create new employees and link them to the correct department and job title (see the sketch after this entry)
- Delete employees based on ID
- Respond directly in Telegram, providing user-friendly feedback

## Setup

1. View & copy the Airtable base here: Employee Database Management - Airtable Base Template
2. **Telegram Bot**: Set up a Telegram bot and connect it to the Telegram Trigger node
3. **Airtable**: Prepare three Airtable tables:
   - Employees, with links to Departments and Job Titles
   - Departments, with Name & Description
   - Job Titles, with Title & Description
4. Connect your Airtable API key and base/table IDs in the appropriate Airtable nodes
5. Add your OpenAI API key to the AI Agent nodes
6. Deploy both workflows: the main chatbot workflow and the employee creation sub-workflow
7. Test with sample messages like:
   - "Create employee: John Doe, john@company.com, Engineering, Software Engineer"
   - "Remove employee ID rec123xyz"

## How to customize this workflow to your needs

- **Switch databases**: Replace Airtable with Notion, PostgreSQL, or Google Sheets if desired
- **Enhance security**: Add authentication and validation before allowing deletion
- **Add approval flows**: Integrate Telegram button-based approvals for sensitive actions
- **Multi-language support**: Expand system prompts to support multiple languages
- **Add logging**: Store every user action in a log table for auditability
- **Expand capabilities**: Integrate payroll, time tracking, or Slack notifications

## Extra Tips

- This is a two-workflow setup. Make sure the sub-workflow is deployed and accessible from the main agent.
- Use Simple Memory per chat ID to preserve context across user queries.
- You can expand the orchestration logic by adding more tools to the main agent, such as "Get active employees only" or "List employees by job title."
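To make the department and job-title linking concrete, here is a hedged sketch of the Airtable REST call the employee-creation sub-workflow effectively performs. The base ID, table name, and field names are placeholders for your own base; linked-record fields in Airtable take arrays of record IDs:

```typescript
// Sketch: create an employee linked to existing Department and Job Title records.
// BASE_ID and all table/field names are placeholders.
const AIRTABLE_TOKEN = process.env.AIRTABLE_TOKEN!;
const BASE_ID = "appXXXXXXXXXXXXXX"; // hypothetical

async function createEmployee(
  name: string,
  email: string,
  deptId: string,
  titleId: string,
) {
  const res = await fetch(`https://api.airtable.com/v0/${BASE_ID}/Employees`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${AIRTABLE_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      records: [
        {
          fields: {
            Name: name,
            Email: email,
            Department: [deptId], // linked-record fields take arrays of record IDs
            "Job Title": [titleId],
          },
        },
      ],
    }),
  });
  return res.json();
}
```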
by Guido X Jansen
## Introduction

**Manual LinkedIn data collection is time-consuming, error-prone, and results in inconsistent data quality across CRM/database records.**

This workflow is great for organizations that struggle with:

- Incomplete contact records with only LinkedIn URLs but missing profile details
- Hours spent manually copying LinkedIn information into databases
- Inconsistent data formats due to copy-paste from LinkedIn (emojis, styled text, special characters)
- Outdated profile information that doesn't reflect current roles/companies
- No systematic way to enrich contacts at scale

### Primary Users

- Sales & marketing teams
- Event organizers & conference managers (for event materials)
- Recruitment & HR professionals
- CRM administrators

### Specific Problems Addressed

- **Data Completeness**: Automatically fills missing profile fields (headline, bio, skills, experience)
- **Data Quality**: Sanitizes problematic characters that break databases/exports
- **Time Efficiency**: Reduces hours of manual data entry to automated monthly updates
- **Error Handling**: Gracefully manages invalid/deleted LinkedIn profiles
- **Scalability**: Processes multiple profiles in batch without manual intervention
- **Standardization**: Ensures consistent data format across all records

### Cost

Each URL scraped by Apify costs $0.01 to get all the data above. Apify charges per scrape, regardless of how much data or how many fields you extract/use.

## Setup Instructions

### Prerequisites

- **n8n Instance**: Access to a running n8n instance (self-hosted or cloud)
- **NocoDB Account**: Database with a table containing LinkedIn URLs
- **Apify Account**: Free or paid account for LinkedIn scraping

### Required fields in NocoDB table

Input: a single LinkedIn URL (NocoDB field name: LinkedIn)

Output: first/last/full name, e-mail, bio, headline, profile pic URL, current role, country, skills, current employer, employer URL, experiences (all previous jobs), personal website, publications (articles).

NocoDB field names:

- linkedin_full_name
- linkedin_first_name
- linkedin_headline
- linkedin_email
- linkedin_bio
- linkedin_profile_pic
- linkedin_current_role
- linkedin_current_company
- linkedin_country
- linkedin_skills
- linkedin_company_website
- linkedin_experiences
- linkedin_personal_website
- linkedin_publications
- linkedin_scrape_error_reason
- linkedin_scrape_last_attempt
- linkedin_scrape_status
- linkedin_last_modified

Technically you also need an Id field, but that is always there so no need to add it :)

### n8n Setup

#### 1. Import the Workflow
- Copy the workflow JSON from the template
- In n8n, click "Add workflow" → "Import from JSON"
- Paste the workflow and click "Import"

#### 2. Configure NocoDB Connection
- Click on any NocoDB node in the workflow
- Add new credentials → "NocoDB Token account"
- Enter your NocoDB API token (found in NocoDB → User Settings → API Tokens)
- Update the projectId and table parameters in all NocoDB nodes

#### 3. Set Up Apify Integration
- Create an Apify account at apify.com
- Generate an API token (Settings → Integrations → API)
- In the workflow, update the Apify token in the "Get Scraper Results" node
- Configure HTTP Query Auth credentials with your token

#### 4. Map Your Database Fields
- Review the "Transform & Sanitize Data" node (a sketch of a sanitizer follows this entry)
- Update field mappings to match your NocoDB table structure
- Ensure these fields exist in your table:
  - LinkedIn (URL field)
  - linkedin_headline, linkedin_full_name, linkedin_bio, etc.
  - linkedin_scrape_status, linkedin_last_modified

#### 5. Configure the Filter
- In the "Get Guests with LinkedIn" node, adjust the filter to match your requirements
- Default: (LinkedIn,isnot,null)~and(linkedin_headline,is,null)
#### 6. Test the Workflow
- Click "Execute Workflow" with the Manual Trigger
- Monitor execution for any errors
- Verify data is properly updated in NocoDB

#### 7. Activate Automated Schedule
- Configure the Schedule Trigger node (default: monthly)
- Toggle the workflow to "Active"
- Monitor executions in the n8n dashboard

## Customization Options

### 1. Data Source Modifications
- **Different Database**: Replace NocoDB nodes with Airtable, Google Sheets, or PostgreSQL
- **Multiple Tables**: Add parallel branches to process different contact tables
- **Custom Filters**: Modify the WHERE clause to target specific record subsets

### 2. Enrichment Fields
- **Add Fields**: Include additional LinkedIn data like education, certifications, or recommendations
- **Remove Fields**: Simplify by removing unnecessary fields (publications, skills)
- **Custom Transformations**: Add business logic for field calculations or formatting

### 3. Scheduling Options
- **Frequency**: Change from monthly to daily, weekly, or hourly
- **Time-based**: Set specific times for different timezones
- **Event-triggered**: Replace with a webhook trigger for on-demand processing

### 4. Error Handling Enhancement
- **Notifications**: Add email/Slack nodes to alert on failures
- **Retry Logic**: Implement wait and retry for temporary failures
- **Logging**: Add database logging for audit trails

### 5. Data Quality Rules
- **Validation**: Add IF nodes to validate data before updates
- **Duplicate Detection**: Check for existing records before creating new ones
- **Data Standardization**: Add custom sanitization rules for industry-specific needs

### 6. Integration Extensions
- **CRM Sync**: Add nodes to push data to Salesforce, HubSpot, or Pipedrive
- **AI Enhancement**: Use OpenAI to summarize bios or extract key skills
- **Image Processing**: Download and store profile pictures locally

### 7. Performance Optimization
- **Batch Size**: Adjust the number of profiles processed per run
- **Rate Limiting**: Add delays between API calls to avoid limits
- **Parallel Processing**: Split large datasets across multiple workflow executions

### 8. Compliance Additions
- **GDPR Compliance**: Add consent checking before processing
- **Data Retention**: Implement automatic cleanup of old records
- **Audit Logging**: Track who accessed what data and when

These customizations allow the workflow to adapt from simple contact enrichment to complex data pipeline scenarios across various industries and use cases.
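As an illustration of what the "Transform & Sanitize Data" node has to deal with, here is a minimal sanitizer sketch for LinkedIn text. NFKC normalization folds "styled" Unicode letters back to plain ASCII where possible, and the emoji-stripping regex relies on Unicode property escapes; treat it as a starting point, not the node's actual code:

```typescript
// Sketch: sanitize LinkedIn profile text before writing it to NocoDB.
// NFKC folds styled letters (e.g. mathematical bold) back to plain characters.
export function sanitizeLinkedInText(raw: string): string {
  return raw
    .normalize("NFKC")
    .replace(/\p{Extended_Pictographic}/gu, "") // strip emoji
    .replace(/[\u0000-\u0008\u000B-\u001F\u007F]/g, "") // strip control characters
    .replace(/\s+/g, " ") // collapse runs of whitespace
    .trim();
}
```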
by IvanCore
Disclaimer: This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Important distinction: This template manages Telegram Copilot's UserBots (client accounts), not Telegram Bots.

## UserBot vs. Bot: Key Differences

**Telegram Copilot's UserBots**
- Authenticate as real user accounts (phone number required)
- Can join groups/channels without the "Bot" label
- Subject to Telegram's client API limits
- Require manual login (MFA supported)

**Telegram Bots**
- Use @BotFather-created tokens
- Limited to bot API functionality
- Can't initiate chats with users who haven't contacted them first
- No phone number required

This template solves the unique challenges of UserBot management through:

## Core Functionality

**Session Reliability**
- Automatic crash recovery (5-step restart sequence)
- Persistent session monitoring (checks every 6h)
- Database cleanup via the /clear command

**Multi-Device Support**
- Manages sessions independently from mobile clients
- Tracks active devices via the /stat command
- Isolates session data per credential

**Smart Notifications**
- Real-time alerts to the admin chat
- Detailed error context with authState snapshots
- Success confirmations with session metadata

## Setup Guide

### Prerequisites
1. Self-hosted n8n instance (community node required)
2. Valid Telegram account for the UserBot
3. Telegram bot token for notifications
4. TelePilot credentials with api_id/api_hash

### Configuration Steps
1. **Credential Setup**
   - Add TelePilot credentials in n8n
   - Configure the Telegram bot token in the notification nodes
   - Set the admin chat ID for alerts
2. **Monitoring Customization**
   - Adjust check frequency in the Schedule Trigger
   - Modify alert thresholds in the Filter nodes
   - Configure retry logic in the recovery sequence
3. **Session Management**
   - Test the /start command flow
   - Verify the /stat output format
   - Confirm notification delivery

## Workflow Customization

Advanced options:
- Add secondary notification channels (email, Slack)
- Implement an escalating alert system
- Integrate with monitoring dashboards
- Customize recovery attempt limits

## Compliance Notes
- UserBots must comply with Telegram's Terms of Service
- Not intended for bulk messaging or spam
- Recommended for legitimate automation use cases

Note: UserBots must comply with Telegram ToS. Not for spam/mass messaging.

Why this matters: UserBots enable automation scenarios impossible with regular bots (e.g., group management as a normal user, reacting as a human account). This workflow keeps them reliably online 24/7.
by Jimleuk
This n8n template lets you summarize individual team member activity on MS Teams for the past week and generates a report.

For remote teams, chat is a crucial communication tool for getting work done, but with so many conversations happening at once and in multiple threads, ideas, information, and decisions live in the moment, get lost just as quickly, and are all but forgotten by the weekend! Using this template, this doesn't have to be the case. Have AI crawl through last week's activity, summarize all messages and replies, and generate a casual, snappy report to bring the team back into focus for the current week. A project manager's dream!

## How it works

1. A scheduled trigger runs every Monday at 6am to gather all team channel messages from the last week.
2. Messages are grouped by user (see the sketch after this entry).
3. AI analyses the raw messages and replies to pull out interesting observations and highlights. These are referred to as the individual reports.
4. All individual reports are then combined and summarized together into what becomes the team weekly report. This allows understanding of group and similar activities.
5. Finally, the team weekly report is posted back to the channel. The timing is important, as it should be the first message of the week, ready for the team to glance over coffee.

## How to use

- Ideally works best per project and where most of the comms happen in a single channel. Avoid combining channels; instead, duplicate this workflow for additional channels.
- You may need to filter for specific team members if you want specific team updates.
- Customise the report to suit your organisation, team, or channel. You may prefer to be more formal if clients or external stakeholders are also present.

## Requirements

- MS Teams for the chat platform
- OpenAI for the LLM

## Customising this workflow

- If the Teams channel is busy enough already, consider posting the final report to email.
- Pull in project metrics to include in your report. As extra context, it may be interesting to tie the messages to production performance.
- Use an AI Agent to query a knowledgebase or tickets relevant to the messages. This may be useful for attaching links or references to add context.
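The group-by-user step (step 2) is a small reduce over the fetched messages. A sketch, assuming the Graph-style chatMessage shape that MS Teams nodes typically return (from.user.displayName); verify the field path against your own node output:

```typescript
// Sketch: group raw Teams channel messages by author display name.
interface TeamsMessage {
  from?: { user?: { displayName?: string } };
  body: { content: string };
}

function groupByUser(messages: TeamsMessage[]): Record<string, string[]> {
  return messages.reduce<Record<string, string[]>>((acc, msg) => {
    const user = msg.from?.user?.displayName ?? "Unknown";
    (acc[user] ??= []).push(msg.body.content); // append message text per user
    return acc;
  }, {});
}
```

Each user's bucket then becomes the raw input for that person's individual report.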
by Nikan Noorafkan
# AI-Powered Content Marketing Research Tool

> Transform your content strategy with automated competitor intelligence

## What It Does

Never miss a competitor move again. This workflow automatically:

- Monitors competitor content across multiple domains
- Tracks trending keywords by region
- Extracts audience pain points from Reddit & forums
- Generates AI strategy recommendations via OpenAI
- Outputs to Airtable, Notion & Slack for instant action

## Perfect For

- **Growth marketers** tracking competitor strategies
- **Content teams** discovering trending topics
- **SEO specialists** finding keyword opportunities
- **Marketing agencies** managing multiple clients

## Technical Setup

### Required APIs & Credentials

| Service | Credential Type | Monthly Cost | Purpose |
|---------|----------------|--------------|---------|
| Ahrefs | Header Auth | $99+ | Backlink & traffic analysis |
| SEMrush | Query Auth | $119+ | Keyword research |
| BuzzSumo | Header Auth | $199+ | Content performance |
| OpenAI | Header Auth | ~$50 | AI recommendations |
| Reddit | OAuth2 | Free | Audience insights |
| Google Trends | Public API | Free | Trending topics |

### Database Schema

Airtable Base: content-research-base

Table 1: competitor-intelligence
- timestamp (Date)
- domain (Single line text)
- traffic_estimate (Number)
- backlinks (Number)
- content_gaps (Long text)
- publishing_frequency (Single line text)

Table 2: keyword-opportunities
- timestamp (Date)
- trending_keywords (Long text)
- top_questions (Long text)
- content_opportunities (Long text)

## Quick Start Guide

### Step 1: Import & Configure
1. Import the workflow JSON
2. Update competitor domains in the Configuration Settings
3. Map all API credentials

### Step 2: Setup Storage
- **Airtable:** Create a base with the exact schema above
- **Notion:** Create a database with the properties listed
- **Slack:** Create a #content-research-alerts channel

### Step 3: Test & Deploy
The first run populates:
- Airtable tables with competitor data
- Notion database with AI insights
- Slack channel with formatted alerts

## Example Output

### AI Recommendations Format

```json
{
  "action_items": [
    {
      "topic": "Copy trading explainer",
      "format": "Video",
      "region": "UK",
      "priority": "High"
    }
  ],
  "publishing_calendar": [
    { "week": "W34", "posts": 3 }
  ],
  "alerts": [
    "eToro gained 8 .edu backlinks this week"
  ]
}
```

### Slack Alert Preview

Content Research Alert

Top Findings:
- Sustainable packaging solutions
- Circular economy trends
- Eco-friendly manufacturing

Trending Keywords:
- forex trading basics (+45%)
- social trading platforms (+32%)
- copy trading strategies (+28%)

AI Recommendations: Focus on educational content in UK market...
## Advanced Features

### Data Quality Validation
- **Automatic retry** for failed API calls (see the sketch after this entry)
- **Data validation** before storage
- **Error notifications** via Slack

### Scalability Options
- **Multi-region support** (US, UK, DE, FR, JP)
- **Batch processing** for large competitor lists
- **Rate limiting** to respect API quotas

### Customization Ready
- **Modular design**: disable unused APIs
- **Industry templates**: forex, ecommerce, SaaS
- **Custom scoring** algorithms

## ROI & Performance

### Cost Analysis
- **Setup time:** ~2 hours
- **Monthly API costs:** $400-500
- **Time saved:** 15+ hours/week
- **ROI:** 300%+ within the first month

### Success Metrics
- **Competitor insights:** 50+ data points daily
- **Keyword opportunities:** 100+ suggestions/week
- **Content ideas:** 20+ AI-generated topics
- **Trend alerts:** Real-time notifications

## Troubleshooting

### Common Issues & Solutions

| Symptom | Cause | Fix |
|-------------|-----------|---------|
| OpenAI timeout | Large data payload | Reduce batch size, split processing |
| Airtable 422 error | Field mismatch | Copy the schema exactly |
| Reddit 401 | OAuth expired | Re-authorize the application |

### Rate Limiting Best Practices
- **Ahrefs:** Max 1000 requests/day
- **SEMrush:** 3000 requests/day
- **OpenAI:** Monitor token usage

## Why Choose This Template?

> "From manual research to automated intelligence in 15 minutes"

- Production-ready: no additional coding required
- Cost-optimized: uses free tiers where possible
- Scalable: add competitors with one click
- Actionable: AI outputs ready for immediate use
- Community-tested: 500+ successful deployments

Start your competitive intelligence today!

Built with ❤️ for the n8n community
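The "automatic retry for failed API calls" above can be as simple as an exponential-backoff wrapper around each HTTP call. A generic sketch (the attempt count and delays are illustrative defaults):

```typescript
// Sketch: retry an async API call with exponential backoff.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  for (let i = 0; ; i++) {
    try {
      return await fn();
    } catch (err) {
      if (i >= attempts - 1) throw err; // give up after the last attempt
      await new Promise((r) => setTimeout(r, 2 ** i * 1000)); // 1s, 2s, 4s...
    }
  }
}

// Usage: const data = await withRetry(() => fetch(url).then((r) => r.json()));
```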
by Romain Jouhannet
# Linear Project/Issue Status and End Date to Productboard Feature Sync

Sync project and issue data between Linear and Productboard to keep teams aligned. This workflow updates Productboard features with the status and end date from Linear projects, or the due date from Linear issues. It ensures consistent data and sends a Slack notification whenever changes are made.

## Features

- Listens for updates in Linear projects/issues.
- Maps Linear statuses to Productboard feature statuses (see the sketch after this entry).
- Updates Productboard feature details, including the timeframe.
- Sends a Slack notification summarizing the updates.

## Setup

1. **Linear credentials**: Add your Linear API credentials in n8n.
2. **Productboard credentials**: Configure the Productboard API credentials in n8n.
3. **Linear projects or issues**: Select the Linear project(s) or issue(s) you want to monitor for updates.
4. **Productboard custom field**:
   - Create a custom field in Productboard named "Linear". This field should store the URL of the Linear project or issue you want to sync.
   - Retrieve the UUID of the custom field in Productboard and set it up in the "Get Productboard Feature ID" node.
5. **Slack notification**: Update the Slack node with the desired Slack channel ID.
6. **Activate the workflow**: Enable the workflow to automatically sync data when triggered by updates in Linear.
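The status mapping can be a simple lookup table in a Code node. Linear's standard project states are used on the left; the Productboard status names on the right are workspace-specific placeholders, so substitute your own:

```typescript
// Sketch: map Linear project states to Productboard feature statuses.
// Right-hand values are placeholders; use your own Productboard status names.
const statusMap: Record<string, string> = {
  planned: "Candidate",
  started: "In progress",
  paused: "On hold",
  completed: "Released",
  canceled: "Won't do",
};

const toProductboardStatus = (linearState: string): string =>
  statusMap[linearState] ?? "Candidate"; // fallback for unmapped states
```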
by Frank Chen
Automatically fetch existing domains from a Notion database and verify the validity of their SSL certificates through SSL-Checker. If the validity period is less than 14 days, send a Telegram message notification and trigger an automatic refresh over SSH. A success notification is sent through Telegram after the refresh. This guards against failures in the server-side auto-renewal program, which can cause unexpected service interruptions.

Main use cases:

- Notion stores the domains.
- Telegram receives warning messages.
- Remotely trigger Certbot to refresh SSL certificates.

How it works:

1. Record who triggered this workflow, because the workflow re-invokes itself whenever a certificate is about to expire.
2. After getting all the domains from Notion, send an HTTP request to SSL-Checker for each one.
3. Once all the SSL-Checker results are in, add a validity label and use the IF node to check whether any certificates are about to expire.
4. From there, the workflow branches:
   - If a certificate is about to expire: send an SSH command to the remote server to refresh the certificate, notify via Telegram, and call this workflow again to re-verify the SSL certificate's validity.
   - If the SSL validity period is normal: refresh the data in Notion, and if this run was a re-invocation, notify via Telegram that the SSL certificate has been updated.
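The validity-label step boils down to computing the days remaining from the checker's expiry date. A sketch under the assumption that the SSL-Checker response exposes an expiry date string; the `valid_till` field name is an assumption, so map it to whatever your HTTP Request node actually returns:

```typescript
// Sketch: label a domain by the days remaining on its certificate.
// The `valid_till` field name is an assumption about the SSL-Checker response.
const THRESHOLD_DAYS = 14;

function validityLabel(result: { host: string; valid_till: string }) {
  const msLeft = new Date(result.valid_till).getTime() - Date.now();
  const daysLeft = Math.floor(msLeft / 86_400_000); // ms per day
  return {
    host: result.host,
    daysLeft,
    needsRefresh: daysLeft < THRESHOLD_DAYS, // feeds the SSH/Certbot branch
  };
}
```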