by Daniel Shashko
This workflow automates daily monitoring of how an AI model (such as ChatGPT) responds to queries relevant to your market. It identifies mentions of your brand and predefined competitors, logs detailed interactions in Google Sheets, and delivers a comprehensive email report.

Main Use Cases
- Monitor how your brand is mentioned by AI in response to relevant user queries.
- Track mentions of key competitors to understand the AI's comparative positioning.
- Gain insight into the AI's current knowledge and portrayal of your brand and market landscape.
- Automate daily intelligence gathering on AI-driven brand perception.

How it works
The workflow runs as a scheduled process, organized into these stages:

Configuration & Scheduling
- Triggers daily (or can be run manually).
- Key variables are defined within the workflow: your brand name (e.g., "YourBrandName"), a list of queries to ask the AI, and a list of competitor names to track in responses.

AI Querying
- For each predefined query, the workflow sends a request to the OpenAI ChatGPT API (via an HTTP Request node).

Response Analysis
Each AI response is processed by a Code node (a hedged sketch of this logic appears at the end of this description) to:
- Check whether your brand name is mentioned (case-insensitive).
- Identify whether any of the listed competitors are mentioned (case-insensitive).
- Extract the core AI response content (limited to 500 characters for brevity in logs and reports).

Data Logging to Google Sheets
- Detailed results for each query are appended to a specified Google Sheet, including timestamp, date, the query itself, query index, your brand name, the AI's response, whether your brand was mentioned, and any errors.

Email Report Generation
A comprehensive HTML email report is compiled, summarizing:
- Total queries processed, the number of times your brand was mentioned, total competitor mentions, and any errors encountered.
- A summary of competitor mentions, listing each competitor and how many times they were mentioned.
- A detailed table listing each query, whether your brand was mentioned, and which competitors (if any) appeared in the AI's response.

Automated Reporting
- The generated HTML email report is sent to specified recipients, providing a daily snapshot of AI interactions.

Summary Flow:
Schedule/Workflow Trigger → Initialize Brand, Queries, Competitors (in Code node) → For each Query: Query ChatGPT API → Process AI Response (Check for Brand & Competitor Mentions) → Log Results to Google Sheets → Generate Consolidated HTML Email Report → Send Email Notification

Benefits:
- Fully automated daily monitoring of AI responses concerning your brand and competitors.
- Objective insight into how AI models represent your brand in user interactions.
- Actionable competitive intelligence by tracking competitor mentions.
- Centralized logging in Google Sheets for historical analysis and trend spotting.
- Easily customizable with your specific brand, queries, competitor list, and reporting recipients.
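The Response Analysis stage can be reproduced in an n8n Code node. The sketch below is a minimal, hedged example of that logic; the field names (`query`, `brandName`, `competitors`) and the shape of the OpenAI response are assumptions and must match your own workflow configuration.

```javascript
// Minimal sketch of the Response Analysis step for an n8n Code node (JavaScript).
// Field names such as `query`, `brandName`, `competitors`, and the OpenAI response
// shape are assumptions; adjust them to your own workflow configuration.
const results = [];

for (const item of $input.all()) {
  const { query, brandName, competitors } = item.json;
  // OpenAI chat completion responses usually expose the text at choices[0].message.content
  const responseText = item.json.choices?.[0]?.message?.content ?? '';
  const lowerResponse = responseText.toLowerCase();

  // Case-insensitive brand check
  const brandMentioned = lowerResponse.includes(brandName.toLowerCase());

  // Case-insensitive competitor check
  const competitorsMentioned = (competitors || []).filter(
    (name) => lowerResponse.includes(name.toLowerCase())
  );

  results.push({
    json: {
      timestamp: new Date().toISOString(),
      query,
      brandName,
      brandMentioned,
      competitorsMentioned,
      // Trim the response for readable logs and reports
      responseExcerpt: responseText.slice(0, 500),
    },
  });
}

return results;
```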
by Sidetool
Hello there! This is a supporting workflow for an Airtable Base that handles Recurring Tasks. Its objective is to create tasks on a recurring schedule based on the Airtable setup. You can access the Airtable template for complete context on Airtable Universe. The workflow can be easily adapted to any data source. Feel free to contact us with any doubts or questions at http://sidetool.co. Use this as is, or adapt it to your existing Airtable Base – embrace automated simplicity! 🚀🌟
by n8n Team
This workflow syncs data between Notion and Asana whenever a new task is created or an existing task is updated in one of the apps.

Prerequisites
- Asana account and Asana credentials
- Notion account and Notion credentials

How it works
1. Go to your Asana account and create a new task.
2. A new task appears in your Notion account.
3. Update the task in Asana.
4. The task is updated in Notion.
by Nasser
For Who?
- Content creators
- YouTube automation
- Marketing teams

How it works?
1. Enter the ID of the YouTube channel to trigger the workflow when a new video is posted.
2. Apify scrapes the latest video of the channel.
3. The workflow waits until the dataset is completed in Apify, then retrieves it.
4. It verifies that metadata has not already been generated and generates it with an LLM (a hedged sketch of this check appears at the end of this description).
5. All generated data is formatted and the YouTube video is updated.

📺 YouTube Video Tutorial:

SETUP
Setup Input YouTube Channel: Go to the channel's page on YouTube and look at the URL of the page. The channel ID is the value that comes after channel/ in the URL. Add it after "?channel_id=". You can also use free tools to retrieve a channel ID.

Setup Output YouTube Video Update: Connect your YouTube account to your n8n instance via the Google Cloud Console. You can find tutorials by searching "youtube api OAuth" on Google.

APIs: For the following third-party integrations, replace ==[YOUR_API_TOKEN]== with your API token or connect your account via Client ID / Secret to your n8n instance:
- Apify: https://docs.apify.com/api/v2/getting-started
- YouTube: https://docs.n8n.io/integrations/builtin/app-nodes/n8n-nodes-base.youtube/?utm_source=n8n_app&utm_medium=node_settings_modal-credential_link&utm_campaign=n8n-nodes-base.youTube#templates-and-examples

👨💻 More Workflows: https://n8n.io/creators/nasser/
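Step 4 can be implemented with a Code node followed by an IF node. Below is a minimal JavaScript sketch of that check; the field names `title`, `description`, and `tags` are assumptions about the Apify dataset item and should be adjusted to the actual schema your scraper returns.

```javascript
// Hedged sketch for the "verify metadata not already generated" check (n8n Code node).
// The field names `title`, `description`, and `tags` are assumptions about the Apify
// dataset item; adjust them to the actual schema returned by your scraper.
return $input.all().map((item) => {
  const { title = '', description = '', tags = [] } = item.json;

  // Treat metadata as "already generated" only if all three fields look filled in.
  const hasMetadata =
    title.trim().length > 0 &&
    description.trim().length > 0 &&
    Array.isArray(tags) &&
    tags.length > 0;

  return {
    json: {
      ...item.json,
      needsMetadata: !hasMetadata, // route on this flag in a following IF node
    },
  };
});
```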
by n8n Team
This workflow syncs your GitHub issues to your Notion database. Whenever a new issue is opened in your GitHub repository, it is mirrored in your Notion database, keeping the status property in sync (opened/edited/closed/deleted). If no Notion database exists yet, a new one is created automatically.

Prerequisites
- Notion account and Notion credentials
- GitHub account and GitHub credentials

How it works
- The GitHub Trigger node starts the workflow on issue events in a GitHub repository.
- An If node splits the workflow conditionally, depending on whether the issue is new or an update to an existing issue.
- If the issue is new, the Notion node creates a new database page in Notion.
- If it is not new, the Function node builds a Notion filter that finds the corresponding database page by issue ID (see the sketch below).
- A Switch node then conditionally routes the data to the appropriate Notion page update, based on the type of change that was made.
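The Function node step that looks up an existing page by issue ID boils down to a Notion database query filter. The sketch below shows what building such a filter might look like, assuming the database stores the GitHub issue ID in a number property named "Issue ID" (both the property name and its type are assumptions about your setup).

```javascript
// Hedged sketch: build a Notion database query filter by GitHub issue ID
// (n8n Function/Code node). The property name "Issue ID" and its `number` type
// are assumptions; match them to your actual Notion database schema.
return $input.all().map((item) => {
  const issueId = item.json.issue?.id ?? item.json.id;

  const notionFilter = {
    property: 'Issue ID',
    number: {
      equals: issueId,
    },
  };

  return {
    json: {
      ...item.json,
      // Pass the filter as JSON so a following Notion / HTTP Request node can use it
      notionFilter: JSON.stringify(notionFilter),
    },
  };
});
```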
by n8n Team
This workflow reads the textual content of PDF attachments and sends the text to OpenAI. Attachments of interest are then uploaded to a specified Google Drive folder. For example, you may wish to send invoices received by email to an inbox folder in Google Drive for later processing. The workflow is designed so you can easily change the search term to match your needs; see the workflow for more details.

Prerequisites
- OpenAI credentials
- Google credentials

How it works
- Starts from the On email received trigger node.
- Iterates over the attachments in the email.
- Uses the OpenAI node to filter out attachments that do not match the search term set in the Configure node (see the sketch below). You could match on various kinds of PDF files (e.g., invoice, receipt, or contract).
- If a PDF attachment matches the search term, the Google Drive node uploads it to a specific Google Drive folder.
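One way to frame the OpenAI filtering step is as a simple yes/no classification over the extracted PDF text. The sketch below, for an n8n Code node placed before the OpenAI call, builds such a prompt; the field names `pdfText` and `searchTerm` are assumptions about what the preceding extraction and Configure nodes provide, and the template's own prompt may differ.

```javascript
// Hedged sketch: build a yes/no classification prompt for the OpenAI step.
// `pdfText` and `searchTerm` are assumed field names coming from the PDF
// extraction and Configure nodes; adjust them to your workflow's actual output.
return $input.all().map((item) => {
  const { pdfText = '', searchTerm = 'invoice' } = item.json;

  const prompt = [
    `Does the following document appear to be a "${searchTerm}"?`,
    'Answer with exactly one word: YES or NO.',
    '---',
    // Keep the prompt small; the first part of the text is usually enough to classify.
    pdfText.slice(0, 2000),
  ].join('\n');

  return { json: { ...item.json, prompt } };
});
```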
by Adam Janes
How it works
- The workflow loads a list of test cases from a Google Sheet (previously stored LLM results).
- For each test case, it calls an LLM judge in parallel (using HTTP Request + Webhook nodes).
- The judge uses the Input, Output, and Reference Answer fields from the spreadsheet to mark each LLM response as Pass/Fail (a sketch of such a judge request appears below).
- The results are logged into a separate sheet in the same Sheets file.

Set up steps
- Add your credentials for Google Sheets and OpenRouter (or replace the OpenRouter node with your favourite chat model).
- Make a copy of the example Sheet and populate it with your own test data.
- Run the workflow with the Execute Workflow button next to the Manual Trigger node.
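A judge call of this kind typically boils down to a prompt that presents the Input, Output, and Reference Answer and asks for a verdict. The JavaScript sketch below shows one way to build an OpenRouter chat-completion request body in a Code node; the model name and the exact column names are assumptions, not the template's configuration.

```javascript
// Hedged sketch: build an OpenRouter chat-completion request body for the LLM judge.
// The model id and the `Input` / `Output` / `Reference Answer` field names are
// assumptions; align them with your sheet columns and preferred model.
return $input.all().map((item) => {
  const { Input, Output, 'Reference Answer': reference } = item.json;

  const body = {
    model: 'openai/gpt-4o-mini', // any OpenRouter model id works here
    messages: [
      {
        role: 'system',
        content:
          'You are a strict grader. Compare the model output to the reference answer ' +
          'and reply with exactly PASS or FAIL.',
      },
      {
        role: 'user',
        content: `Input:\n${Input}\n\nModel output:\n${Output}\n\nReference answer:\n${reference}`,
      },
    ],
  };

  return { json: { requestBody: body } };
});
```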
by Yosua Surojo
Who it's for
This workflow is for anyone who wants to build an automated, AI-enhanced reading list. Ideal for:
- Knowledge workers and researchers who collect and organize articles
- Students managing study materials
- Productivity hackers who use Telegram and Notion for personal knowledge management
- Anyone using the AI-Enhanced Knowledge Base Tracker Notion Template

How it works
This workflow takes any article link sent to your Telegram bot and automatically:
- Parses the article into a clean title and body
- Uses OpenAI to generate a 1–2 sentence highlight and topic tag
- Saves it into your Notion database
- Sends a confirmation message with the highlight and Notion link back to Telegram

Main steps:
1. Telegram Trigger - Listens for incoming messages containing an article link (a small sketch of extracting the link from the message appears below).
2. Fetch Article Title & Content - Calls the article-parser-api deployed on Vercel to fetch and parse the article content into structured JSON (title and content).
3. Generate Highlight + Tag (AI Agent) - Processes the parsed content to generate Highlight and Type tag values.
4. Structured Metadata for Notion - Adjusts the extracted data before saving it to Notion.
5. Save Article to Notion Database - Inserts the article and generated metadata into your Notion knowledge base.
6. Confirm Save via Telegram - Sends a confirmation message and the Notion page link back to the Telegram bot chat after the entry is created.

Setup
1. Create and connect your API credentials:
   - Telegram Bot
   - OpenAI API Key
   - Notion Integration
2. Deploy the article parser:
   - Use this repo: article-parser-api
   - Deploy it to Vercel or any serverless environment
3. Link your Notion database:
   - Duplicate the AI‑Enhanced Knowledge Base Tracker
   - Copy the database URL and connect it in the Notion node
4. Test your workflow:
   - Click Execute workflow
   - Send an article link to your Telegram bot
   - Once verified, activate the workflow so it runs automatically

Requirements
- Telegram bot token
- OpenAI API key
- Notion integration and shared database
- A deployed article parser (e.g., article-parser-api)

Optional customization
- Edit the AI Agent prompt to change tone or tagging style
- Add filtering or additional fields in the Edit Fields node
- Trigger from other sources (e.g., Slack or Email)
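Before the parser can be called, the article URL has to be pulled out of the incoming Telegram message. The sketch below is a minimal Code-node example of that extraction; the `message.text` path follows Telegram's update payload format, while the output field name `articleUrl` is an arbitrary choice for downstream nodes.

```javascript
// Hedged sketch: extract the first URL from an incoming Telegram message
// (n8n Code node). `message.text` follows Telegram's update payload; the
// output field name `articleUrl` is an arbitrary choice for downstream nodes.
return $input.all().map((item) => {
  const text = item.json.message?.text ?? '';

  // Grab the first http(s) link in the message, if any.
  const match = text.match(/https?:\/\/\S+/);

  return {
    json: {
      articleUrl: match ? match[0] : null,
      chatId: item.json.message?.chat?.id, // needed later for the confirmation reply
    },
  };
});
```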
by Yaron Been
Workflow Overview
This advanced n8n automation is a powerful channel research and intelligence gathering tool designed to transform raw YouTube channel data into actionable insights. By intelligently connecting multiple APIs and data sources, this workflow:

Discovers Channel Metrics:
- Automatically retrieves channel statistics
- Captures detailed performance indicators
- Provides comprehensive channel intelligence

Performs Deep Analysis:
- Extracts recent video performance data
- Calculates engagement metrics
- Aggregates view count insights

Uncovers Contact Information:
- Attempts to retrieve public email addresses
- Provides direct outreach opportunities
- Enhances lead generation capabilities

Seamless Data Logging:
- Automatically updates Google Sheets
- Maintains a live intelligence dashboard
- Preserves historical channel data

Key Benefits
- 🤖 Full Automation: Continuous channel intelligence gathering
- 💡 Smart Analysis: Comprehensive performance insights
- 📊 Real-Time Tracking: Always-updated channel metrics
- 🔍 Lead Generation: Direct contact information extraction

Workflow Architecture

🔹 Stage 1: Channel Identification
- **Google Sheets Trigger**: Detects new channel additions
- **YouTube Data API**: Fetches channel statistics
- **Comprehensive Metric Collection**: Subscriber count, total view metrics, channel overview

🔹 Stage 2: Video Performance Analysis
- **Recent Video Retrieval**: Fetches 5 latest uploads
- **View Count Aggregation**: Calculates total recent views and provides an engagement snapshot (a sketch of this aggregation follows at the end of this description)
- **Performance Insights**: Measures content effectiveness

🔹 Stage 3: Contact Discovery
- **SerpAPI Integration**: Attempts email extraction
- **Public Contact Information**: Retrieves available email addresses; supports outreach and networking

🔹 Stage 4: Data Compilation
- **Intelligent Data Formatting**
- **Google Sheets Update**
- **Live Intelligence Dashboard**

Potential Use Cases
- **Marketing Teams**: Influencer research
- **Sales Professionals**: Lead qualification
- **Content Strategists**: Competitive analysis
- **Recruitment Specialists**: Talent scouting
- **Business Development**: Partnership identification

Setup Requirements
- **YouTube Data API**: Google Cloud API credentials, configured API access
- **SerpAPI Account**: API key for email extraction, web scraping permissions
- **Google Sheets**: Connected Google account, prepared tracking spreadsheet, appropriate sharing settings
- **n8n Installation**: Cloud or self-hosted instance, workflow configuration, API credential management

Future Enhancement Suggestions
- 🤖 AI-powered channel scoring
- 📊 Advanced trend analysis
- 🔔 Automated alert system
- 🌐 Multi-platform channel tracking
- 🧠 Machine learning insights generation

Technical Considerations
- Implement robust error handling
- Use exponential backoff for API calls
- Maintain flexible data extraction strategies
- Ensure compliance with platform terms of service

Ethical Guidelines
- Respect content creator privacy
- Use data for legitimate research
- Maintain transparent data collection practices
- Provide opt-out mechanisms

Connect With Me
Ready to unlock YouTube channel insights?
📧 Email: Yaron@nofluff.online
🎥 YouTube: @YaronBeen
💼 LinkedIn: Yaron Been

Transform your channel research with intelligent, automated workflows!
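Following up on the Stage 2 note above: the view-count aggregation can be done in a small Code node over the YouTube Data API response for the latest uploads. The sketch below assumes each incoming item is a standard video resource exposing `statistics.viewCount`; the output field names are arbitrary choices, not the template's exact schema.

```javascript
// Hedged sketch: aggregate view counts of the 5 latest uploads (n8n Code node).
// Assumes each incoming item is a YouTube Data API video resource with
// `statistics.viewCount`; the output field names are arbitrary choices.
const items = $input.all();

let totalRecentViews = 0;
for (const item of items) {
  // viewCount is returned as a string by the YouTube Data API
  totalRecentViews += parseInt(item.json.statistics?.viewCount ?? '0', 10);
}

const videosCounted = items.length;

return [
  {
    json: {
      videosCounted,
      totalRecentViews,
      averageViewsPerVideo: videosCounted
        ? Math.round(totalRecentViews / videosCounted)
        : 0,
    },
  },
];
```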
by Corentin Ribeyre
This template can be used to verify an email address with Icypeas. Be sure to have an active account to use this template.

How it works
The workflow can be divided into three steps:
1. It starts with a manual trigger (On clicking 'execute').
2. It connects to your Icypeas account.
3. It performs an HTTP request to verify an email address.

Set up steps
- You will need a working Icypeas account to run the workflow and to get your API Key, API Secret and User ID.
- You will need an email address to perform the verification.
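The request is authenticated with your Icypeas API Key, API Secret and User ID. The sketch below only illustrates the general shape of preparing a signed call from an n8n Code node; the signing scheme, header format, and endpoint shown are placeholders, not Icypeas's documented API, so follow the template's HTTP Request node and the Icypeas docs for the real values.

```javascript
// Illustrative only: prepare a timestamp and HMAC signature for a signed Icypeas
// request in an n8n Code node. The signing scheme and header format below are
// PLACEHOLDERS, not Icypeas's documented API; use the values configured in the
// template's HTTP Request node and the Icypeas documentation.
// Note: require('crypto') on self-hosted n8n needs NODE_FUNCTION_ALLOW_BUILTIN=crypto.
const crypto = require('crypto');

const apiKey = 'YOUR_API_KEY';       // from your Icypeas account settings
const apiSecret = 'YOUR_API_SECRET'; // from your Icypeas account settings
const email = 'someone@example.com'; // address to verify

const timestamp = new Date().toISOString();

// Hypothetical signature: HMAC-SHA1 of the timestamp, hex-encoded.
const signature = crypto
  .createHmac('sha1', apiSecret)
  .update(timestamp)
  .digest('hex');

// Return the pieces so the following HTTP Request node can reference them
// via expressions, e.g. {{ $json.authorizationHeader }}.
return [
  {
    json: {
      email,
      timestamp,
      authorizationHeader: `${apiKey}:${signature}`, // placeholder header format
    },
  },
];
```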
by Dvir Sharon
🔍 Extract Competitor SERP Rankings from Google Search to Sheets with Bright Data

This template requires a self-hosted n8n instance to run.

A comprehensive n8n automation that extracts competitor data from Google search results for specific keywords and target countries, automatically saving structured data to Google Sheets for competitive analysis and market research.

📋 Overview
This workflow provides a professional competitor analysis solution that identifies ranking websites for specific search terms across different countries. Perfect for SEO research, competitive intelligence, market analysis, and content strategy planning. The system uses Bright Data's SERP API for accurate search result extraction and advanced HTML parsing for detailed competitor information.

Who is this for?
- SEO professionals conducting competitive analysis
- Digital marketers researching market landscapes
- Business analysts studying competitor positioning
- Content strategists analyzing competitor content approaches
- Market researchers tracking competitive intelligence across regions

What problem is this workflow solving?
- Extracting competitor data from Google search results
- Processing multiple keywords across different countries
- Organizing results in a structured, analyzable format
- Eliminating manual copy-paste work
- Ensuring consistent data collection methodology

What this workflow does
1. Manual Trigger: Starts the workflow execution
2. Get Keywords from Sheet: Fetches keywords and target countries from Google Sheets
3. URL Encode Keywords: Converts keywords to URL-safe format
4. Process Keywords in Batches: Handles multiple keywords sequentially
5. Fetch Google Search Results: Uses the Bright Data SERP API to scrape HTML
6. Extract Competitor Data from HTML: Parses HTML to extract competitor details
7. Save Competitor Results to Sheet: Stores structured data in Google Sheets
8. Wait to Avoid Rate Limits: Implements 30-second delays between requests

Output Data Points

| Field | Description | Example |
| :--- | :--- | :--- |
| Keyword | Original search term | digital marketing services |
| Target Country | Geographic target | US |
| websiteName | Domain/company name | hubspot |
| websiteUrl | Complete website URL | https://www.hubspot.com/marketing |
| websiteTitle | Page title from search results | Digital Marketing Software & Tools |
| websiteDescription | Meta description/snippet | Grow your business with HubSpot's digital marketing tools... |

⚙️ Setup

Prerequisites
- n8n instance (self-hosted)
- Google account with Sheets access
- Bright Data account with SERP API access

Google Sheet Structure
This workflow utilizes two Google Sheets: one for input keywords and one for outputting competitor data.

Input Sheet: "Keywords"
This sheet should contain the keywords and target countries for your search queries.

| Column Header | Data Type | Description | Example |
| :--- | :--- | :--- | :--- |
| Keyword | Text | The search term you want to analyze. | digital marketing |
| Country | Text | The 2-letter ISO country code for the target region of the search (e.g., US, GB, DE). | US |
Output Sheet: "Competitor Results"
This sheet will be populated automatically by the workflow with the extracted competitor data.

| Column Header | Data Type | Description | Example |
| :--- | :--- | :--- | :--- |
| Keyword | Text | The original search term used for the query. | digital marketing services |
| Target Country | Text | The 2-letter ISO country code of the search results. | US |
| websiteName | Text | The name of the website or domain found in the search results. | hubspot |
| websiteUrl | URL | The full URL of the website or page found in the search results. | https://www.hubspot.com/marketing |
| websiteTitle | Text | The title of the page as displayed in the Google search results. | Digital Marketing Software & Tools |
| websiteDescription | Text | The meta description or snippet text displayed under the title in search results. | Grow your business with HubSpot's digital marketing tools... |

Step-by-Step Setup
1. Import the Workflow: Copy JSON → n8n → Workflows → + Add → Import from JSON
2. Configure Bright Data Credentials:
   - Credential Type: HTTP Header Auth
   - Header Name: Authorization
   - Header Value: Bearer YOUR_API_TOKEN
3. Configure Google Sheets:
   - Create two new Google Sheets as described above: one named "Keywords" (for input) and one named "Competitor Results" (for output).
   - Set up Google Sheets OAuth2 credentials within n8n.
4. Update Workflow Settings:
   - Replace placeholders: YOUR_GOOGLE_SHEET_ID (for both input and output sheets), YOUR_BRIGHTDATA_CREDENTIAL_ID.
   - Ensure correct sheet/tab names are selected in the Google Sheets nodes.
5. Test & Activate: Add test data to your "Keywords" sheet → Execute workflow → Verify output in your "Competitor Results" sheet.

🛠 How to Customize
- **Add More Data Points:** Modify the JavaScript code in the "Extract Competitor Data from HTML" node to parse and extract additional information from the HTML (a hedged sketch of this node's parsing logic follows the Performance & Limits section below).
- **Custom Filtering:** Implement logic to exclude specific domains, filter results by title length, or other criteria.
- **Expand Geographic Coverage:** Add more 2-letter ISO country codes to the Bright Data SERP API call to broaden your competitive analysis.
- **Batch Processing:** Adjust the settings in the "Process Keywords in Batches" node to optimize for your Bright Data plan and desired execution speed.
- **Rate Limiting:** Modify the "Wait" node (default: 30 seconds) to increase or decrease the delay between requests based on API limits or performance needs.

📊 Use Cases & Examples
- **SEO Competitive Analysis:** Identify top-ranking competitors for your target keywords and analyze their strategies.
- **Market Entry Research:** Understand the competitive landscape in new geographic regions before expanding.
- **Content Strategy Planning:** Analyze competitor page titles and meta descriptions for inspiration and to identify content gaps.
- **International Market Research:** Compare search engine results and competitor positioning across different countries.

📈 Performance & Limits
- **Single Keyword:** 30–60 seconds per keyword.
- **Batch of 10 Keywords:** Typically takes 5–10 minutes.
- **Large Lists (50+ Keywords):** Expect execution times of 30–60 minutes or more, depending on batching and rate limits.
- **Success Rate:** Generally 95%+ for data extraction.
- **Data Accuracy:** Typically 98%+ for extracted fields.
- **API Calls:** 1 Bright Data SERP API call per keyword, plus multiple Google Sheets writes per execution.
- **Rate Limit:** A 30-second delay between requests is recommended to prevent exceeding API limits.
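As referenced in How to Customize, the "Extract Competitor Data from HTML" node is a Code node that parses the raw SERP HTML. The sketch below is a hedged, simplified version of what such parsing could look like using plain regular expressions; Google's markup changes frequently, so the patterns and input field names here are assumptions and the template's actual code remains the reference.

```javascript
// Hedged sketch of SERP HTML parsing for the "Extract Competitor Data from HTML"
// Code node. Google's markup changes often; the regex patterns below are
// simplifying assumptions, not the template's exact logic.
const results = [];

for (const item of $input.all()) {
  const html = item.json.data ?? item.json.body ?? ''; // field name depends on the HTTP node
  const { keyword, country } = item.json;

  // Very rough pattern: capture anchor href + <h3> title pairs from result blocks.
  const linkPattern = /<a[^>]+href="(https?:\/\/[^"]+)"[^>]*>.*?<h3[^>]*>(.*?)<\/h3>/gs;

  let match;
  while ((match = linkPattern.exec(html)) !== null) {
    const websiteUrl = match[1];
    const websiteTitle = match[2].replace(/<[^>]+>/g, '').trim();

    // Skip Google's own navigation/redirect links
    if (websiteUrl.includes('google.com')) continue;

    results.push({
      json: {
        Keyword: keyword,
        'Target Country': country,
        websiteName: new URL(websiteUrl).hostname.replace(/^www\./, '').split('.')[0],
        websiteUrl,
        websiteTitle,
      },
    });
  }
}

return results;
```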
🧰 Troubleshooting
- **Bright Data API error:** Double-check your API token, ensure you have sufficient credits, and confirm SERP API access is enabled on your Bright Data account.
- **No keywords found:** Verify the Google Sheet ID and ensure the column headers in your "Keywords" sheet precisely match the specifications (e.g., "Keyword", "Country").
- **Google Sheets permission denied:** Re-authenticate your Google Sheets credentials within n8n and check that the correct sharing settings are applied to your sheets.
- **No results extracted:** Review the JavaScript parsing logic in the "Extract Competitor Data from HTML" node. Also verify the validity of your keywords and target countries.
- **Loop not processing all:** Check the batch settings in the "Process Keywords in Batches" node and ensure all connections within the loop are correctly configured.

🤝 Support & Community
- **n8n Forum:** <https://community.n8n.io>
- **n8n Docs:** <https://docs.n8n.io>
- **Bright Data Support:** Access support directly via your Bright Data dashboard.
- **GitHub Issues:** Report any bugs or suggest new features on the n8n GitHub repository.

🎯 Final Notes
This workflow provides a comprehensive foundation for competitor research and market analysis. Customize it to fit your specific industry needs and competitive intelligence requirements.

Please note that this template uses Community Nodes. Ensure you understand the risks before using community nodes.
by Audun
Who Is This For?
- Web developers
- SEO specialists
- Digital marketers

What Problem Is This Workflow Solving?
- Automates the extraction of internal links from a webpage
- Eliminates the manual and error-prone process of collecting links
- Facilitates analysis of website structure and optimization

What This Workflow Does
- Uses an HTTP Request node to fetch HTML content from a specified webpage
- Parses the HTML to identify and extract internal links (see the sketch below)
- Compiles a list of URLs directing to pages within the same domain

Setup
- Configure the Set Base URL node: set the url field to the URL you want to analyze.

How to Customize This Workflow to Your Needs
- Change the target URL in the Set Base URL node to analyze different webpages.
- Add nodes to:
  - Filter or categorize the extracted links
  - Export the list to a database or CSV
  - Send links via email or integrate with other tools

This workflow can be used as a base for workflows that manage the process of extracting internal links, aiding in website optimization and SEO efforts.
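A hedged sketch of the internal-link extraction step, written for an n8n Code node: it assumes the fetched HTML is available on a field named `data` and the page URL on `baseUrl` (both names are assumptions; match them to your HTTP Request and Set Base URL nodes).

```javascript
// Hedged sketch: extract internal links from fetched HTML (n8n Code node).
// Assumes the HTML lives in `data` and the page URL in `baseUrl`; adjust both
// names to your HTTP Request and Set Base URL node outputs.
const item = $input.first().json;
const html = item.data ?? '';
const base = new URL(item.baseUrl);

const internalLinks = new Set();

// Collect every href attribute, then keep only same-domain URLs.
for (const match of html.matchAll(/href="([^"#]+)"/g)) {
  try {
    // Relative links are resolved against the base URL.
    const resolved = new URL(match[1], base);
    if (resolved.hostname === base.hostname) {
      internalLinks.add(resolved.href);
    }
  } catch (e) {
    // Ignore malformed hrefs that cannot be parsed as URLs
  }
}

return [...internalLinks].map((url) => ({ json: { url } }));
```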