by Sk developer
**Backlink Checker with Google Sheets Logging (SEO)**

**Description:** This workflow helps you analyze top backlinks using the Semrush API and logs the results directly into Google Sheets for easy SEO tracking and reporting. It integrates the Top Backlink Checker API from RapidAPI for in-depth backlink analysis and combines it with Google Sheets for efficient data storage and tracking.

**Node-by-Node Explanation**

1. **On form submission**: Captures the website URL submitted by the user through a form and triggers the workflow. The Top Backlink Checker API (via RapidAPI) is used to check backlinks after this step.
2. **Check webTraffic**: Sends a request to the Top Backlink Checker API to gather traffic data for the submitted website, including important metrics like visits and bounce rate, which are later stored in Google Sheets for analysis.
3. **Reformat output**: Extracts and reformats the traffic data received from the API, cleaning and structuring the raw data so it is usable in later stages of the workflow.
4. **Reformat**: Processes the backlink data received from the Top Backlink Checker API and restructures it for Google Sheets, making it easier to analyze.
5. **Backlink overview**: Appends the reformatted backlink overview data to a Google Sheets document, storing important backlink information like source URLs and anchor texts for later analysis and reporting.
6. **Backlinks**: Appends detailed backlink data, including target URLs, anchors, and internal/external links, into Google Sheets. This helps track individual backlinks, their attributes, and page scores for deeper SEO analysis and reporting.

**Benefits and Use Cases**

Benefits:

- **Backlink Tracking:** The Top Backlink Checker API integration helps you track all the backlinks associated with a website, with insights into the source URL, anchor text, first and last seen dates, and more.
- **Traffic Insights:** Monitor important website traffic data such as visits, bounce rates, and organic reach to inform your SEO strategy.
- **Automated Google Sheets Logging:** All traffic and backlink data is logged automatically into Google Sheets for easy access and future analysis, avoiding manual data entry and ensuring consistency.
- **Efficient Workflow:** The n8n automation streamlines your SEO analysis workflow, ensuring that data is formatted, structured, and updated without manual intervention.

Use Cases:

- **SEO Reports:** Generate regular SEO reports by tracking backlinks and traffic data automatically from Semrush and Top Backlink Checker, saving time and ensuring accurate reporting.
- **Competitor Analysis:** Analyze your competitors' backlinks and traffic to stay ahead in SEO rankings.
- **Backlink Management:** Assess backlink health so that high-value backlinks are tracked and toxic backlinks are identified for removal or disavow.
- **SEO Campaign Tracking:** Monitor how backlinks and website traffic evolve over time to evaluate the effectiveness of your SEO campaigns, keeping all your data in Google Sheets for easy tracking.
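The two reformat steps are Code nodes. A minimal sketch of the kind of restructuring they perform is below; the `data.semrushAPI.*` paths follow this API's documented response layout, but the per-backlink field names (`source_url`, `anchor`, etc.) are illustrative assumptions. Check the actual response in n8n's execution view before relying on them.

```javascript
// Hypothetical reshaping of the Top Backlink Checker API response into
// an overview object plus flat rows ready for Google Sheets.
function reformatBacklinks(apiResponse) {
  const overview = apiResponse.data?.semrushAPI?.backlinksOverview ?? {};
  const backlinks = apiResponse.data?.semrushAPI?.backlinks ?? [];
  return {
    overview, // aggregate metrics for the "Backlink overview" sheet
    rows: backlinks.map((b) => ({
      sourceUrl: b.source_url, // assumed field names; verify against a real response
      targetUrl: b.target_url,
      anchor: b.anchor,
      firstSeen: b.first_seen,
      lastSeen: b.last_seen,
    })),
  };
}
```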
by Evoort Solutions
**Automate YouTube Channel Metadata Extraction to Google Docs**

**Description:** This workflow leverages the YouTube Metadata API to automatically extract detailed metadata from any YouTube channel URL. It collects information like subscribers, views, keywords, and banners, reformats it for readability, and saves it directly to Google Docs for easy sharing and record-keeping. It is ideal for marketers, content creators, and analysts looking to streamline YouTube channel data collection. By integrating the YouTube Metadata API, this workflow ensures accurate and up-to-date channel insights fetched instantly from the source.

**Node-by-Node Explanation**

1. **On form submission**: Triggers the workflow when a user submits a YouTube channel URL via a web form, starting the metadata extraction process.
2. **YouTube Channel Metadata (HTTP Request)**: Calls the YouTube Metadata API with the provided channel URL to retrieve comprehensive channel details like title, subscriber count, and banner images.
3. **Reformat (Code)**: Transforms the raw API response into a clean, formatted string with emojis and markdown styling for easy reading and better presentation.
4. **Add Data in Google Docs**: Appends the formatted channel metadata to a specified Google Docs document, providing a centralized and accessible record of the data.

**Benefits of This Workflow**

- **Automated Data Collection:** Eliminates manual effort by automatically extracting YouTube channel data via the YouTube Metadata API.
- **Accurate & Reliable:** Ensures data accuracy by using a trusted API source, keeping metadata current.
- **Improved Organization:** Saves data in Google Docs, allowing for easy sharing, editing, and collaboration.
- **User-Friendly:** A simple form-based trigger lets anyone gather channel info without technical knowledge.
- **Scalable & Flexible:** Can process multiple URLs easily, perfect for marketing or research teams handling numerous channels.
**Use Cases**

- **Marketing Teams:** Track competitor YouTube channel stats and trends for strategic planning.
- **Content Creators:** Monitor channel growth metrics and optimize content strategy accordingly.
- **Researchers:** Collect and analyze YouTube channel data for academic or market research projects.
- **Social Media Managers:** Automate reporting by documenting channel performance metrics in Google Docs.
- **Businesses:** Maintain up-to-date records of brand or partner YouTube channels efficiently.

By leveraging the YouTube Metadata API, this workflow provides an efficient, scalable solution to extract and document YouTube channel metadata with minimal manual input.

**How to Get Your API Key for the YouTube Metadata API**

1. **Visit the API page:** Go to the YouTube Metadata API on RapidAPI.
2. **Sign up / log in:** Create an account or log in if you already have one.
3. **Subscribe to the API:** Click "Subscribe to Test" and choose a plan (free or paid).
4. **Copy your API key:** After subscribing, your API key will be available in the "X-RapidAPI-Key" section under "Endpoints".
5. **Use the key:** Include the key in your API requests like this: `-H "X-RapidAPI-Key: YOUR_API_KEY"`

Create your free n8n account and set up the workflow in just a few minutes using the link below: Start Automating with n8n
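The HTTP Request node's setup can be expressed as a plain request object. The host and path below are placeholders, not the real endpoint; copy the actual values from the API's "Endpoints" tab on RapidAPI.

```javascript
// Sketch of the request the "YouTube Channel Metadata" node sends.
// URL is a placeholder; only the X-RapidAPI-Key header pattern is real.
function buildRapidApiRequest(channelUrl, apiKey) {
  return {
    method: "POST",
    url: "https://youtube-metadata.example.p.rapidapi.com/channel", // placeholder
    headers: {
      "Content-Type": "application/json",
      "X-RapidAPI-Key": apiKey, // the key copied from the Endpoints tab
    },
    body: JSON.stringify({ url: channelUrl }),
  };
}
```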
by Evoort Solutions
**SEO On Page API: Complete Guide, Use Cases & Benefits**

The SEO On Page API is a powerful tool for keyword research, competitor analysis, backlink insights, and overall SEO optimization. With multiple endpoints, you can instantly gather actionable SEO data without juggling multiple tools. You can explore and subscribe via the SEO On Page API page on RapidAPI.

**Description**

The SEO On Page API allows you to quickly analyze websites, keywords, backlinks, and competitors, all in one place. It is ideal for SEO professionals, marketers, and developers who want fast, accurate, and easy-to-integrate data.

**Node-by-Node Overview**

- **On form submission**: Shows a web form (field: website) and triggers the workflow on submit.
- **Global Storage**: Copies website (and optional country) into the execution JSON for reuse.
- **Website Traffic Checker**: POSTs the website to webtraffic.php (RapidAPI) to fetch a traffic summary.
- **Re-Format**: Extracts data.semrushAPI.trafficSummary[0] from the traffic API response.
- **Website Traffic**: Appends traffic metrics (visits, users, bounce, etc.) to the "WebSite Traffic" sheet.
- **Website Metrics DA PA**: POSTs the website to dapa.php (RapidAPI) to get DA, PA, spam score, DR, and organic traffic.
- **Re-Format 2**: Pulls the data object from the DA/PA API response for clean mapping.
- **DA PA**: Appends DA/PA and related fields into the "DA PA" sheet.
- **Top Backlinks**: POSTs the website to backlink.php (RapidAPI) to retrieve backlink data.
- **Re-Format 3**: Extracts data.semrushAPI.backlinksOverview (aggregate backlink metrics).
- **Backlinks Overview**: Appends overview metrics into the "Backlinks Overview" sheet.
- **Re-Format 4**: Extracts the detailed data.semrushAPI.backlinks list (individual backlinks).
- **Backlinks**: Appends each backlink row into the "Backlinks" sheet.
- **Competitors Analysis**: POSTs the website to competitor.php (RapidAPI) to fetch competitor data sets.
- **Re-Format 5**: Flattens all array datasets under data.semrushAPI into rows with a dataset label.
- **Competitor Analysis**: Appends the flattened competitor and keyword rows into the "Competitor Analysis" sheet.

**Use Cases**

- **Keyword Research**: Find high-volume, low-competition keywords for content planning.
- **Competitor Analysis**: Identify competitor strategies and ranking keywords.
- **Backlink Insights**: Discover referring domains and link-building opportunities.
- **Domain Authority Checks**: Evaluate site authority before guest posting or partnerships.
- **Content Optimization**: Improve on-page SEO using actionable data.

**Benefits**

- **One API, Multiple Insights**: No need for multiple SEO tools.
- **Accurate Data**: Get trusted metrics for informed decision-making.
- **Fast Integration**: Simple POST requests for quick setup.
- **Time-Saving**: Automates complex SEO analysis in seconds.
- **Affordable**: Access enterprise-grade SEO insights without breaking the bank.

Start using the SEO On Page API today to supercharge your keyword research, backlink tracking, and competitor analysis. Create your free n8n account and set up the workflow in just a few minutes using the link below: Start Automating with n8n
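The Re-Format 5 step (flattening every array found under `data.semrushAPI` into rows tagged with a dataset label) can be sketched as a Code node like this; it is a minimal sketch assuming the response shape described above, not the exact node contents.

```javascript
// Flatten all array datasets under data.semrushAPI into labeled rows,
// one row per item, with a "dataset" column naming the source array.
function flattenDatasets(apiResponse) {
  const root = apiResponse.data?.semrushAPI ?? {};
  const rows = [];
  for (const [dataset, value] of Object.entries(root)) {
    if (Array.isArray(value)) {
      for (const item of value) {
        rows.push({ dataset, ...item }); // non-array keys are skipped
      }
    }
  }
  return rows;
}
```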
by Growth AI
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Firecrawl batch scraping to Google Docs**

**Who's it for**

AI chatbot developers, content managers, and data analysts who need to extract and organize content from multiple web pages for knowledge base creation, competitive analysis, or content migration projects.

**What it does**

This workflow automatically scrapes content from a list of URLs and converts each page into a structured Google Doc in markdown format. It's designed for batch processing multiple pages efficiently, making it ideal for building AI knowledge bases, analyzing competitor content, or migrating website content to documentation systems.

**How it works**

The workflow follows a systematic scraping process:

1. **URL Input**: Reads a list of URLs from a Google Sheets template
2. **Data Validation**: Filters out empty rows and already-processed URLs
3. **Batch Processing**: Loops through each URL sequentially
4. **Content Extraction**: Uses Firecrawl to scrape and convert content to markdown
5. **Document Creation**: Creates individual Google Docs for each scraped page
6. **Progress Tracking**: Updates the spreadsheet to mark completed URLs
7. **Final Notification**: Provides a completion summary with access to the scraped content

**Requirements**

- Firecrawl API key (for web scraping)
- Google Sheets access
- Google Drive access (for document creation)
- Google Sheets template (provided)

**How to set up**

Step 1: Prepare your template
- Copy the Google Sheets template and create your own version for personal use
- Ensure the sheet has a tab named "Page to doc"
- List all URLs you want to scrape in the "URL" column

Step 2: Configure API credentials. Set up the following credentials in n8n:
- Firecrawl API: for web content scraping and markdown conversion
- Google Sheets OAuth2: for reading URLs and updating progress
- Google Drive OAuth2: for creating content documents

Step 3: Set up your Google Drive folder
- The workflow saves scraped content to a specific Drive folder
- Default folder: "Contenu scrapé" (Content Scraped); folder ID: 1ry3xvQ9UqM2Rf9C4-AoJdg1lfB9inh_5 (customize this to your own folder)
- Create your own folder and update the folder ID in the "Create file markdown scraping" node

Step 4: Choose your trigger method
- Option A (chat interface): Use the default chat trigger and send your Google Sheets URL through the chat interface
- Option B (manual trigger): Replace the chat trigger with a manual trigger and set the Google Sheets URL as a variable in the "Get URL" node

**How to customize the workflow**

URL source customization:
- Sheet name: Change "Page to doc" to your preferred tab name
- Column structure: Modify field mappings if using different column names
- URL validation: Adjust filtering criteria for URL format requirements
- Batch size: The workflow processes all URLs sequentially (no batch size limit)

Scraping configuration:
- Firecrawl options: Add specific scraping parameters (wait times, JavaScript rendering)
- Content format: Currently outputs markdown (can be modified for other formats)
- Error handling: The workflow continues processing even if individual URLs fail
- Retry logic: Add retry mechanisms for failed scraping attempts

Output customization:
- Document naming: Currently uses the URL as the document name (customizable)
- Folder organization: Create subfolders for different content types
- File format: Switch from Google Docs to other formats (PDF, TXT, etc.)
- Content structure: Add headers, metadata, or formatting to scraped content

Progress tracking enhancements:
- Status columns: Add more detailed status tracking (failed, retrying, etc.)
- Metadata capture: Store scraping timestamps, content length, etc.
- Error logging: Track which URLs failed and why
- Completion statistics: Generate summary reports of scraping results

**Use cases**

AI knowledge base creation:
- E-commerce product pages: Scrape product descriptions and specifications for chatbot training
- Documentation sites: Convert help articles into structured knowledge base content
- FAQ pages: Extract customer service information for automated support systems
- Company information: Gather about pages, services, and team information

Content analysis and migration:
- Competitor research: Analyze competitor website content and structure
- Content audits: Extract existing content for analysis and optimization
- Website migrations: Back up content before site redesigns or platform changes
- SEO analysis: Gather content for keyword and structure analysis

Research and documentation:
- Market research: Collect information from multiple industry sources
- Academic research: Gather content from relevant web sources
- Legal compliance: Document website terms, policies, and disclaimers
- Brand monitoring: Track content changes across multiple sites

**Workflow features**

Smart processing logic:
- Duplicate prevention: Skips URLs already marked as "Scrapé" (scraped)
- Empty row filtering: Automatically ignores rows without URLs
- Sequential processing: Handles one URL at a time to avoid rate limiting
- Progress updates: Real-time status updates in the source spreadsheet

Error handling and resilience:
- Graceful failures: Continues processing remaining URLs if individual scrapes fail
- Status tracking: Clear indication of completed vs. pending URLs
- Completion notification: Summary message with a link to the scraped content folder
- Manual restart capability: Can resume processing from where it left off

**Results interpretation**

Organized content output. Each scraped page creates:
- Individual Google Doc: Named with the source URL
- Markdown formatting: Clean, structured content extraction
- Metadata preservation: Original URL and scraping timestamp
- Organized storage: All documents in the designated Google Drive folder

Progress tracking. The source spreadsheet shows:
- URL list: Original URLs to be processed
- Status column: "OK" for completed, empty for pending
- Real-time updates: Progress visible during workflow execution
- Completion summary: Final notification with access instructions

**Workflow limitations**

- Sequential processing: Processes URLs one at a time (prevents rate limiting but is slower for large lists)
- Google Drive dependency: Requires Google Drive for document storage
- Firecrawl rate limits: Subject to Firecrawl API limitations and quotas
- Single format output: Currently outputs only Google Docs (easily customizable)
- Manual setup: Requires Google Sheets template preparation before use
- No content deduplication: Creates separate documents even for similar content
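The data-validation step (skip empty rows and already-processed URLs) can be sketched as a small Code-node filter. The "URL" column and the "OK" / "Scrapé" status values come from the description above; the name of the status column itself (`Status` here) is an assumption to check against your sheet.

```javascript
// Keep only rows that have a URL and are not yet marked as processed.
function selectPendingUrls(rows) {
  return rows.filter((row) => {
    const url = (row.URL ?? "").trim();
    const status = (row.Status ?? "").trim(); // column name assumed
    return url !== "" && status !== "OK" && status !== "Scrapé";
  });
}
```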
by Sk developer
**Automated IMDB Video Downloader: Download, Upload to Google Drive & Notify via Email**

Easily download IMDB videos via a user-friendly form. The workflow automatically fetches video links using the IMDB Downloader API, saves videos to Google Drive, and notifies users via email with shareable links or failure alerts. Perfect for content creators and marketers.

**Node-by-Node Explanation**

- **On form submission**: Triggers the workflow when a user submits an IMDB video URL via a form.
- **Fetch IMDB Video Info from API**: Sends the URL to the IMDB Downloader API to get video metadata and download links.
- **Check API Response Status**: Verifies that the API responded successfully (status code 200).
- **Download Video File**: Downloads the video from the provided media URL.
- **Upload Video to Google Drive**: Uploads the downloaded video file to a specified Google Drive folder.
- **Google Drive Set Permission**: Sets sharing permissions on the uploaded video for easy access.
- **Success Notification Email with Drive Link**: Emails the user the Google Drive link to access the video.
- **Processing Delay**: Adds a wait time before sending failure notifications.
- **Failure Notification Email**: Emails the user if the video download or processing fails.

**How to Obtain Your RapidAPI Key**

1. Go to RapidAPI's IMDB Downloader API page.
2. Sign up or log in to your RapidAPI account.
3. Subscribe to the IMDB Downloader API.
4. Find your unique x-rapidapi-key in the dashboard under the API keys section.
5. Replace "your key" in your workflow headers with this key to authenticate requests.

**Use Cases & Benefits**

Use cases:
- Content creators downloading trailers or clips quickly.
- Marketing teams preparing video content for campaigns.
- Educators sharing film excerpts.
- Social media managers sourcing videos efficiently.

Benefits:
- Fully automates the video download and upload workflow.
- Seamless Google Drive integration with sharing.
- Instant user notifications on success or failure.
- User-friendly, with simple URL form submission.

**Who Is This For?**

- **Content creators** looking for fast video downloads.
- **Marketers** needing instant access to IMDB clips.
- **Educators** requiring film excerpts for lessons.
- **Social media managers** preparing engaging content.
- Any user wanting hassle-free IMDB video downloads with cloud storage.
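The "Check API Response Status" branch boils down to a single predicate: the 200 check is stated above, while the `mediaUrl` field name is a hypothetical placeholder for whatever download-link field the IMDB Downloader API actually returns.

```javascript
// Success = HTTP 200 and a usable media download URL in the body.
// "mediaUrl" is an assumed field name; inspect a real response to confirm.
function isApiSuccess(response) {
  return response.statusCode === 200 && typeof response.body?.mediaUrl === "string";
}
```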
by Piotr Sikora
**[LI] - Search Profiles**

> Self-hosted disclaimer:
> This workflow uses the SerpAPI community node, which is available only on self-hosted n8n instances.
> For n8n Cloud, you may need to use an HTTP Request node with the SerpAPI REST API instead.

**Who's it for**

Recruiters, talent sourcers, SDRs, and anyone who wants to automatically gather public LinkedIn profiles from Google search results based on keywords, across multiple pages, and log them to a Google Sheet for further analysis.

**What it does / How it works**

This workflow extends the standard LinkedIn profile search with pagination, allowing you to fetch results from multiple Google result pages in one go. Here's the step-by-step process:

1. **Form Trigger: "LinkedIn Search"**
   Collects:
   - Keywords (comma separated), e.g. python, fintech, warsaw
   - Pages to fetch: the number of Google pages to scrape (each page returns about 10 results)
   Triggers the workflow when submitted.

2. **Format Keywords (Set)**
   Converts the keywords into a Google-ready query string: ("python") ("fintech") ("warsaw"). The parentheses improve relevance in Google searches.

3. **Build Page List (Code)**
   Creates the list of pages to iterate through. For example, if "Pages to fetch" = 3, it generates 3 search batches with the proper start offsets (0, 10, 20). It also keeps track of:
   - Grouped keywords (keywordsGrouped)
   - Raw keywords
   - Submission timestamp

4. **Loop Over Items (Split In Batches)**
   Loops through the page list one batch at a time, sending each batch to SerpAPI Search until all pages are processed.

5. **SerpAPI Search**
   Queries Google with: site:pl.linkedin.com/in/ ("keyword1") ("keyword2") ("keyword3"). The search is fixed to the Warsaw, Masovian Voivodeship, Poland location, and the start parameter controls pagination.

6. **Check how many results are returned (Switch)**
   If no results, it triggers "No profiles found"; if results are found, it passes the data forward.

7. **Split Out**
   Extracts each LinkedIn result from the organic_results array.
8. **Get Full Name to property of object (Code)**
   Extracts a clean full name from the search result title (the text before "-" or "|").

9. **Append profile in sheet (Google Sheets)**
   Saves the following fields into your connected sheet:

   | Column | Description |
   |---------|-------------|
   | Date | Submission timestamp |
   | Profile | Public LinkedIn profile URL |
   | Full name | Extracted candidate name |
   | Keywords | Original keywords from the form |

10. **Loop Over Items (continue)**
    After writing each batch, it loops to the next Google page until all pages are complete.

11. **Form Response (final step)**
    Sends a confirmation back to the user after all pages are processed: "Check linked file".

**Google Sheets Setup**

Before using the workflow, prepare your Google Sheet with these columns in row 1:

| Column Name | Description |
|--------------|-------------|
| Date | Automatically filled with the form submission time |
| Profile | LinkedIn profile link |
| Full name | Extracted name from search results |
| Keywords | Original search input |

> You can expand the sheet to include optional fields like Snippet, Job Title, or Notes if you modify the mapping in the Append profile in sheet node.

**Requirements**

- **SerpAPI account** with an API key stored securely in n8n Credentials.
- **Google Sheets OAuth2 credentials** connected to your target sheet with edit access.
- **n8n instance (Cloud or self-hosted)**

> Note: The SerpAPI node is part of the Community package and may require self-hosted n8n.

**How to set up**

1. Import the [LI] - Search profiles workflow into n8n.
2. Connect your credentials: SerpAPI (use your API key) and Google Sheets OAuth2 (ensure you have write permissions).
3. Update the Google Sheets node to point to your own spreadsheet and worksheet.
4. (Optional) Edit the location field in SerpAPI Search for different regions.
5. Activate the workflow and open the public form (via the webhook URL).
6. Enter your keywords and specify the number of pages to fetch.
**How to customize the workflow**

- **Change search region:** Modify the location in the SerpAPI node or change the domain to site:linkedin.com/in/ for global searches.
- **Add pagination beyond 3-4 pages:** Increase "Pages to fetch", but note that excessive pages may trigger Google rate limits.
- **Avoid duplicates:** Add a Google Sheets Read + IF node before appending new URLs.
- **Add notifications:** Add Slack, Discord, or Email nodes after Google Sheets to alert your team when new data arrives.
- **Capture more data:** Map additional fields like title, snippet, or position into your Sheet.

**Security notes**

- Never store API keys directly in nodes; always use n8n Credentials.
- Keep your Google Sheet private and limit edit access.
- Remove identifying data before sharing your workflow publicly.

**Improvement suggestions**

| Area | Recommendation | Benefit |
|-------|----------------|----------|
| Dynamic location | Add a "Location" field to the form and feed it to SerpAPI dynamically. | Broader and location-specific searches |
| Rate limiting | Add a short Wait node (e.g., 1-2 s) between page fetches. | Prevents API throttling |
| De-duplication | Check for existing URLs before appending. | Prevents duplicates |
| Logging | Add a second sheet or log file with timestamps per run. | Easier debugging and tracking |
| Data enrichment | Add a LinkedIn or People Data API enrichment step. | Collect richer candidate data |

**Summary:** This workflow automates the process of searching public LinkedIn profiles on Google across multiple pages. It formats user-entered keywords into advanced Google queries, iterates through paginated SerpAPI results, extracts profile data, and stores it neatly in a Google Sheet, all through a single, user-friendly form.
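The two Code nodes in this workflow can be sketched as below. The 0/10/20 start offsets and the `keywordsGrouped` field come straight from the description; the exact title separators used in "Get Full Name" are an assumption (result titles vary between a dash and a pipe).

```javascript
// Build Page List: one batch per Google results page (~10 results each).
// SerpAPI's "start" parameter takes the result offset, so page i => i * 10.
function buildPageList(pagesToFetch, keywordsGrouped, rawKeywords) {
  const submittedAt = new Date().toISOString();
  return Array.from({ length: pagesToFetch }, (_, i) => ({
    start: i * 10,
    keywordsGrouped,
    rawKeywords,
    submittedAt,
  }));
}

// Get Full Name: take the text before the first "-" or "|" in the title.
// Assumed separators; adjust if your result titles differ.
function extractFullName(title) {
  return title.split(/[-|]/)[0].trim();
}
```

For "Pages to fetch" = 3, `buildPageList` yields start offsets 0, 10, and 20, matching the example in the node description.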
by Evoort Solutions
**Automated Keyword Analysis with On-Page SEO Workflow**

**Description**

Boost your SEO strategy by automating keyword research and on-page SEO analysis with n8n. This workflow takes user input (keyword + country), retrieves essential data using the SEO On-Page API, and saves it directly into Google Sheets. It is ideal for marketers, content strategists, and SEO agencies looking for efficiency.

**Node-by-Node Flow Explanation**

1. **On form submission**: Triggers the workflow when a user submits a keyword and country via a simple form.
2. **Global Storage**: Captures and stores the submitted keyword and country for use across the workflow.
3. **Keyword Insights Request**: Sends a POST request to the SEO On-Page API to fetch keyword suggestions (broad match keywords).
4. **Re-Format**: Extracts the relevant broadMatchKeywords array from the keyword API response.
5. **Keyword Insights**: Appends the extracted keyword suggestions into the "Keyword Insights" tab in Google Sheets.
6. **KeyWord Difficulty Request**: Sends a second POST request to the SEO On-Page API to fetch keyword difficulty and SERP data.
7. **Re-Format 2**: Extracts the keywordDifficultyIndex value from the API response.
8. **KeyWord Difficulty**: Saves the keyword difficulty score into the "KeyWord Difficulty" sheet for reference.
9. **Re-Format 5**: Extracts SERP result data from the difficulty API response.
10. **SERP Result**: Appends detailed SERP data into the "Serp Analytics" sheet in Google Sheets.

**Benefits**

- **Fully Automated SEO Research**: No manual data entry or API calls required.
- **Real-Time Data Collection**: Powered by the SEO On-Page API on RapidAPI, ensuring fresh and reliable results.
- **Organized Insights**: Data is cleanly categorized into separate Google Sheets tabs.
- **Time Saver**: Instantly analyze keywords without switching between tools.

**Use Cases**

- **SEO Agencies**: Generate keyword reports for clients automatically.
- **Content Writers**: Discover keyword difficulty and SERP competition before drafting.
- **Digital Marketers**: Monitor keyword trends and search visibility in real time.
- **Bloggers & Influencers**: Choose better keywords to rank faster on search engines.

**API Reference**

This workflow is powered by the SEO On-Page API available on RapidAPI. It offers keyword research, difficulty metrics, and SERP analytics through simple endpoints, making it ideal for automation with n8n.

> Note: Make sure to replace "your key" with your actual RapidAPI key in both HTTP Request nodes for successful API calls.

Create your free n8n account and set up the workflow in just a few minutes using the link below: Start Automating with n8n
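The Re-Format step can be sketched as an n8n Code node. n8n Code nodes return an array of `{ json }` items; the `data.broadMatchKeywords` path is an assumption based on the node description, so verify it against a real execution.

```javascript
// Turn the keyword-suggestions response into n8n items, one per keyword,
// ready to append to the "Keyword Insights" sheet.
function reformatKeywordInsights(apiResponse) {
  const keywords = apiResponse.data?.broadMatchKeywords ?? []; // path assumed
  return keywords.map((k) => ({ json: k })); // n8n item shape: { json: {...} }
}
```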
by Evoort Solutions
**Analyze Competitor Keywords with RapidAPI and Google Sheets Reporting**

**Description**

This n8n workflow streamlines the process of analyzing SEO competitor keywords using the Competitor Keyword Analysis API on RapidAPI. It collects a website and country via form submission, calls the API to retrieve keyword metrics, reformats the response, and logs the results into Google Sheets, all automatically. It is ideal for SEO analysts, marketing teams, and agencies who need a hands-free solution for competitive keyword insights.

**Node-by-Node Explanation**

- **On form submission (formTrigger)**: Starts the workflow when a user submits their website and country through a form.
- **Competitor Keyword Analysis (httpRequest)**: Sends a POST request to the Competitor Keyword Analysis API on RapidAPI with the form input to fetch keyword data.
- **Reformat Code (code)**: Extracts the domainOrganicSearchKeywords array from the API response for structured processing.
- **Google Sheets (googleSheets)**: Appends the cleaned keyword metrics into a Google Sheet for easy viewing and tracking.

**Benefits of This Workflow**

- Automates SEO research using the Competitor Keyword Analysis API.
- Eliminates manual data entry: results go straight into Google Sheets.
- Scalable and reusable for any number of websites or countries.
- Reformatting logic is built in, so you get clean, analysis-ready data.

**Use Cases**

- **Marketing Agencies**: Use the Competitor Keyword Analysis API to gather insights for client websites and store the results automatically.
- **In-house SEO Teams**: Quickly compare keyword performance across competitors and monitor shifts over time with historical Google Sheets logs.
- **Freelancers and Consultants**: Provide fast, data-backed SEO reports using this automation.
- **Keyword Research Automation**: Make this flow part of a larger system for identifying keyword gaps, content opportunities, or campaign ideas.
**Output Example (Google Sheets)**

| keyword | searchVolume | cpc | competition | position | previousPosition | keywordDifficulty |
|---------------|--------------|-----|-------------|----------|------------------|-------------------|
| best laptops | 9900 | 2.3 | 0.87 | 5 | 7 | 55 |

**How to Get Your API Key for the Competitor Keyword Analysis API**

1. Go to the Competitor Keyword Analysis API page on RapidAPI.
2. Click "Subscribe to Test" (you may need to sign up or log in).
3. Choose a pricing plan (there's a free tier for testing).
4. After subscribing, click on the "Endpoints" tab.
5. Your API key will be visible in the "x-rapidapi-key" header. Copy and paste this key into the httpRequest node in your workflow.

**Summary**

This workflow is a powerful no-code automation tool that leverages the Competitor Keyword Analysis API on RapidAPI to deliver real-time SEO insights directly to Google Sheets, saving time, boosting efficiency, and enabling smarter keyword strategy decisions.

Create your free n8n account and set up the workflow in just a few minutes using the link below: Start Automating with n8n
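The Reformat Code node maps API rows onto the sheet columns shown in the output example. The top-level `domainOrganicSearchKeywords` key is named in the node description; the per-row field names are assumed to match the sheet headers, so confirm them against a real API response.

```javascript
// Map each organic-search keyword entry onto the Google Sheets columns.
function reformatCompetitorKeywords(apiResponse) {
  const rows = apiResponse.domainOrganicSearchKeywords ?? [];
  return rows.map((r) => ({
    keyword: r.keyword,
    searchVolume: r.searchVolume,
    cpc: r.cpc,
    competition: r.competition,
    position: r.position,
    previousPosition: r.previousPosition,
    keywordDifficulty: r.keywordDifficulty,
  }));
}
```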
by Evoort Solutions
**GST Data Analytics Automation Flow with Google Docs Reporting**

**Description:** Streamline GST data collection, analysis, and automated reporting using the GST Insights API and Google Docs integration. This workflow allows businesses to automate the extraction of GST data and directly generate formatted reports in Google Docs, making compliance easier.

**Node-by-Node Explanation**

1. **On form submission**: Triggers the automation whenever a user submits GST-related data (like a GSTIN) via a web form, collecting all necessary input for further processing in the workflow.
2. **Fetch GST Data Using GST Insights API**: Sends a POST request to the GST Insights API, including the required authentication and the submitted GSTIN, to fetch GST data based on the user's input.
3. **Data Reformatting**: Processes and structures the raw GST data received from the API so that only the essential information (e.g., tax summaries, payment status) is extracted for reporting.
4. **Google Docs Reporting**: Generates a Google Docs document and auto-populates it with the reformatted GST data, structured in a clean format ready for sharing or downloading.

**Use Cases**

- **Tax Consultants & Agencies**: Automate the GST insights and reporting process for clients by extracting key metrics directly from the GST Insights API.
- **Accountants & Auditors**: Streamline GST compliance by generating automated reports based on the most current data from the API.
- **E-commerce Platforms**: Automatically track GST payments, returns, and summaries for each sale and consolidate them into structured reports.
- **SMEs and Startups**: Track your GST status and compliance without manual intervention, generating reports directly in Google Docs for easy access.

**Benefits of This Workflow**

- **Automated GST Data Collection**: Fetch GST insights directly using the GST Insights API without manually searching through different resources.
- **Google Docs Integration**: Automatically generate customized Google Docs reports with detailed GST data, making the reporting process efficient.
- **Error-Free Data Analysis**: Automates data extraction and reporting, significantly reducing the risk of human error.
- **Customizable Reporting**: Adapt the flow for various GST-related data such as payments, returns, and summaries.
- **Centralized Document Storage**: All GST reports are saved and managed within Google Docs, ensuring easy collaboration and access.

**Quick Note:** The GST Insights API provides detailed GST data analysis for Indian businesses. It can extract crucial data like returns, payments, and summaries directly from the GST system, which you can then use for compliance and reporting.

**How to Get Your API Key for the GST Insights API**

1. **Visit the API page:** Go to the GST Insights API on RapidAPI.
2. **Sign up / log in:** Create an account or log in if you already have one.
3. **Subscribe to the API:** Click "Subscribe to Test" and choose a plan (free or paid).
4. **Copy your API key:** After subscribing, your API key will be available in the "X-RapidAPI-Key" section under "Endpoints".
5. **Use the key:** Include the key in your API requests like this: `-H "X-RapidAPI-Key: YOUR_API_KEY"`
by PDF Vector
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Transform Complex Research Papers into Accessible Summaries**

This workflow automatically generates multiple types of summaries from research papers, making complex academic content accessible to different audiences. By combining PDF Vector's advanced parsing capabilities with GPT-4's language understanding, researchers can quickly digest papers outside their expertise, communicate findings to diverse stakeholders, and create social-media-friendly research highlights.

**Target Audience & Problem Solved**

This template is designed for:

- **Research communicators** translating complex findings for public audiences
- **Journal editors** creating accessible abstracts and highlights
- **Science journalists** quickly understanding technical papers
- **Academic institutions** improving research visibility and impact
- **Funding agencies** reviewing large volumes of research outputs

It solves the critical challenge of research accessibility by automatically generating summaries tailored to different audience needs, from technical experts to the general public.
**Prerequisites**

- n8n instance with the PDF Vector node installed
- OpenAI API key with GPT-4 or GPT-3.5 access
- PDF Vector API credentials
- Basic understanding of webhook setup
- Optional: Slack/Email integration for notifications
- Minimum 20 API credits per paper summarized

**Step-by-Step Setup Instructions**

1. **Configure API Credentials**
   - Navigate to the n8n Credentials section
   - Add PDF Vector credentials with your API key
   - Add OpenAI credentials with your API key
   - Test both connections to ensure they work
2. **Set Up the Webhook Endpoint**
   - Import the workflow template into n8n
   - Note the webhook URL from the "Webhook - Paper URL" node
   - This URL will receive POST requests with paper URLs
   - Example request format: `{ "paperUrl": "https://example.com/paper.pdf" }`
3. **Configure Summary Models**
   - Review the OpenAI model settings in each summary node
   - GPT-4 is recommended for executive and technical summaries
   - GPT-3.5-turbo is suitable for lay and social media summaries
   - Adjust temperature settings to balance creativity and accuracy
4. **Customize Output Formats**
   - Modify the "Combine All Summaries" node for your needs
   - Add additional fields or metadata as required
   - Configure the response format (JSON, HTML, plain text)
5. **Test the Workflow**
   - Use a tool like Postman or curl to send a test request
   - Monitor the execution for any errors
   - Verify that all four summary types are generated
   - Check the response time and adjust the timeout if needed

**Implementation Details**

The workflow implements a sophisticated summarization pipeline:

- **PDF Parsing:** Uses LLM-enhanced parsing for accurate extraction from complex layouts
- **Parallel Processing:** Generates all summary types simultaneously for efficiency
- **Audience Targeting:** Each summary type uses specific prompts and constraints
- **Quality Control:** Structured prompts ensure consistent, high-quality outputs
- **Flexible Output:** Returns all summaries in a single API response

**Customization Guide**

*Adding Custom Summary Types:* Create new summary nodes with specialized prompts:

```
// Example: Policy Brief Summary
{
  "content": "Create a policy brief (max 300 words) highlighting: policy-relevant findings, recommendations for policymakers, societal implications, implementation considerations. Paper content: {{ $json.content }}"
}
```

*Modifying Summary Lengths:* Adjust the word limits in each summary prompt:

```
// In the Executive Summary node:
"max 500 words"      // Change to your desired length

// In the Tweet Summary node:
"max 280 characters" // Twitter limit
```

*Adding Language Translation:* Extend the workflow with translation nodes:

```
// After summary generation, add:
"Translate this summary to Spanish: {{ $json.executiveSummary }}"
```

*Implementing Caching:* Add a caching layer to avoid reprocessing:

- Use Redis or n8n's static data
- Cache based on the paper's DOI or a URL hash
- Set an appropriate TTL for cache entries

*Batch Processing Enhancement:* For multiple papers, modify the workflow to:

- Accept an array of paper URLs
- Use a SplitInBatches node for processing
- Aggregate results before responding

**Summary Types**

- **Executive Summary:** 1-page overview for decision makers
- **Technical Summary:** Detailed summary for researchers
- **Lay Summary:** Plain language for a general audience
- **Social Media:** Tweet-sized key findings

**Key Features**

- Parse complex academic PDFs with LLM enhancement
- Generate multiple summary types simultaneously
- Extract and highlight key methodology and findings
- Create audience-appropriate language and depth
- API-driven for easy integration

**Advanced Features**

*Quality Metrics:* Add a quality assessment node:

```
// Evaluate summary quality
const qualityChecks = {
  hasKeyFindings: summary.includes('findings'),
  appropriateLength: summary.length <= maxLength,
  noJargon: !technicalTerms.some(term => summary.includes(term))
};
```

*Template Variations:* Create field-specific templates:

- Medical research: include clinical implications
- Engineering papers: focus on technical specifications
- Social sciences: emphasize methodology and limitations
by Rahul Joshi
**Description:** Recover missed opportunities automatically with this n8n automation template. The workflow connects to Calendly, identifies no-show meetings, and instantly sends personalized Telegram messages encouraging leads to reschedule. It then notifies the assigned sales representative via email, ensuring timely human follow-up.

Perfect for sales teams, consultants, and customer success managers who want to minimize no-shows, improve conversion rates, and keep pipelines warm, all without manual tracking.

**What This Template Does (Step-by-Step)**

1. **Runs Every Hour** – Automatically triggers every hour to check your Calendly events for recently missed meetings.
2. **Fetch Active Calendly Appointments** – Retrieves all scheduled events from Calendly using your user URI and event metadata.
3. **Filter for No-Shows (30+ Minutes Past)** – Uses a built-in logic block to detect appointments that ended more than 30 minutes ago and were not attended.
4. **Check Lead Intent** – Processes only leads tagged as "High Intent" in metadata, focusing recovery efforts on qualified prospects.
5. **Send Telegram Message to Lead** – Sends a personalized message to the lead's Telegram ID, including a direct reschedule link and a friendly tone from your sales team.
6. **Notify Assigned Sales Rep via Email** – Alerts the relevant rep (from metadata) that the lead missed a meeting and has received an automated Telegram follow-up. Includes the contact name, status update, and meeting link for manual re-engagement.
7. **Continuous Follow-Up Automation** – Repeats hourly, ensuring no missed appointment goes unnoticed, even outside working hours.
**Key Features**

- Smart detection of no-shows via the Calendly API
- Telegram message automation with personalization
- Sales rep email alerts with complete context
- Filters by the "High Intent" tag to focus efforts
- Easy setup with environment variables and credentials

**Use Cases**

- Automatically re-engage missed sales calls
- Reduce no-show rates for Calendly meetings
- Keep your sales pipeline active and responsive
- Notify sales reps in real time about recovery actions

**Required Integrations**

- **Calendly API** – to fetch scheduled events and meeting details
- **Telegram API** – to send automated reschedule messages
- **SMTP or Gmail** – to alert the assigned sales representative

**Why Use This Template?**

- Saves hours of manual follow-up effort
- Boosts the reschedule rate for missed meetings
- Keeps high-value leads warm and engaged
- Ensures your sales reps never miss a no-show
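The no-show filter step can be approximated in an n8n Code node along these lines. This is a sketch under stated assumptions: the field names `end_time`, `attended`, and `metadata.intent` are illustrative and should be mapped to the actual Calendly event and invitee-metadata fields in your account.

```javascript
// Hypothetical "Filter for No-Shows (30+ Minutes Past)" logic: keep events
// that ended 30+ minutes ago, were not attended, and are tagged High Intent.
const THIRTY_MINUTES_MS = 30 * 60 * 1000;

function findRecoverableNoShows(events, now = Date.now()) {
  return events.filter(ev => {
    const endedAt = new Date(ev.end_time).getTime();
    const pastCutoff = now - endedAt > THIRTY_MINUTES_MS; // ended 30+ min ago
    const noShow = ev.attended === false;                 // lead did not join
    const highIntent = ev.metadata?.intent === 'High Intent';
    return pastCutoff && noShow && highIntent;
  });
}
```

Each event that survives the filter would then flow on to the Telegram message and sales-rep email nodes.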
by Yaron Been
**Generate 3D Models & Textures from Images with Hunyuan3D AI**

This workflow connects n8n to the Replicate API to generate 3D-like outputs using the `ndreca/hunyuan3d-2.1-test` model. It handles everything: sending the request, waiting for processing, checking status, and returning results.

**Section 1: Trigger & Setup**

1. **On Clicking "Execute"**
   - *What it does:* Starts the workflow manually in n8n.
   - *Why it's useful:* Great for testing or one-off runs before automation.
2. **Set API Key**
   - *What it does:* Stores your Replicate API key.
   - *Why it's useful:* Keeps authentication secure and reusable across HTTP nodes.

*Beginner benefit:* No coding needed; just paste your API key once. Easy to test: press Execute and you're live.

**Section 2: Send Job to Replicate**

3. **Create Prediction (HTTP Request)**
   - *What it does:* Sends a POST request to Replicate's API with the model version (`70d0d816...ae75f`), the input image URL, and parameters such as `steps`, `seed`, `generate_texture`, and `remove_background`.
   - *Why it's useful:* This kicks off the AI generation job on Replicate's servers.
4. **Extract Prediction ID (Code)**
   - *What it does:* Grabs the prediction ID from the API response and builds a status-check URL.
   - *Why it's useful:* Every job has a unique ID, which lets us track progress later.

*Beginner benefit:* You don't need to worry about JSON parsing; the workflow extracts the ID automatically, and everything is reusable if you run multiple generations.

**Section 3: Poll Until Complete**

5. **Wait (2s)**
   - *What it does:* Pauses for 2 seconds before checking the job status.
   - *Why it's useful:* Prevents spamming the API with too many requests.
6. **Check Prediction Status (HTTP Request)**
   - *What it does:* Sends a GET request to see whether the job is finished.
7. **Check If Complete (IF Node)**
   - *What it does:* If status = `succeeded`, process the results; otherwise loop back to Wait and check again.

*Beginner benefit:* Handles the waiting logic for you; no manual refreshing needed.
It keeps looping until the AI job is really done.

**Section 4: Process the Result**

8. **Process Result (Code)**
   - *What it does:* Extracts the status, the output (final generated file/URL), metrics (performance stats), timestamps (`created_at`, `completed_at`), and model info.
   - *Why it's useful:* Packages the response neatly for storage, email, or sending elsewhere.

*Beginner benefit:* You get clean, structured data ready for saving or sending, and it is easy to extend by pushing the output to Google Drive, Notion, or Slack.

**Workflow Overview**

| Section | What happens | Key Nodes | Benefit |
| --- | --- | --- | --- |
| Trigger & Setup | Start workflow + set API key | Manual Trigger, Set | Easy one-click start |
| Send Job | Send input & get prediction ID | Create Prediction, Extract ID | Launches AI generation |
| Poll Until Complete | Waits + checks status until ready | Wait, Check Status, IF | Automated loop, no manual refresh |
| Process Result | Collects output & metrics | Process Result | Clean result for next steps |

**Overall Benefits**

- Fully automates Replicate model runs
- Handles waiting, retries, and completion checks
- Clean final output with status + metrics
- Beginner-friendly: just add an API key and an input image
- Extensible: connect results to Google Sheets, Gmail, Slack, or databases

In short: this is a no-code AI image-to-3D content generator powered by Replicate and automated by n8n.
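The "Extract Prediction ID" step can be sketched as a small Code-node helper. Replicate's create-prediction response includes an `id` and a `urls.get` field pointing at the status endpoint; the fallback URL construction below covers responses where `urls` is missing.

```javascript
// Sketch of extracting the prediction ID and building the status-check URL
// from Replicate's create-prediction response.
function extractPredictionId(response) {
  const id = response.id;
  if (!id) throw new Error('No prediction ID in Replicate response');
  return {
    predictionId: id,
    // Prefer the status URL Replicate returns; otherwise build it from the ID.
    statusUrl: response.urls?.get ?? `https://api.replicate.com/v1/predictions/${id}`,
  };
}
```

The returned `statusUrl` is what the Wait / Check Prediction Status loop polls until the `status` field becomes `succeeded`.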