by Daniel Lianes
# Auto-generate SEO blog posts from Google Trends to WordPress

This workflow provides complete blog automation from trend detection to publication. It eliminates manual content research, writing, and publishing by using AI agents, Google Trends analysis, and WordPress integration for hands-free blog management that scales your content strategy.

## Overview

This workflow automatically handles the entire blog creation pipeline using advanced AI coordination and SEO optimization. It manages trend discovery, topic selection, content research, writing, HTML formatting, and WordPress publishing with built-in internal linking and comprehensive performance tracking.

**Core Function:** Autonomous blog generation that transforms trending Google searches into SEO-optimized WordPress posts with zero manual intervention, maintaining consistent publishing schedules while capturing emerging traffic opportunities.

## Key Capabilities

- **Automated trend detection** - Discovers emerging topics using Google Trends via SerpAPI before they become saturated
- **AI-powered topic selection** - Intelligent evaluation of search volume, user intent, and competition levels
- **Content research automation** - Perplexity API integration for reliable source gathering and fact verification
- **SEO-optimized writing** - AI agents create keyword-focused, engaging content with proper structure
- **Internal linking intelligence** - Automatic cross-linking with existing posts for enhanced SEO authority
- **WordPress publishing** - Direct publication with semantic HTML formatting and complete metadata
- **Performance tracking** - Comprehensive logging in Google Sheets for analytics and optimization

## Tools Used

- **n8n**: Workflow orchestration platform managing the entire automation pipeline
- **SerpAPI**: Google Trends data access and trend analysis for keyword discovery
- **Perplexity API**: Reliable content research and fact-checking for authoritative sources
- **OpenRouter**: Gateway to multiple AI models for specialized content generation tasks
- **WordPress API**: Direct publishing integration with full metadata and formatting control
- **Google Sheets**: Performance logging, internal link database, and analytics tracking
- **Built-in SEO Logic**: Automated slug generation, meta descriptions, and HTML optimization

## How to Install

1. **Import the Workflow**: Download the JSON file and import it into your n8n instance
2. **Configure API Access**: Set up SerpAPI, Perplexity, and OpenRouter credentials in n8n
3. **WordPress Integration**: Add WordPress site credentials and enable REST API access
4. **Google Sheets Setup**: Create the tracking spreadsheet using the provided template structure
5. **Schedule Configuration**: Set the desired publication frequency (daily, weekly, or custom)
6. **Content Customization**: Adjust AI prompts and SEO parameters for your niche
7. **Test Execution**: Run a manual test to verify all integrations work correctly

## Use Cases

- **Content Marketing Automation**: Maintain consistent blog publishing without manual content creation
- **SEO Traffic Capture**: Generate optimized posts targeting trending keywords before the competition
- **Authority Building**: Regular publication on emerging topics to establish thought leadership
- **Organic Growth Strategy**: Systematic content creation that builds domain authority over time
- **Content Calendar Management**: Automated scheduling eliminates manual planning and publishing
- **Internal Link Building**: Systematic SEO improvement through an intelligent cross-linking strategy

## Setup requirements

- **SerpAPI account**: For Google Trends data access and trend monitoring capabilities
- **Perplexity API**: Professional content research and reliable source verification
- **OpenRouter account**: Access to GPT-4.1 and other advanced AI models for content generation
- **WordPress site**: With REST API enabled and proper user permissions configured
- **Google Sheets**: For comprehensive performance tracking and internal link database management

Total setup time: 15-20 minutes once all API accounts are properly configured.

## How to customize

- **Content Focus**: Modify trend detection parameters and keyword filters to target your specific niche. Adjust topic selection criteria based on your content strategy and audience interests.
- **Writing Style**: Customize AI writing prompts to match your brand voice, adjust article length requirements, modify tone and complexity, or update the HTML template structure for consistent formatting.
- **SEO Strategy**: Update internal linking logic for your site structure, modify meta description templates, adjust keyword density parameters, or customize slug generation patterns.
- **Publishing Control**: Change automation frequency, add human review checkpoints, integrate with social media platforms, or connect to email marketing systems for content distribution.
- **Performance Optimization**: Adjust Google Sheets tracking columns, modify trend analysis parameters, or integrate with analytics platforms for deeper insights.

## Google Sheets Template

The workflow includes a pre-configured Google Sheets template for tracking:

- Publication dates and performance metrics
- Target keywords and search volume data
- Internal link mapping and SEO improvements
- Content performance analytics
- WordPress URLs and metadata tracking

Template Structure: Date Published | Title | Slug | Target Keyword | WordPress URL | Internal Links Added | Traffic Data

## Was this helpful? Let me know!

I truly hope this automated blog system helps scale your content strategy. Your feedback helps me create better automation resources for the community.

## Want to take content automation further?

If you're looking to optimize your content strategy or need custom automation solutions:

- **Advisory (Discovery Call)**: Have content goals but unsure how automation can help? Let's explore how AI-powered workflows can transform your content pipeline and drive organic growth. Schedule a Discovery Call
- **Custom Content Automation**: Need a tailored solution for your specific content workflow, CMS integration, or multi-platform publishing strategy? Let's build the perfect automation for your needs. Book Content Automation Consulting

## Stay Updated on Automation

For more content automation strategies, AI workflow tips, and business automation insights: Follow me on LinkedIn

#n8n #automation #wordpress #seo #contentmarketing #ai #blogging #googletrends #serpapi #perplexity #workflow #contentautomation #seooptimization #aiwriting #blogautomation #digitalmarketing #contentcreation #organicgrowth #inboundmarketing #productivity
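As a rough illustration of the "Built-in SEO Logic" mentioned above (automated slug generation and meta descriptions), here is a minimal sketch of what a Code node for that step could look like. The function names and the 155-character meta limit are assumptions for illustration, not the template's actual implementation.

```javascript
// Minimal sketch of slug + meta-description generation, roughly the kind of logic
// the built-in SEO step performs. Names and limits are illustrative assumptions.

function makeSlug(title) {
  return title
    .toLowerCase()
    .normalize('NFD')                      // split off accents
    .replace(/[\u0300-\u036f]/g, '')       // strip accent marks
    .replace(/[^a-z0-9\s-]/g, '')          // drop punctuation
    .trim()
    .replace(/\s+/g, '-')                  // spaces -> hyphens
    .replace(/-+/g, '-');                  // collapse repeats
}

function makeMetaDescription(intro, keyword, maxLen = 155) {
  const base = intro.includes(keyword) ? intro : `${keyword}: ${intro}`;
  return base.length <= maxLen ? base : base.slice(0, maxLen - 1).trimEnd() + '…';
}

// Example usage
const title = 'Why Solar Batteries Are Trending in 2025';
console.log(makeSlug(title)); // "why-solar-batteries-are-trending-in-2025"
console.log(makeMetaDescription('Solar batteries are suddenly everywhere.', 'solar batteries'));
```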
by Ruben AI
# LinkedIn DM Automation

## Overview
Effortlessly scale personalized LinkedIn outreach using a no-code automation stack. This template provides a powerful, user-friendly system for harvesting leads from LinkedIn posts and managing outreach—all within Airtable and n8n.

## Features & Highlights
- **Actionable Input:** Simply enter a LinkedIn post URL to kickstart the engine—no browser scraping or manual work needed.
- **Lead Harvesting:** Automatically scrape commenters, likers, and profile data using Unipile’s API access.
- **Qualification Hub:** Easily review and qualify leads right inside Airtable using custom filters and statuses.
- **Automated Campaign Flow:** n8n handles the sequence—from sending connection requests (adhering to LinkedIn limits) to delivering personalized DMs upon acceptance.
- **Unified Dashboard:** Monitor campaign progress, connection status, and messaging performance in real time.
- **Flexible & Reusable:** Fully customizable for your own messaging, filters, or UD campaigns—clone, adapt, and deploy.

## Why Use This Template?
- **Zero-code friendly:** Ideal for entrepreneurs, sales professionals, and growth teams looking for streamlined, scalable outreach.
- **Transparent and compliant:** Built with the Airtable UI and compliant API integration—no reliance on browser automation or unofficial methods.
- **Rapid Deployment:** Clone and launch your automation in under 30 minutes—no dev setup required.

## Setup Instructions
1. Import the template into your n8n workspace.
2. Connect your Airtable and Unipile credentials.
3. Configure the LinkedIn post input, filters, and DM templates in Airtable.
4. Run the workflow and monitor results directly from Airtable or n8n.

## Use Cases
- Capture inbound leads from your viral LinkedIn posts.
- Qualify and nurture prospects seamlessly without manual follow-ups.
- Scale outreach with precision and personalization.

## YouTube Explanation
You can access the video explanation of how to use the workflow: Explanation Video
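To make the "personalized DMs" step concrete, here is a minimal sketch of how a message could be assembled from Airtable lead fields before it is handed to the sending step. The field names (First Name, Post Topic) are assumptions, not the template's actual schema, and the wording is only an example.

```javascript
// Minimal sketch of DM personalization from an Airtable-style lead record.
// Field names are illustrative assumptions, not the template's schema.

function buildDm(lead) {
  const firstName = lead['First Name'] || 'there';
  const topic = lead['Post Topic'] || 'my recent post';
  return (
    `Hi ${firstName}, thanks for engaging with ${topic}! ` +
    `I put together a short resource on the same theme - happy to share it if useful.`
  );
}

// Example usage with a record shaped like an Airtable row
const lead = { 'First Name': 'Alex', 'Company': 'Acme', 'Post Topic': 'my LinkedIn automation post' };
console.log(buildDm(lead));
```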
by Trung Tran
# Free PDF Generator in n8n – No External Libraries or Paid Services

> A 100% free n8n workflow for generating professionally formatted PDFs without relying on external libraries or paid converters. It uses OpenAI to create Markdown content, Google Docs to format and convert to PDF, and integrates with Google Drive and Slack for archiving and sharing, ideal for reports, BRDs, proposals, or any document you need directly inside n8n.

Watch the demo video below:

## Who’s it for
- Teams that need auto-generated documents (reports, guides, checklists) in PDF format.
- Operations or enablement teams who want files archived in Google Drive and shared in Slack automatically.
- Anyone experimenting with LLM-powered document generation integrated into business workflows.

## How it works / What it does
1. Manual trigger starts the workflow.
2. LLM generates a sample Markdown document (via OpenAI Chat Model).
3. Google Drive folder is configured for storage.
4. Google Doc is created from the generated Markdown content.
5. Document is exported to PDF using Google Drive. (Sample PDF generated from comprehensive markdown)
6. PDF is archived in a designated Drive folder.
7. Archived PDF is downloaded for sharing.
8. Slack message is sent with the PDF attached.

## How to set up
1. Add nodes in sequence:
   - Manual Trigger
   - OpenAI Chat Model (prompt to generate sample Markdown)
   - Set/Manual input for Google Drive folder ID(s)
   - HTTP Request or Google Drive Upload (convert to Google Docs)
   - Google Drive Download (PDF export)
   - Google Drive Upload (archive PDF)
   - Google Drive Download (fetch archived file)
   - Slack Upload (send message with attachment)
2. Configure credentials for OpenAI, Google Drive, and Slack.
3. Map output fields:
   - data.markdown → Google Docs creation
   - docId → PDF export
   - fileId → Slack upload
4. Test run to ensure the PDF is generated, archived, and posted to Slack.

## Requirements
**Credentials**:
- OpenAI API key (or compatible LLM provider)
- Google Drive (OAuth2) with read/write permissions
- Slack bot token with files:write permission

**Access**:
- Write access to target Google Drive folders
- Slack bot invited to the target channel

## How to customize the workflow
- **Change the prompt** in the OpenAI Chat Model to generate different types of content (reports, meeting notes, checklists).
- **Automate triggering**: Replace the Manual Trigger with Cron for scheduled document generation, or use a Webhook Trigger to run on-demand from external apps.
- **Modify storage logic**: Save both .md and .pdf versions in Google Drive. Use separate folders for drafts vs. final versions.
- **Enhance distribution**: Send PDFs to multiple Slack channels or via email. Integrate with project management tools for automated task creation.
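For readers who want to see what the "export to PDF" step does under the hood, here is a minimal sketch using the Google Drive v3 `files.export` endpoint (in the workflow this is handled by an HTTP Request or Google Drive node). The access token and document ID are placeholders you would supply from credentials and previous nodes.

```javascript
// Minimal sketch of exporting a Google Doc as PDF via the Drive v3 files.export endpoint.
// ACCESS_TOKEN and docId are placeholders, not values baked into the template.

const ACCESS_TOKEN = process.env.GOOGLE_ACCESS_TOKEN; // assumption: OAuth2 token available
const docId = 'YOUR_GOOGLE_DOC_ID';

async function exportDocAsPdf(fileId) {
  const url = `https://www.googleapis.com/drive/v3/files/${fileId}/export?mimeType=application/pdf`;
  const res = await fetch(url, { headers: { Authorization: `Bearer ${ACCESS_TOKEN}` } });
  if (!res.ok) throw new Error(`Export failed: ${res.status}`);
  return Buffer.from(await res.arrayBuffer()); // PDF bytes, ready to archive or post to Slack
}

exportDocAsPdf(docId).then((pdf) => console.log(`PDF size: ${pdf.length} bytes`));
```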
by Guillaume Duvernay
Go beyond basic AI-generated text and create articles that are well-researched, comprehensive, and credible. This template automates an advanced content creation process that mimics a professional writing team: it plans, researches, and then writes.

Instead of just giving an AI a topic, this workflow first uses an AI "planner" to break the topic down into logical sub-questions. Then, it deploys an AI "researcher" powered by Linkup to search the web for relevant insights and sources for each question. Finally, this complete, sourced research brief is handed to a powerful AI "writer" to compose a high-quality article, complete with hyperlinks back to the original sources.

## Who is this for?
- **Content marketers & SEO specialists:** Scale the production of well-researched, link-rich articles that are built for authority and performance.
- **Bloggers & thought leaders:** Quickly generate high-quality first drafts on any topic, complete with a list of sources for easy fact-checking and validation.
- **Marketing agencies:** Dramatically improve your content turnaround time by automating the entire research and first-draft process for clients.

## What problem does this solve?
- **Adds credibility with sources:** Solves one of the biggest challenges of AI content by automatically finding and preparing to include hyperlinks to the web sources used in the research, just as a human writer would.
- **Ensures comprehensive coverage:** The AI-powered "topic breakdown" step prevents superficial content by creating a logical structure for the article and ensuring all key aspects of a topic are researched.
- **Improves content quality and accuracy:** The "research-first" approach provides the final AI writer with a rich brief of specific, up-to-date information, leading to more detailed and factually grounded articles than a simple prompt ever could.
- **Automates the entire writing workflow:** This isn't just an AI writer; it's an end-to-end system that automates the planning, research, and drafting process, saving you hours of manual work.

## How it works
This workflow orchestrates a multi-step "Plan, Research, Write" process:
1. **Plan (Decomposition):** You provide an article title and guidelines via the built-in form. An initial AI call acts as a "planner," breaking down the main topic into an array of logical sub-questions.
2. **Research (Web Search):** The workflow then loops through each of these sub-questions. For each one, it uses Linkup to perform a targeted web search, gathering multiple relevant insights and their source URLs.
3. **Consolidate (Brief Creation):** All the sourced insights from the research phase are compiled into a single, comprehensive research brief.
4. **Write (Final Generation):** This complete, sourced brief is handed to a final, powerful AI writer (e.g., GPT-5). Its instructions are clear: write a high-quality article based only on the provided research and integrate the source links as hyperlinks where appropriate.

## Setup
1. **Connect your Linkup account:** In the Query Linkup for insights (HTTP Request) node, add your Linkup API key. We recommend creating a "Generic Credential" of type "Bearer Token" for this. Linkup's free plan is very generous and includes credits for ~1000 searches per month.
2. **Connect your AI provider:** Connect your AI provider (e.g., OpenAI) credentials to the two Language Model nodes. For cost-efficiency, we recommend a smaller, faster model for Generate research questions and a more powerful, creative model for Generate the AI output.
3. **Activate the workflow:** Toggle the workflow to "Active" and use the built-in form to enter an article title and guidelines to generate your first draft!

## Taking it further
- **Control your sources:** For more brand-aligned or niche content, you can restrict the web search to specific websites by adding site:example.com OR site:anothersite.com to the query in the **Query Linkup for insights** node.
- **Automate publishing:** Connect the final **Article result** node to a **Webflow** or **WordPress** node to automatically create a draft post in your CMS.
- **Generate content in bulk:** Replace the **Form Trigger** with an **Airtable** or **Google Sheet** trigger to automatically generate a whole batch of articles from your content calendar.
- **Customize the writing style:** Tweak the system prompt in the final **Generate the AI output** node to match your brand's specific tone of voice, add SEO keywords, or include calls-to-action.
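As an illustration of how the planner's sub-questions could be turned into restricted search queries (the site: trick from "Control your sources"), here is a minimal sketch. The question list and allowed sites are example values, and this is not the template's actual node code.

```javascript
// Minimal sketch: build one web-search query per sub-question, optionally
// restricted to specific sites. Inputs are illustrative assumptions.

const allowedSites = ['example.com', 'anothersite.com'];
const subQuestions = [
  'What are the main benefits of heat pumps?',
  'How much does a heat pump installation cost in 2025?',
];

function buildQuery(question, sites = []) {
  const siteFilter = sites.length ? ' ' + sites.map((s) => `site:${s}`).join(' OR ') : '';
  return `${question}${siteFilter}`;
}

const queries = subQuestions.map((q) => buildQuery(q, allowedSites));
console.log(queries); // one query per sub-question, each ending with the site: filter
```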
by Trung Tran
# Beginner’s Tutorial: Manage Google Cloud Storage Buckets and Objects with n8n

Watch the demo video below:

## Who’s it for
- Beginners who want to learn how to automate Google Cloud Storage (GCS) operations with n8n.
- Developers who want to combine AI image generation with cloud storage management.
- Anyone looking for a simple introduction to working with Buckets and Objects in GCS.

## How it works / What it does
This workflow demonstrates end-to-end usage of Google Cloud Storage with AI integration:
1. **Trigger:** Start manually by clicking Execute Workflow.
2. **Edit Fields:** Provide input values (e.g., bucket name or image description).
3. **List Buckets:** Retrieve all existing buckets in the project (branch: view only).
4. **Create Bucket:** If needed, create a new bucket to store objects.
5. **Prompt Generation Agent:** Use an AI model to generate a creative text prompt.
6. **Generate Image:** Convert the AI-generated prompt into an image.
7. **Upload Object:** Store the generated image as an object in the selected bucket.
8. **Delete Object:** Clean up by removing the uploaded object if required.

This shows the full lifecycle: Bucket → Object (Create/Upload/Delete) combined with AI image generation.

## How to set up
1. **Trigger the workflow:** Use the When clicking Execute workflow node to start manually.
2. **Provide inputs:** In Edit Fields, specify details such as the bucket name or description text for the image.
3. **List buckets:** Use the List Buckets node to see what exists.
4. **Create a bucket:** Use Create Bucket if you want a new storage bucket.
5. **Generate prompt & image:** The Prompt Generation Agent uses an OpenAI Chat Model to create an image prompt, and the Generate an Image node turns this prompt into an actual image.
6. **Upload to bucket:** Use Create Object to upload the generated image into your GCS bucket.
7. **Delete object (optional):** Use Delete Object to remove the file from the bucket as a cleanup step.

## Requirements
- An active Google Cloud account with the Cloud Storage API enabled.
- A Service Account Key (JSON) credential added in n8n for GCS.
- An OpenAI API Key configured in n8n for the prompt and image generation nodes.
- Basic familiarity with running workflows in n8n.

## How to customize the workflow
- **Different object types:** Instead of images, upload PDFs, logs, or text files.
- **Automatic cleanup:** Skip the delete step if you want objects to persist.
- **Schedule trigger:** Replace manual execution with a weekly or daily schedule.
- **Dynamic prompts:** Accept user input from a form or webhook to generate images.
- **Multi-bucket management:** Extend the logic to manage multiple buckets across projects.
- **Notifications:** Add a Slack/Email step after upload to notify your team with the object URL.

✅ By the end of this tutorial, you’ll understand how to:
- Work with Buckets (list, create).
- Work with Objects (upload, delete).
- Integrate AI image generation with Google Cloud Storage.
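To make the Bucket → Object relationship more concrete, here is a minimal sketch of the "Create Object" upload against the Cloud Storage JSON API, which the n8n Google Cloud Storage node performs for you. The token, bucket, and object names are placeholders, not values from the tutorial.

```javascript
// Minimal sketch of uploading an object with the Cloud Storage JSON API
// (simple media upload). Token, bucket, and object name are placeholders.

const ACCESS_TOKEN = process.env.GCP_ACCESS_TOKEN; // assumption: OAuth2/service-account token
const bucket = 'my-demo-bucket';
const objectName = 'generated-image.png';

async function uploadObject(imageBuffer) {
  const url =
    `https://storage.googleapis.com/upload/storage/v1/b/${bucket}/o` +
    `?uploadType=media&name=${encodeURIComponent(objectName)}`;
  const res = await fetch(url, {
    method: 'POST',
    headers: { Authorization: `Bearer ${ACCESS_TOKEN}`, 'Content-Type': 'image/png' },
    body: imageBuffer, // e.g. the PNG bytes produced by the image-generation step
  });
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
  return res.json(); // object metadata, including its mediaLink
}
```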
by Simeon Penev
## Who’s it for
Growth, marketing, sales, and founder teams that want a decision-ready Ideal Customer Profile (ICP)—grounded in their own site content.

## How it works / What it does
1. **On form submission** collects **Website URL** and **Business Name** and redirects to the Google Drive folder after the final node.
2. **Crawl and Scrape the Website Content** crawls and scrapes **20 pages** from the website.
3. **ICP Creator** builds a **Markdown ICP** with:
   - A) Executive Summary
   - B) One-Pager ICP
   - C) Tiering & Lead Scoring
   - D) Demand Gen & ABM Plays
   - E) Evidence Log
   - F) Section Confidence
   plus facts vs. inferences, confidence scores, and tables.
4. **Markdown to Google Doc** converts the Markdown into Google Docs batchUpdate requests, which are then used by **Update a document** to fill the empty doc.
5. **Create a document** + **Update a document** generate **“ICP for <Business Name>”** in your Drive folder and apply formatting.

## How to set up
1. Add credentials: Firecrawl (Authorization header), OpenAI (Chat), Google Docs OAuth2.
2. Replace placeholders: {{API_KEY}}, {{google_drive_folder_id}}, {{google_drive_folder_url}}.
3. Publish and open the Form URL to test.

## Requirements
Firecrawl API key • OpenAI API key • Google account with access to the target Drive folder.

## Resources
- Google OAuth2 Credentials Setup - https://docs.n8n.io/integrations/builtin/credentials/google/oauth-generic/
- OpenAI API key - https://docs.n8n.io/integrations/builtin/credentials/openai/
- Firecrawl API key - https://take.ms/lGcUp
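Since the "Markdown to Google Doc" step may be the least familiar part, here is a minimal sketch of how Markdown lines could be turned into Google Docs batchUpdate requests. It only handles headings and plain paragraphs and is an assumption about the general approach, not the template's actual node code.

```javascript
// Minimal sketch: convert Markdown lines into Docs API batchUpdate requests
// (insertText + updateParagraphStyle). Handles headings and paragraphs only.

function markdownToDocsRequests(markdown) {
  const requests = [];
  let index = 1; // Docs body content starts at index 1
  for (const line of markdown.split('\n')) {
    const heading = line.match(/^(#{1,3})\s+(.*)$/);
    const text = (heading ? heading[2] : line) + '\n';
    requests.push({ insertText: { location: { index }, text } });
    if (heading) {
      requests.push({
        updateParagraphStyle: {
          range: { startIndex: index, endIndex: index + text.length },
          paragraphStyle: { namedStyleType: `HEADING_${heading[1].length}` },
          fields: 'namedStyleType',
        },
      });
    }
    index += text.length; // next insertion point, assuming requests run in order
  }
  return { requests };
}

console.log(JSON.stringify(markdownToDocsRequests('# Executive Summary\nKey facts here.'), null, 2));
```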
by Easy8.ai
# Auto-Generate SEO FAQ Answers from Google Sheets with OpenAI

## Intro/Overview
This workflow automates the process of generating SEO-optimized FAQ answers using AI, pulling questions from a Google Sheet and writing answers back into the same sheet. It’s ideal for content marketers, SEO specialists, and digital teams looking to scale FAQ content generation with minimal manual input. By combining the power of Google Sheets, AI, and WordPress, the workflow transforms raw questions into structured, keyword-targeted answers tailored for specific audiences — ready for use on landing pages, blogs, or help centers — and automatically publishes them as WordPress posts.

## How it works
- **Schedule Trigger**: Executes the workflow at a set interval to check for new or unprocessed questions in the Google Sheet.
- **Get Questions from Sheet**: Reads from a specific Google Sheet, targeting columns for:
  - Question (FAQ prompt)
  - KW (target SEO keyword)
  - Audience (intended reader)
  - Article (desired WordPress post title)
- **Filter**: Ensures only rows without an existing answer are processed (i.e., empty "Answer" column).
- **Generate FAQ Answer**: Passes the question, keyword, and audience to the OpenAI Chat Model using a structured prompt to generate:
  - A concise TL;DR-style summary
  - A detailed, SEO-optimized markdown-formatted answer
- **OpenAI Chat Model**: Utilizes GPT-4 Turbo with a controlled temperature (0.7) and token limit (1000) to produce structured, on-brand, keyword-optimized content.
- **Parse FAQ Answer**: Extracts and formats the AI response into separate fields for writing back to the sheet.
- **Update Sheet with Answer**: Writes the AI-generated answer into the Answer column of the same row in the source Google Sheet.
- **WordPress Node**: Publishes each generated answer as a new WordPress post:
  - Uses the “Create Post” operation
  - Title: taken from the Article column in the sheet
  - Content: uses the detailed AI-generated answer
  - Requires valid WordPress credentials (REST API / Application Password)

## How to Use
1. **Importing the Workflow**: Download or import the workflow JSON into your n8n instance.
2. **Credential Setup**: Connect your Google Sheets credentials, add your OpenAI API Key in the relevant node, and connect your WordPress credentials for content publishing.
3. **Node Assignment**: Update the Google Sheet ID and the sheet range (ensure it includes all relevant columns).
4. **Timezone & Schedule**: Adjust the Schedule Trigger node to match your preferred time and frequency (e.g., every weekday at 9 AM).
5. **Testing Guidance**: Add a few sample FAQ entries to your sheet, then run the workflow manually to verify prompt quality, answer accuracy, proper sheet updates, and successful WordPress post creation.

## Example Use Cases
- Marketing teams generating bulk FAQ content for landing pages
- SEO professionals creating keyword-optimized responses for user queries
- Agencies producing personalized FAQ sections for multiple client niches
- SaaS companies automating knowledge base content with targeted messaging
- Content teams publishing AI-generated FAQs directly to WordPress blogs

## Requirements
- ✅ Google Account with access to the target Google Sheet
- ✅ OpenAI API Key (GPT-4 Turbo or equivalent)
- ✅ WordPress account with REST API or Application Password access
- ✅ Google Sheet with the following columns:
  - Question: the FAQ prompt
  - KW: target keyword for SEO
  - Audience: intended reader persona
  - Article: desired WordPress post title
  - Answer: output column (leave empty initially)

## Customization (Optional Section)
- **Tone & Style**: Modify the system prompt to reflect your brand voice (e.g., friendly, expert, concise).
- **Model**: Use a different AI model (e.g., Gemini, Claude, or OpenAI GPT-4.1).
- **Output Format**: Adjust the markdown output to use different heading levels, bullet styles, or HTML if required.
- **Audience Logic**: Expand the input options to fine-tune responses for more specific demographics or buyer personas.
- **Multi-output Options**: Extend the workflow to post content to Notion, a CMS, or documentation platforms alongside Google Sheets and WordPress.

This automation accelerates content creation, automatically keeps your FAQ sections SEO-friendly, and publishes the results directly to WordPress — keeping your content pipeline running hands-free once deployed.
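To illustrate what the "Parse FAQ Answer" step might do, here is a minimal sketch that splits the model output into the TL;DR and detailed-answer fields. It assumes the prompt asks the model to label its parts with "TL;DR:" and "Answer:" markers, which is an illustrative convention rather than the template's exact prompt contract.

```javascript
// Minimal sketch of splitting the model response into tldr and answer fields.
// The "TL;DR:" / "Answer:" markers are an assumed prompt convention.

function parseFaqAnswer(raw) {
  const tldrMatch = raw.match(/TL;DR:\s*([\s\S]*?)(?:\n+Answer:|$)/i);
  const answerMatch = raw.match(/Answer:\s*([\s\S]*)$/i);
  return {
    tldr: tldrMatch ? tldrMatch[1].trim() : '',
    answer: answerMatch ? answerMatch[1].trim() : raw.trim(),
  };
}

const sample = 'TL;DR: Use alt text on every image.\n\nAnswer: ## Why alt text matters\nSearch engines rely on it.';
console.log(parseFaqAnswer(sample)); // { tldr: '...', answer: '## Why alt text matters...' }
```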
by Oneclick AI Squad
This n8n workflow automates task creation and scheduled reminders for users via a Telegram bot, ensuring timely notifications across multiple channels like email and Slack. It streamlines task management by validating inputs, storing tasks securely, and delivering reminders while updating statuses for seamless follow-up.

## Key Features
- Enables users to create tasks directly in chat via webhook integration.
- Triggers periodic checks for due tasks and processes them individually for accurate reminders.
- Routes reminders to preferred channels (Telegram, email, or Slack) based on user settings.
- Validates inputs, handles errors gracefully, and logs task data for persistence and auditing.

## Workflow Process
1. The Webhook Entry Point node receives task creation requests from users via chat (e.g., a Telegram bot), including details like user ID, task description, and channel preferences.
2. The Input Validation node checks for required fields (e.g., user ID, task description); if validation fails, it routes to the Error Response node.
3. The Save to Database node stores validated task data securely in a database (e.g., PostgreSQL, MongoDB, or MySQL) for persistence.
4. The Success Response node (part of Response Handlers) returns a confirmation message to the user in JSON format.
5. The Schedule Trigger node runs every 3 minutes to check for pending reminders (with a 5-minute buffer every hour to avoid duplicates).
6. The Fetch Due Tasks node queries the database for tasks due within the check window (e.g., reminders set for within 3 minutes).
7. The Tasks Check node verifies that fetched tasks exist and are eligible for processing.
8. The Split Items node processes each due task individually to handle them in parallel without conflicts.
9. The Channel Router node directs reminders to the appropriate channel based on task settings (e.g., email, Slack, or Telegram).
10. The Email Sender node sends HTML-formatted reminder emails with task details and setup instructions.
11. The Slack Sender node delivers Slack messages using webhooks, including task formatting and user mentions.
12. The Telegram Sender node sends Telegram messages via the bot API, including task ID, bot setup, and conversation starters.
13. The Update Task Status node marks the task as reminded in the database (e.g., updating status to "sent" with a timestamp).
14. The Workflow Complete! node finalizes the process, logging completion and preparing for the next cycle.

## Setup Instructions
1. Import the workflow into n8n and configure the Webhook Entry Point with your Telegram bot's webhook URL and authentication.
2. Set up database credentials in the Save to Database and Fetch Due Tasks nodes (e.g., connect to PostgreSQL or MongoDB).
3. Configure channel-specific credentials: Telegram bot token for Telegram Sender, email SMTP for Email Sender, and Slack webhook for Slack Sender.
4. Adjust the Schedule Trigger interval (e.g., every 3 minutes) and add any custom due-time logic in Fetch Due Tasks.
5. Test the workflow by sending a sample task creation request via the webhook and simulating due tasks to verify reminders and status updates.
6. Monitor executions in the n8n dashboard and tweak validation rules or response formats as needed for your use case.

## Prerequisites
- Telegram bot setup with webhook integration for task creation and messaging.
- Database service (e.g., PostgreSQL, MongoDB, or MySQL) for task storage and querying.
- Email service (e.g., SMTP provider) and Slack workspace for multi-channel reminders.
- n8n instance with webhooks and scheduling enabled.
- Basic API knowledge for bot configuration and channel routing.

## Modification Options
- Customize the Input Validation node to add fields like priority levels or recurring task flags.
- Extend the Channel Router to include additional channels (e.g., Microsoft Teams or SMS via Twilio).
- Modify the Schedule Trigger to use dynamic intervals based on task urgency or user preferences.
- Enhance the Update Task Status node to trigger follow-up actions, like archiving completed tasks.
- Adjust the Telegram Sender node for richer interactions, such as inline keyboards for task rescheduling.

Explore More AI Workflows: Get in touch with us for custom n8n automation!
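To clarify the due-window logic behind the Schedule Trigger and Fetch Due Tasks nodes, here is a minimal sketch of the check as it might run in a Code node: the schedule fires every 3 minutes and picks up unreminded tasks whose reminder time falls inside that window. The field names (remind_at, status) are illustrative assumptions.

```javascript
// Minimal sketch of selecting tasks that fall inside the 3-minute check window.
// Field names are illustrative, not the template's actual schema.

function getDueTasks(tasks, windowMinutes = 3, now = new Date()) {
  const windowStart = new Date(now.getTime() - windowMinutes * 60 * 1000);
  return tasks.filter((t) => {
    const due = new Date(t.remind_at);
    return t.status === 'pending' && due >= windowStart && due <= now;
  });
}

// Example usage
const tasks = [
  { id: 1, description: 'Send report', remind_at: new Date(Date.now() - 60 * 1000).toISOString(), status: 'pending' },
  { id: 2, description: 'Call client', remind_at: '2030-01-01T09:00:00Z', status: 'pending' },
];
console.log(getDueTasks(tasks)); // only task 1 is inside the 3-minute window
```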
by Jose Bossa
# Automated Social Media Video Posting

## 👥 Who's it for
This workflow is perfect for content creators, social media managers, and businesses who want to schedule and automatically post videos 📹 to multiple platforms (Instagram, LinkedIn, TikTok) without manual effort. Save hours every week! ⏰

## 🤖 What it does
It automatically reads scheduled posts from Google Sheets, checks if it's the right time to post, downloads videos from Google Drive, uploads them to multiple social media platforms simultaneously, updates the posting status, and sends you a Telegram notification with the results. Complete hands-free social media management! 🚀

## ⚙️ How it works
1. ⏰ Schedule Trigger – Runs twice daily at 9 AM and 9 PM
2. 📊 Google Sheets (Read) – Fetches posts with status "Listo para postear" (Ready to post)
3. ⚙️ Code Node – Converts the trigger time to a readable Spanish format (e.g., "16 de Octubre a las 9 am")
4. 🔍 If Condition – Checks if the current time matches the scheduled post time in the sheet
5. 📝 Format Drive Content – Extracts and organizes post data (Title, Copy, Video URL)
6. 🆔 Social Media Account IDs – Prepares account identifiers (can be customized for specific accounts)
7. 🎬 Upload a video – Posts the video simultaneously to Instagram, LinkedIn, and TikTok using the UploadPost API
8. 📊 Google Sheets (Update) – Changes the post status to "Posteado" (Posted) to avoid duplicates
9. 📱 Telegram Notification – Sends a detailed success report with URLs for each platform

## 📋 Requirements
- **Google Sheets** with your content calendar
- **Google Drive** to store your videos
- **UploadPost API account** (supports Instagram, LinkedIn, TikTok): Click here 👉 UploadPost
- **Telegram Bot** for notifications
- **n8n instance** with the required node packages

## Google Sheets Structure
Your spreadsheet should have these columns:
- Title – Post title
- Copy – Post caption/description
- Video Link – Google Drive download URL
- Status – Post status ("Listo para postear" or "Posteado")
- Fecha.Hora – Scheduled time (format: "16 de Octubre a las 9 am")
- row_number – Auto-generated row identifier

## 🛠️ How to set up
1. Create your Google Sheets calendar:
   - Set up columns as specified above
   - Use the status "Listo para postear" for scheduled posts
   - Format dates as "DD de Mes a las HH am/pm" (Spanish format)
2. Upload videos to Google Drive:
   - Get shareable download links (format: https://drive.google.com/uc?export=download&id=FILE_ID)
   - Ensure videos meet platform requirements (duration, format, size)
3. Configure the UploadPost API:
   - Create an account and get API credentials
   - Connect your Instagram, LinkedIn, and TikTok accounts
   - Add credentials to the "Upload a video" node
4. Set up Google Sheets credentials:
   - Connect OAuth2 for both read and update operations
   - Update documentId with your spreadsheet ID
   - Verify the sheet name matches (default: "Video")
5. Configure Telegram notifications:
   - Create a Telegram bot via @BotFather
   - Get your chat ID
   - Add credentials to the "Send a text message" node
6. Customize posting times:
   - Modify Schedule Trigger hours (default: 9 AM and 9 PM)
   - Times are in the Santiago, Chile timezone (America/Santiago)
7. Test the workflow:
   - Create a test entry with the current time
   - Run manually to verify all connections work
   - Check Telegram for the success notification
8. Activate the workflow ✅

## 🎨 How to customize
- **Change posting schedule:** Modify triggerAtHour values in the Schedule Trigger (add more times if needed)
- **Add more platforms:** Extend the platform array in the "Upload a video" node (supports YouTube, Facebook, Twitter)
- **Customize notification format:** Edit the Telegram message template to include/exclude information
- **Change timezone:** Modify the timeZone parameter in the Code node (default: "America/Santiago")
- **Filter by platform:** Add a filter node before upload to post only to specific platforms on certain days
- **Add approval workflow:** Insert an approval step before posting using Telegram or Slack
- **Multiple accounts per platform:** Modify the "Social Media Account IDs" node to specify different account IDs
- **Error handling:** Add error notification paths to alert you if uploads fail
- **Batch posting:** Remove the returnFirstMatch option to post multiple videos at once

## 💡 Pro Tips
- **Time format must match exactly** between the Schedule Trigger and Google Sheets for the workflow to trigger
- Videos should be optimized for each platform before upload (aspect ratio, length, file size)
- Test with a private account first before going live
- Keep video files under 100MB for best performance across platforms
- Use the row_number column to track and update specific posts
- The workflow runs twice daily, so schedule posts accordingly (9 AM or 9 PM slots)

## ⚠️ Important Notes
- Posts marked as "Posteado" won't be processed again (prevents duplicates)
- The video must be publicly accessible from the Google Drive link
- The UploadPost API has rate limits depending on your plan
- Telegram notifications show the success status and post URLs for each platform
- The Code node converts times to Spanish format - modify it if you need a different language/format
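Because the exact-match time comparison is the most fragile part of this workflow, here is a minimal sketch of the kind of conversion the Code node performs: turning the trigger time into the Spanish "16 de Octubre a las 9 am" string used in the Fecha.Hora column, in the America/Santiago timezone. This is an assumption about the approach, not the template's actual node code.

```javascript
// Minimal sketch: format a Date as "16 de Octubre a las 9 am" in America/Santiago.
// Casing and wording are normalized to match the sheet's expected format.

function toSheetFormat(date = new Date(), timeZone = 'America/Santiago') {
  const day = date.toLocaleString('es-CL', { day: 'numeric', timeZone });
  const month = date.toLocaleString('es-CL', { month: 'long', timeZone });
  let hour = Number(date.toLocaleString('en-US', { hour: '2-digit', hourCycle: 'h23', timeZone }));
  const suffix = hour >= 12 ? 'pm' : 'am';
  hour = hour % 12 === 0 ? 12 : hour % 12;
  const monthCap = month.charAt(0).toUpperCase() + month.slice(1);
  return `${day} de ${monthCap} a las ${hour} ${suffix}`;
}

console.log(toSheetFormat(new Date('2025-10-16T09:00:00-03:00'))); // "16 de Octubre a las 9 am"
```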
by Daniel
Transform any website into a structured knowledge repository with this intelligent crawler that extracts hyperlinks from the homepage, intelligently filters images and content pages, and aggregates full Markdown-formatted content—perfect for fueling AI agents or building comprehensive company dossiers without manual effort.

## 📋 What This Template Does
This advanced workflow acts as a lightweight web crawler: it scrapes the homepage to discover all internal links (mimicking a sitemap extraction), deduplicates and validates them, separates image assets from textual pages, then fetches and converts non-image page content to clean Markdown. Results are seamlessly appended to Google Sheets for easy analysis, export, or integration into vector databases.

- Automatically discovers and processes subpage links from the homepage
- Filters out duplicates and non-HTTP links for efficient crawling
- Converts scraped content to Markdown for AI-ready formatting
- Categorizes and stores images, links, and full content in a single sheet row per site

## 🔧 Prerequisites
- Google account with Sheets access for data storage
- n8n instance (cloud or self-hosted)
- Basic understanding of URLs and web links

## 🔑 Required Credentials
Google Sheets OAuth2 API setup:
1. Go to console.cloud.google.com → APIs & Services → Credentials
2. Click "Create Credentials" → Select "OAuth client ID" → Choose "Web application"
3. Add authorized redirect URIs: https://your-n8n-instance.com/rest/oauth2-credential/callback (replace with your n8n URL)
4. Download the client ID and secret, then add them to n8n as the "Google Sheets OAuth2 API" credential type
5. During setup, grant access to Google Sheets scopes (e.g., spreadsheets) and test the connection by listing a sheet

## ⚙️ Configuration Steps
1. Import the workflow JSON into your n8n instance
2. In the "Set Website" node, update the website_url value to your target site (e.g., https://example.com)
3. Assign your Google Sheets credential to the three "Add ... to Sheet" nodes
4. Update the documentId and sheetName in those nodes to your target spreadsheet ID and sheet name/ID
5. Ensure your sheet has the columns: "Website", "Links", "Scraped Content", "Images"
6. Activate the workflow and trigger it manually to test scraping

## 🎯 Use Cases
- Knowledge base creation: Crawl a company's site to aggregate all content into Sheets, then export to Notion or a vector DB for internal wikis
- AI agent training: Extract structured Markdown from industry sites to fine-tune LLMs on domain-specific data like legal docs or tech blogs
- Competitor intelligence: Build dossiers by crawling rival websites, separating assets and text for SEO audits or market analysis
- Content archiving: Preserve dynamic sites (e.g., news portals) as static knowledge dumps for compliance or historical research

## ⚠️ Troubleshooting
- No links extracted: Verify the homepage has `<a>` tags; test with a simple site like example.com and check the HTTP response in executions
- Sheet update fails: Confirm column names match exactly (case-sensitive) and the credential has edit permissions; try a new blank sheet
- Content truncated: Google Sheets limits cells to ~50k characters—adjust the .slice(0, 50000) in "Add Scraped Content to Sheet" or split into multiple rows
- Rate limiting errors: Add a "Wait" node after "Scrape Links" with a 1-2 s delay if the site blocks rapid requests
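For readers curious about the deduplicate/validate/separate step, here is a minimal sketch of link clean-up as it might run in a Code node: resolve relative URLs, drop duplicates and non-HTTP links, keep only same-site links, and split image assets from content pages. The image-extension list is an assumption, not the template's exact rule set.

```javascript
// Minimal sketch of link deduplication and image/page separation.
// Extension list and same-site rule are illustrative assumptions.

function categorizeLinks(homepageUrl, hrefs) {
  const origin = new URL(homepageUrl).origin;
  const unique = [...new Set(
    hrefs
      .map((h) => { try { return new URL(h, homepageUrl).href; } catch { return null; } })
      .filter((u) => u && u.startsWith(origin)) // valid, same-site, http(s) only
  )];
  const isImage = (u) => /\.(png|jpe?g|gif|webp|svg)(\?|$)/i.test(u);
  return {
    images: unique.filter(isImage),
    pages: unique.filter((u) => !isImage(u)),
  };
}

// Example usage: duplicates and mailto: links are dropped, the PNG is separated
console.log(categorizeLinks('https://example.com', ['/about', '/logo.png', 'mailto:hi@example.com', '/about']));
```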
by vinci-king-01
# Social Media Sentiment Analysis Dashboard with AI and Real-time Monitoring

## 🎯 Target Audience
- Social media managers and community managers
- Marketing teams monitoring brand reputation
- PR professionals tracking public sentiment
- Customer service teams identifying trending issues
- Business analysts measuring social media ROI
- Brand managers protecting brand reputation
- Product managers gathering user feedback

## 🚀 Problem Statement
Manual social media monitoring is overwhelming and often misses critical sentiment shifts or trending topics. This template solves the challenge of automatically collecting, analyzing, and visualizing social media sentiment data across multiple platforms to provide actionable insights for brand management and customer engagement.

## 🔧 How it Works
This workflow automatically monitors social media platforms using AI-powered sentiment analysis, processes mentions and conversations, and provides real-time insights through a comprehensive dashboard.

### Key Components
- **Scheduled Trigger** - Runs the workflow at specified intervals to maintain real-time monitoring
- **AI-Powered Sentiment Analysis** - Uses advanced NLP to analyze sentiment, emotions, and topics
- **Multi-Platform Integration** - Monitors Twitter, Reddit, and other social platforms
- **Real-time Alerting** - Sends notifications for critical sentiment changes or viral content
- **Dashboard Integration** - Stores all data in Google Sheets for comprehensive analysis and reporting

## 📊 Google Sheets Column Specifications
The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the mention was recorded | "2024-01-15T10:30:00Z" |
| platform | String | Social media platform | "Twitter" |
| username | String | User who posted the content | "@john_doe" |
| content | String | Full text of the post/comment | "Love the new product features!" |
| sentiment_score | Number | Sentiment score (-1 to 1) | 0.85 |
| sentiment_label | String | Sentiment classification | "Positive" |
| emotion | String | Primary emotion detected | "Joy" |
| topics | Array | Key topics identified | ["product", "features"] |
| engagement | Number | Likes, shares, comments | 1250 |
| reach_estimate | Number | Estimated reach | 50000 |
| influence_score | Number | User influence metric | 0.75 |
| alert_priority | String | Alert priority level | "High" |

## 🛠️ Setup Instructions
Estimated setup time: 20-25 minutes

### Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Social media API access (Twitter, Reddit, etc.)

### Step-by-Step Configuration
1. **Install Community Nodes**
   Install the required community nodes:
   npm install n8n-nodes-scrapegraphai
   npm install n8n-nodes-slack
2. **Configure ScrapeGraphAI Credentials**
   - Navigate to Credentials in your n8n instance
   - Add new ScrapeGraphAI API credentials
   - Enter your API key from the ScrapeGraphAI dashboard
   - Test the connection to ensure it's working
3. **Set up Google Sheets Connection**
   - Add Google Sheets OAuth2 credentials
   - Grant the necessary permissions for spreadsheet access
   - Create a new spreadsheet for sentiment analysis data
   - Configure the sheet name (default: "Sentiment Analysis")
4. **Configure Social Media Monitoring**
   - Update the websiteUrl parameters in the ScrapeGraphAI nodes
   - Add URLs for the social media platforms you want to monitor
   - Customize the user prompt to extract specific sentiment data
   - Set up keywords, hashtags, and brand mentions to track
5. **Set up Notification Channels**
   - Configure Slack webhook or API credentials
   - Set up email service credentials for alerts
   - Define sentiment thresholds for different alert levels
   - Test notification delivery
6. **Configure Schedule Trigger**
   - Set the monitoring frequency (every 15 minutes, hourly, etc.)
   - Choose appropriate time zones for your business hours
   - Consider social media platform rate limits
7. **Test and Validate**
   - Run the workflow manually to verify all connections
   - Check Google Sheets for proper data formatting
   - Test sentiment analysis with sample content

## 🔄 Workflow Customization Options
- **Modify Monitoring Targets**: Add or remove social media platforms; change keywords, hashtags, or brand mentions; adjust monitoring frequency based on platform activity.
- **Extend Sentiment Analysis**: Add more sophisticated emotion detection, implement topic clustering and trend analysis, and include influencer identification and scoring.
- **Customize Alert System**: Set different thresholds for different sentiment levels, create tiered alert systems (info, warning, critical), and add sentiment trend analysis and predictions.
- **Output Customization**: Add data visualization and reporting features, implement sentiment trend charts and graphs, create executive dashboards with key metrics, and add competitor sentiment comparison.

## 📈 Use Cases
- **Brand Reputation Management**: Monitor and respond to brand mentions
- **Crisis Management**: Detect and respond to negative sentiment quickly
- **Customer Feedback Analysis**: Understand customer satisfaction and pain points
- **Product Launch Monitoring**: Track sentiment around new product releases
- **Competitor Analysis**: Monitor competitor sentiment and engagement
- **Influencer Identification**: Find and engage with influential users

## 🚨 Important Notes
- Respect social media platforms' terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your monitoring keywords and parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider privacy implications and data protection regulations

## 🔧 Troubleshooting
Common issues:
- ScrapeGraphAI connection errors: Verify API key and account status
- Google Sheets permission errors: Check OAuth2 scope and permissions
- Sentiment analysis errors: Review the Code node's JavaScript logic
- Rate limiting: Adjust monitoring frequency and implement delays
- Alert delivery failures: Check notification service credentials

Support resources:
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Social media platform API documentation
- Sentiment analysis best practices and guidelines
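To illustrate what the sentiment-threshold alerting could look like inside the Code node, here is a minimal sketch that derives an alert_priority value from the sentiment score and estimated reach before the row is written to Google Sheets. The thresholds are illustrative assumptions, not the template's built-in values.

```javascript
// Minimal sketch: derive alert_priority from sentiment_score and reach_estimate.
// Threshold values are illustrative, not the template's defaults.

function alertPriority(mention) {
  const { sentiment_score, reach_estimate } = mention;
  if (sentiment_score <= -0.6 && reach_estimate >= 10000) return 'Critical';
  if (sentiment_score <= -0.3) return 'High';
  if (sentiment_score < 0.3) return 'Medium';
  return 'Low';
}

console.log(alertPriority({ sentiment_score: -0.7, reach_estimate: 50000 })); // "Critical"
console.log(alertPriority({ sentiment_score: 0.85, reach_estimate: 50000 })); // "Low"
```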
by Oneclick AI Squad
This automated n8n workflow processes student applications on a scheduled basis, validates data, updates databases, and sends welcome communications to students and guardians.

## Main Components
- **Trigger at Every Day 7 am** - Scheduled trigger that runs the workflow daily
- **Read Student Data** - Reads pending applications from Excel/database
- **Validate Application Data** - Checks data completeness and format
- **Process Application Data** - Processes validated applications
- **Update Student Database** - Updates records in the student database
- **Prepare Welcome Email** - Creates personalized welcome messages
- **Send Email** - Sends welcome emails to students/guardians
- **Success Response** - Confirms successful processing
- **Error Response** - Handles any processing errors

## Essential Prerequisites
- Excel file with student applications (student_applications.xlsx)
- Database access for student records
- SMTP server credentials for sending emails
- File storage access for reading application data

## Required Excel File Structure (student_applications.xlsx)
Application ID | First Name | Last Name | Email | Phone | Program Interest | Grade Level | School | Guardian Name | Guardian Phone | Application Date | Status | Notes

## Expected Input Data Format
    {
      "firstName": "John",
      "lastName": "Doe",
      "email": "john.doe@example.com",
      "phone": "+1234567890",
      "program": "Computer Science",
      "gradeLevel": "10th Grade",
      "school": "City High School",
      "guardianName": "Jane Doe",
      "guardianPhone": "+1234567891"
    }

## Key Features
- ⏰ **Scheduled Processing:** Runs daily at 7 AM automatically
- 📊 **Data Validation:** Ensures application completeness
- 💾 **Database Updates:** Maintains student records
- 📧 **Auto Emails:** Sends welcome messages
- ❌ **Error Handling:** Manages processing failures

## Quick Setup
1. Import the workflow JSON into n8n
2. Configure the schedule trigger (default: 7 AM daily)
3. Set the Excel file path in the "Read Student Data" node
4. Configure the database connection in the "Update Student Database" node
5. Add SMTP settings in the "Send Email" node
6. Test with sample data
7. Activate the workflow

## Parameters to Configure
- excel_file_path: Path to the student applications file
- database_connection: Student database credentials
- smtp_host: Email server address
- smtp_user: Email username
- smtp_password: Email password
- admin_email: Administrator notification email
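As an illustration of the "Validate Application Data" step, here is a minimal sketch that checks required fields and basic email/phone formats before a record is processed. The field names follow the expected input format shown above; the specific required-field list and regexes are assumptions for illustration.

```javascript
// Minimal sketch of application validation against the expected input format.
// Required-field list and format checks are illustrative assumptions.

function validateApplication(app) {
  const required = ['firstName', 'lastName', 'email', 'phone', 'program', 'gradeLevel', 'guardianName'];
  const missing = required.filter((f) => !app[f] || String(app[f]).trim() === '');
  const errors = missing.map((f) => `Missing field: ${f}`);
  if (app.email && !/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(app.email)) errors.push('Invalid email format');
  if (app.phone && !/^\+?[0-9\s-]{7,15}$/.test(app.phone)) errors.push('Invalid phone format');
  return { valid: errors.length === 0, errors };
}

// Example usage with a record matching the expected input format
console.log(validateApplication({
  firstName: 'John', lastName: 'Doe', email: 'john.doe@example.com', phone: '+1234567890',
  program: 'Computer Science', gradeLevel: '10th Grade', guardianName: 'Jane Doe',
})); // { valid: true, errors: [] }
```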