by Davide
This workflow automates the entire process of collecting, analyzing, and reporting customer reviews from Feedaty (a review platform similar to Trustpilot) using ScrapeGraphAI, transforming raw user feedback into a structured, management-ready reputation report. The report is generated with the new Gemini 3 model, converted to PDF via ConvertAPI, and uploaded to Google Drive.

**Key Advantages**

✅ **End-to-End Automation**: From data collection to final PDF delivery, the entire reputation analysis process is fully automated, eliminating manual scraping, copy-paste work, and reporting overhead.

✅ **AI-Driven, Management-Ready Insights**: The workflow does not just summarize reviews; it interprets them strategically, producing insights that are immediately useful for management, marketing, customer support, operations, and product & UX teams.

✅ **Structured & Consistent Reporting**: Every execution produces reports with the same structure, metrics, and logic, making it ideal for periodic reputation monitoring, trend analysis over time, and internal performance reviews.

✅ **Scalable & Configurable**: Easily adaptable to any Feedaty company profile. Page limits and review volume can be adjusted without changing the logic, and the workflow can be scheduled or extended to multiple brands.

✅ **Data Quality & Compliance**: No personal data exposure, explicit handling of missing or ambiguous information, no assumptions or hallucinated insights, and fully transparent, audit-friendly output.

✅ **Seamless Stakeholder Distribution**: Automatic upload to Google Drive ensures reports are centralized, shareable, and accessible, with no additional manual steps.

**Ideal Use Cases**
- Brand & reputation monitoring
- Customer experience audits
- Quarterly or monthly executive reports
- Pre-sales or investor documentation
- Customer support performance evaluation

**How it works**

This workflow automates the entire process of collecting, analyzing, and reporting customer feedback from Feedaty. It starts by scraping live reviews from a specified company's Feedaty page using ScrapeGraphAI, extracting review details like date, rating, and text.
Each review is then individually analyzed for sentiment (Positive, Neutral, or Negative) using an AI model. All processed reviews are aggregated and passed to a specialized AI agent that performs a comprehensive company-level reputation analysis, generating a structured management report. Finally, the report is converted from HTML to PDF and uploaded to a designated Google Drive folder, creating a fully automated pipeline from data collection to actionable insights delivery.

**Set up steps**
1. Configure Parameters: Set the Feedaty company identifier (e.g., maxisport) and the maximum number of review pages to scrape in the "Set Parameters" node.
2. API Credentials: Ensure the following credentials are configured in n8n:
   - ScrapeGraphAI API (web scraping)
   - Google Gemini API (AI sentiment analysis and report generation)
   - Google Drive OAuth2 (file upload)
   - ConvertAPI (HTML to PDF conversion)
3. Customize Output: Optionally adjust the "Limit reviews" node to control the number of reviews processed, and modify the AI agent's system prompt in "Company Reputation Management" to tailor the report format.
4. Destination Folder: Verify that the Google Drive folder ID in the "Upload file" node points to the correct destination for the generated reports.
5. Execution: Trigger the workflow manually via the "When clicking ‘Test workflow’" node to run the complete scraping, analysis, and reporting pipeline.

👉 Subscribe to my new YouTube channel. Here I'll share videos and Shorts with practical tutorials and FREE templates for n8n. Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
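The aggregation step between per-review sentiment analysis and the company-level report agent can be sketched as an n8n Code node. This is a minimal illustration, not the template's exact code; the field names (`rating`, `sentiment`) are assumptions.

```javascript
// Aggregate per-review sentiment labels into the summary statistics
// the reputation agent receives. Field names are illustrative.
const reviews = [
  { rating: 5, sentiment: "Positive" },
  { rating: 3, sentiment: "Neutral" },
  { rating: 1, sentiment: "Negative" },
  { rating: 4, sentiment: "Positive" },
];

const counts = { Positive: 0, Neutral: 0, Negative: 0 };
let ratingSum = 0;
for (const r of reviews) {
  counts[r.sentiment] += 1;
  ratingSum += r.rating;
}

const summary = {
  totalReviews: reviews.length,
  averageRating: +(ratingSum / reviews.length).toFixed(2),
  // Share of each sentiment label as a fraction of all reviews
  sentimentShare: Object.fromEntries(
    Object.entries(counts).map(([k, v]) => [k, +(v / reviews.length).toFixed(2)])
  ),
};
```

A compact summary object like this keeps the report agent's prompt small while still grounding every statistic in the scraped reviews.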
by Atta
Never guess your SEO strategy again. This advanced workflow automates the most time-consuming part of SEO: auditing competitor articles and identifying exactly where your brand can outshine them. It extracts deep content from top-ranking URLs, compares it against your specific brand identity, and generates a ready-to-use "Action Plan" for your content team. The workflow uses Decodo for high-fidelity scraping, Gemini 2.5 Flash for strategic gap analysis, and Google Sheets as a dynamic "Brand Brain" and reporting dashboard.

✨ Key Features
- **Brand-Centric Auditing**: Unlike generic SEO tools, this engine uses a live Google Sheet containing your **Brand Identity** to find "Content Gaps" specific to your unique value proposition.
- **Automated SERP Itemization**: Converts a simple list of keywords into a filtered list of top-performing competitor URLs.
- **Deep Markdown Extraction**: Uses Decodo Universal to bypass bot-blockers and extract clean Markdown content, preserving headers and structure for high-fidelity AI analysis.
- **Structured Action Plans**: Outputs machine-readable JSON containing the competitor's H1, their "Winning Factor," and a 1-sentence "Checkmate" instruction for your writers.

⚙️ How it Works
- **Data Foundation**: The workflow triggers (manual or scheduled) and pulls your Global Config (e.g., result limits) and Brand Identity from a dedicated Google Sheet.
- **Market Discovery**: It retrieves your target keywords and uses the Decodo Google Search node to identify the top competitors. A Code node then "itemizes" these results into individual URLs.
- **Intelligence Harvesting**: Decodo Universal scrapes each URL, and an HTML 5 node extracts the body content into Markdown format to minimize token noise for the AI.
- **Strategic Audit**: The AI Content Auditor (powered by Gemini) receives the competitor's text and your Brand Identity. It identifies what the competitor missed that your brand excels at.
- **Reporting Deck**: The final Strategy Master Writer node appends the analysis, including the "Content Gap" and "Action Plan," into a master Google Sheet for your marketing team.

📥 Component Installation
This workflow relies on the Decodo node for search and scraping precision.
1. Install Node: Click the + button in n8n, search for "Decodo," and add it to your canvas.
2. Credentials: Use your Decodo API key. (Tip: use a residential proxy setting for difficult sites like Reddit or Stripe.)
3. Gemini: Ensure you have the Google Gemini Chat Model node connected to the AI Agent.

🎁 Get a free Web Scraping API subscription here 👉🏻 https://visit.decodo.com/X4YBmy

🛠️ Setup Instructions

1. Google Sheets Configuration
Create a spreadsheet with the following three tabs:
- **Target Keywords**: One column named Target Keyword.
- **Brand Identity**: One cell containing your brand mission, USPs, and target audience.
- **Competitor Audit Feed**: Headers for Keyword, URL, Rank, Winning Factor, Content Gap, and Action Plan.
Clone the spreadsheet here.

2. Global Configuration
In the Config (Set) node, define your serp_results_amount (e.g., 10). This controls how many competitors are analyzed per keyword.

➕ How to Adapt the Template
- **Competitor Exclusion**: Add a **Filter** node after "Market Discovery" to automatically skip domains like amazon.com or reddit.com if they aren't relevant to your niche.
- **Slack Alerts**: Connect a **Slack** node after the AI analysis to notify your content manager immediately when a high-impact "Action Plan" is generated for a priority keyword.
- **Multi-Model Verification**: Swap Gemini with **Claude 3.5 Sonnet** or **GPT-4o** in the Strategic Audit section to compare different AI perspectives on the same competitor content.
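The "itemize" step in Market Discovery can be sketched as a small Code node. This is an assumption-laden illustration: the actual SERP response shape returned by the Decodo node may differ.

```javascript
// Split one SERP response into one n8n item per competitor URL,
// capped by serp_results_amount from the Config node.
// The response shape below is an assumption for illustration.
const serpResponse = {
  keyword: "standing desk ergonomics",
  results: [
    { url: "https://example.com/a", position: 1 },
    { url: "https://example.org/b", position: 2 },
    { url: "https://example.net/c", position: 3 },
  ],
};
const limit = 2; // serp_results_amount

const items = serpResponse.results.slice(0, limit).map((r) => ({
  keyword: serpResponse.keyword,
  url: r.url,
  rank: r.position,
}));
```

Each resulting item carries its keyword and rank forward, so the audit rows written to the Competitor Audit Feed stay traceable to the search that produced them.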
by Karol
**How it works**
This workflow turns any URL sent to a Telegram bot into ready-to-publish social posts:
1. Trigger: Telegram message (checks if it contains a URL).
2. Fetch & parse: Downloads the page and extracts readable text + title.
3. AI writing: Generates platform-specific copy (Facebook, Instagram, LinkedIn).
4. Image: Creates an AI image and stores it in Supabase Storage.
5. Publish: Posts to Facebook Pages, Instagram Business, and LinkedIn.
6. Logging: Updates Google Sheets with post URLs and sends a Telegram confirmation (image + links).

**Setup**
- Telegram: create a bot and connect it via n8n Telegram credentials.
- OpenAI / Gemini: add your API key in n8n Credentials and select it in the AI nodes.
- Facebook/Instagram (Graph API): create a credential called facebookGraph with:
  • accessToken (page-scoped or system user)
  • pageId (for Facebook Page photos)
  • igUserId (Instagram Business account ID)
  • optional fbApiVersion (default v19.0)
- LinkedIn: connect with OAuth2 in the LinkedIn node (leave as credential).
- Supabase: credential supabase with url and apiKey. Ensure a bucket exists (the default used in the Set node is social-media).
- Google Sheets: replace YOUR_GOOGLE_SHEET_ID and Sheet1, and grant your n8n Google OAuth2 access.

**Notes**
• No API keys are stored in the template; everything runs via n8n Credentials.
• You can change the bucket name, image size/quality, and AI prompts in the respective nodes.
• The confirmation message on Telegram includes direct permalinks to the published posts.

**Required credentials**
• Telegram Bot
• OpenAI (or Gemini)
• Facebook/Instagram Graph
• LinkedIn OAuth2
• Supabase (url + apiKey)
• Google Sheets OAuth2

**Inputs**
• A Telegram message that contains a URL.

**Outputs**
• Social posts published on Facebook, Instagram, and LinkedIn.
• Row appended/updated in Google Sheets with post URLs and image link.
• Telegram confirmation with the generated image + post links.
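The trigger's URL check (step 1 above) can be sketched in a few lines; the exact pattern the template uses may differ, so treat this as a minimal illustration.

```javascript
// Extract the first http(s) URL from an incoming Telegram message,
// or return null when the message contains no URL (so the workflow
// can stop early instead of fetching nothing).
function extractUrl(text) {
  const match = text.match(/https?:\/\/[^\s]+/);
  return match ? match[0] : null;
}

const url = extractUrl("Check this out: https://example.com/article 🚀");
const none = extractUrl("just a plain message");
```

Returning null lets a downstream If node route URL-less messages to a polite "please send a link" reply instead of the fetch-and-parse branch.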
by Oneclick AI Squad
**AI Customer Call Analyzer — Voice → Insights → CRM with GPT-4**

Converts raw sales call recordings into structured CRM intelligence. Uploads audio → transcribes via Whisper → GPT-4 extracts intent, sentiment, objections, and next steps → updates the CRM and sends a structured summary to the sales team.

**How it works**
1. Upload Call Recording: Webhook receives the audio file upload (mp3, wav, m4a) from the sales rep portal.
2. Validate & Prepare Audio: Checks file type and size limits, extracts call metadata.
3. Transcribe via Whisper: Sends audio to the OpenAI Whisper API for high-accuracy transcription.
4. Wait (Transcription Buffer): Holds until transcription is confirmed complete.
5. GPT-4 Call Intelligence: Extracts intent, sentiment, objections, buying signals, and action items.
6. MCP Context Enrichment: Pulls CRM history and enriches the analysis with account context.
7. Update CRM Record: Writes structured insights back to the CRM (HubSpot / Salesforce).
8. Send Sales Summary: Emails the rep and manager with a call scorecard and next steps.
9. Audit Log: Records all processing steps for compliance and coaching.

**Setup Steps**
1. Import this workflow into n8n.
2. Configure credentials:
   - OpenAI API (Whisper transcription and GPT-4 analysis)
   - HubSpot / Salesforce (CRM update target)
   - Google Sheets (audit log and call registry)
   - SMTP / Gmail (sales summary delivery)
3. Set your CRM API endpoint and field mapping in the update node.
4. Configure your sales team email list in the notify node.
5. Activate the workflow.

**Sample Upload Payload**
{
  "callId": "CALL-20250222-0042",
  "repEmail": "jane.smith@company.com",
  "repName": "Jane Smith",
  "contactEmail": "buyer@prospect.com",
  "contactName": "Bob Johnson",
  "companyName": "Acme Corp",
  "dealStage": "negotiation",
  "callDurationSecs": 1847,
  "audioUrl": "https://storage.company.com/calls/call-0042.mp3"
}

**Features**
- **Whisper-powered transcription** with speaker diarization hints
- **GPT-4 intent and sentiment** extraction with confidence scores
- **Objection and buying signal** detection
- **Auto CRM field mapping**: no manual data entry
- **Sales scorecard** with talk ratio, next step clarity, and deal risk
- **Full audit trail** for call coaching and compliance

**Explore More**
LinkedIn & Social Automation: Contact us to design AI-powered lead nurturing, content engagement, and multi-platform reply workflows tailored to your growth strategy.
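The "Validate & Prepare Audio" step can be sketched as below. The 25 MB cap reflects OpenAI's documented audio upload limit for the transcription endpoint; the payload field names (`audioUrl`, `sizeBytes`) are assumptions for illustration.

```javascript
// Reject uploads that Whisper cannot process before spending an API call:
// wrong extension or a file over the size cap.
const ALLOWED = new Set(["mp3", "wav", "m4a"]);
const MAX_BYTES = 25 * 1024 * 1024; // OpenAI's audio upload limit

function validateUpload(payload) {
  const ext = payload.audioUrl.split(".").pop().toLowerCase();
  if (!ALLOWED.has(ext)) return { ok: false, reason: "unsupported format" };
  if (payload.sizeBytes > MAX_BYTES) return { ok: false, reason: "file too large" };
  return { ok: true };
}

const good = validateUpload({
  audioUrl: "https://storage.company.com/calls/call-0042.mp3",
  sizeBytes: 4_200_000,
});
const bad = validateUpload({
  audioUrl: "https://storage.company.com/calls/call-0042.flac",
  sizeBytes: 1_000,
});
```

Failing fast here keeps bad uploads out of the audit log's "processed" path and gives the rep portal an immediate, specific error.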
by Mariela Slavenova
This template crawls a website from its sitemap, deduplicates URLs in Supabase, scrapes pages with Crawl4AI, cleans and validates the text, then stores content + metadata in a Supabase vector store using OpenAI embeddings. It's a reliable, repeatable pipeline for building searchable knowledge bases, SEO research corpora, and RAG datasets.

**Good to know**
• Built-in de-duplication via a scrape_queue table (status: pending/completed/error).
• Resilient flow: waits, retries, and marks failed tasks.
• Costs depend on Crawl4AI usage and OpenAI embeddings.
• Replace any placeholders (API keys, tokens, URLs) before running.
• Respect website robots/ToS and applicable data laws when scraping.

**How it works**
1. Sitemap fetch & parse: Load sitemap.xml and extract all URLs.
2. De-dupe: Normalize URLs, check the Supabase scrape_queue, and insert only new ones.
3. Scrape: Send URLs to Crawl4AI; poll task status until completed.
4. Clean & score: Remove boilerplate/markup, detect content type, compute quality metrics, and extract metadata (title, domain, language, length).
5. Chunk & embed: Split text and create OpenAI embeddings.
6. Store: Upsert into the Supabase vector store (documents) with metadata; update job status.

**Requirements**
• Supabase (Postgres + vector extension enabled)
• Crawl4AI API key (or header auth)
• OpenAI API key (for embeddings)
• n8n credentials set for HTTP and Postgres/Supabase

**How to use**
1. Configure credentials (Supabase/Postgres, Crawl4AI, OpenAI).
2. (Optional) Run the provided SQL to create scrape_queue and documents.
3. Set your sitemap URL in the HTTP Request node.
4. Execute the workflow (manual trigger) and monitor Supabase statuses.
5. Query your documents table or vector store from your app/RAG stack.
**Potential Use Cases**
This automation is ideal for:
- Market research teams collecting competitive data
- Content creators monitoring web trends
- SEO specialists tracking website content updates
- Analysts gathering structured data for insights
- Anyone needing reliable, structured web content for analysis

Need help customizing? Contact me for consulting and support: LinkedIn
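The de-dupe step's URL normalization (step 2 of "How it works") can be sketched as below. The exact rules this template applies are an assumption; the idea is that two spellings of the same page must hash to one scrape_queue row.

```javascript
// Normalize a URL before checking scrape_queue: lowercase the host,
// drop fragments, trailing slashes, and common tracking parameters.
function normalizeUrl(raw) {
  const u = new URL(raw);
  u.hash = ""; // fragments never change the fetched content
  for (const p of ["utm_source", "utm_medium", "utm_campaign"]) {
    u.searchParams.delete(p);
  }
  const base = u.origin.toLowerCase() + u.pathname.replace(/\/+$/, "");
  const qs = u.searchParams.toString();
  return qs ? `${base}?${qs}` : base;
}

// Two variants of the same page collapse to one queue entry.
const a = normalizeUrl("https://Example.com/Blog/?utm_source=x#top");
const b = normalizeUrl("https://example.com/Blog");
```

Note the path's case is preserved: URL paths are case-sensitive on most servers, so only the host is safe to lowercase.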
by Growth AI
**SEO Content Generation Workflow - n8n Template Instructions**

**Who's it for**
This workflow is designed for SEO professionals, content marketers, digital agencies, and businesses who need to generate optimized meta tags, H1 headings, and content briefs at scale. It is perfect for teams managing multiple clients or large keyword lists who want to automate competitor analysis and SEO content creation while maintaining quality and personalization.

**How it works**
The workflow automates the entire SEO content creation process by analyzing your target keywords against top competitors, then generating optimized meta elements and comprehensive content briefs. It uses AI-powered analysis combined with real competitor data to create SEO-friendly content that's tailored to your specific business context. The system processes keywords in batches, performs Google searches, scrapes competitor content, analyzes heading structures, and generates personalized SEO content using your company's database information for maximum relevance.
**Requirements**

Required services and credentials:
- **Google Sheets API**: For reading configuration and updating results
- **Anthropic API**: For AI content generation (Claude Sonnet 4)
- **OpenAI API**: For embeddings and vector search
- **Apify API**: For Google search results
- **Firecrawl API**: For competitor website scraping
- **Supabase**: For the vector database (optional but recommended)

Template spreadsheet: copy this template spreadsheet and configure it with your information: Template Link

**How to set up**

Step 1: Copy and Configure Template
1. Make a copy of the template spreadsheet.
2. Fill in the Client Information sheet:
   - Client name: your company or client's name
   - Client information: brief business description
   - URL: website address
   - Supabase database: database name (prevents AI hallucination)
   - Tone of voice: content style preferences
   - Restrictive instructions: topics or approaches to avoid
3. Complete the SEO sheet with your target pages:
   - Page: the page you're optimizing (e.g., "Homepage", "Product Page")
   - Keyword: main search term to target
   - Awareness level: user familiarity with your business
   - Page type: category (homepage, blog, product page, etc.)
Step 2: Import Workflow
1. Import the n8n workflow JSON file.
2. Configure all required API credentials in n8n:
   - Google Sheets OAuth2
   - Anthropic API key
   - OpenAI API key
   - Apify API key
   - Firecrawl API key
   - Supabase credentials (if using the vector database)

Step 3: Test Configuration
1. Activate the workflow.
2. Send your Google Sheets URL to the chat trigger.
3. Verify that all sheets are readable and credentials work.
4. Test with a single keyword row first.

**Workflow Process Overview**

Phase 0: Setup and Configuration
- Copy the template spreadsheet
- Configure client information and SEO parameters
- Set up API credentials in n8n

Phase 1: Data Input and Processing
- Chat trigger receives the Google Sheets URL
- System reads client configuration and SEO data
- Filters valid keywords and empty H1 fields
- Initiates batch processing

Phase 2: Competitor Research and Analysis
- Searches Google for the top 10 results per keyword
- Scrapes the first 5 competitor websites
- Extracts heading structures (H1-H6)
- Analyzes competitor meta tags and content organization

Phase 3: Meta Tags and H1 Generation
- AI analyzes keyword context and competitor data
- Accesses the client database for personalization
- Generates an optimized meta title (65 chars max)
- Creates a compelling meta description (165 chars max)
- Produces a user-focused H1 (70 chars max)

Phase 4: Content Brief Creation
- Analyzes search intent percentages
- Develops a content strategy based on competitor analysis
- Creates a detailed MECE page structure
- Suggests rich media elements
- Provides writing recommendations and detail level scoring

Phase 5: Data Integration and Updates
- Combines all generated content into a unified structure
- Updates Google Sheets with the new SEO elements
- Preserves existing data while adding new content
- Continues batch processing for remaining keywords

**How to customize the workflow**

Adjusting AI Models
- Replace Anthropic Claude with other LLM providers
- Modify system prompts for different content styles
- Adjust character limits for meta elements

Modifying Competitor Analysis
- Change the number of competitors analyzed (currently 5)
- Adjust scraping parameters in the Firecrawl nodes
- Modify heading extraction logic in the JavaScript nodes

Customizing Output Format
- Update the Google Sheets column mapping in the Code node
- Modify the structured output parser schema
- Change the batch processing size in the Split in Batches node

Adding Quality Controls
- Insert validation nodes between phases
- Add error handling and retry logic
- Implement content quality scoring

Extending Functionality
- Add keyword research capabilities
- Include image optimization suggestions
- Integrate social media content generation
- Connect to CMS platforms for direct publishing

**Best Practices**
- Test with small batches before processing large keyword lists
- Monitor API usage and costs across all services
- Regularly update system prompts based on output quality
- Maintain clean data in your Google Sheets template
- Use descriptive node names for easier workflow maintenance

**Troubleshooting**
- **API Errors**: Check credential configuration and usage limits
- **Scraping Failures**: Firecrawl nodes have error handling enabled
- **Empty Results**: Verify keyword formatting and competitor availability
- **Sheet Updates**: Ensure proper column mapping in the final Code node
- **Processing Stops**: Check batch processing limits and timeout settings
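The heading-extraction logic mentioned above (pulling H1-H6 structures out of scraped content) can be sketched as a JavaScript Code node working on Markdown. The real node's logic may differ; this is an illustration.

```javascript
// Pull H1-H6 lines out of scraped Markdown, recording each heading's
// level and text so the AI can compare competitor page structures.
function extractHeadings(markdown) {
  const headings = [];
  for (const line of markdown.split("\n")) {
    const m = line.match(/^(#{1,6})\s+(.*)/); // leading #'s set the level
    if (m) headings.push({ level: m[1].length, text: m[2].trim() });
  }
  return headings;
}

const sample = "# Title\nIntro text\n## Section A\n### Detail\nnot a # heading";
const headings = extractHeadings(sample);
```

Feeding the AI a compact list of `{level, text}` pairs instead of full page HTML is what keeps the competitor comparison cheap and focused on structure.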
by Growth AI
**SEO Content Generation Workflow (Basic Version) - n8n Template Instructions**

**Who's it for**
This workflow is designed for SEO professionals, content marketers, digital agencies, and businesses who need to generate optimized meta tags, H1 headings, and content briefs at scale. It is perfect for teams managing multiple clients or large keyword lists who want to automate competitor analysis and SEO content creation without the complexity of vector databases.

**How it works**
The workflow automates the entire SEO content creation process by analyzing your target keywords against top competitors, then generating optimized meta elements and comprehensive content briefs. It uses AI-powered analysis combined with real competitor data to create SEO-friendly content that's tailored to your specific business context. The system processes keywords in batches, performs Google searches, scrapes competitor content, analyzes heading structures, and generates personalized SEO content using your company information for maximum relevance.

**Requirements**

Required services and credentials:
- **Google Sheets API**: For reading configuration and updating results
- **Anthropic API**: For AI content generation (Claude Sonnet 4)
- **Apify API**: For Google search results
- **Firecrawl API**: For competitor website scraping

Template spreadsheet: copy this template spreadsheet and configure it with your information: Template Link

**How to set up**

Step 1: Copy and Configure Template
1. Make a copy of the template spreadsheet.
2. Fill in the Client Information sheet:
   - Client name: your company or client's name
   - Client information: brief business description
   - URL: website address
   - Tone of voice: content style preferences
   - Restrictive instructions: topics or approaches to avoid
3. Complete the SEO sheet with your target pages:
   - Page: the page you're optimizing (e.g., "Homepage", "Product Page")
   - Keyword: main search term to target
   - Awareness level: user familiarity with your business
   - Page type: category (homepage, blog, product page, etc.)
Step 2: Import Workflow
1. Import the n8n workflow JSON file.
2. Configure all required API credentials in n8n:
   - Google Sheets OAuth2
   - Anthropic API key
   - Apify API key
   - Firecrawl API key

Step 3: Test Configuration
1. Activate the workflow.
2. Send your Google Sheets URL to the chat trigger.
3. Verify that all sheets are readable and credentials work.
4. Test with a single keyword row first.

**Workflow Process Overview**

Phase 0: Setup and Configuration
- Copy the template spreadsheet
- Configure client information and SEO parameters
- Set up API credentials in n8n

Phase 1: Data Input and Processing
- Chat trigger receives the Google Sheets URL
- System reads client configuration and SEO data
- Filters valid keywords and empty H1 fields
- Initiates batch processing

Phase 2: Competitor Research and Analysis
- Searches Google for the top 10 results per keyword using Apify
- Scrapes the first 5 competitor websites using Firecrawl
- Extracts heading structures (H1-H6) from competitor pages
- Analyzes competitor meta tags and content organization
- Processes markdown content to identify heading hierarchies

Phase 3: Meta Tags and H1 Generation
- AI analyzes keyword context and competitor data using Claude
- Incorporates client information for personalization
- Generates an optimized meta title (65 characters maximum)
- Creates a compelling meta description (165 characters maximum)
- Produces a user-focused H1 (70 characters maximum)
- Uses structured output parsing for consistent formatting

Phase 4: Content Brief Creation
- Analyzes search intent percentages (informational, transactional, navigational)
- Develops a content strategy based on competitor analysis
- Creates a detailed MECE page structure with H2 and H3 sections
- Suggests rich media elements (images, videos, infographics, tables)
- Provides writing recommendations and detail level scoring (1-10 scale)
- Ensures SEO optimization while maintaining user relevance

Phase 5: Data Integration and Updates
- Combines all generated content into a unified structure
- Updates Google Sheets with the new SEO elements
- Preserves existing data while adding new content
- Continues batch processing for remaining keywords

**Key Differences from Advanced Version**
This basic version focuses on core SEO functionality without additional complexity:
- **No Vector Database**: Removes the Supabase integration for simpler setup
- **Streamlined Architecture**: Fewer dependencies and configuration steps
- **Essential Features Only**: Core competitor analysis and content generation
- **Faster Setup**: Reduced time to deployment
- **Lower Costs**: Fewer API services required

**How to customize the workflow**

Adjusting AI Models
- Replace Anthropic Claude with other LLM providers in the agent nodes
- Modify system prompts for different content styles or languages
- Adjust character limits for meta elements in the structured output parser

Modifying Competitor Analysis
- Change the number of competitors analyzed (currently 5) by adding/removing Scrape nodes
- Adjust scraping parameters in the Firecrawl nodes for different content types
- Modify heading extraction logic in the JavaScript Code nodes

Customizing Output Format
- Update the Google Sheets column mapping in the final Code node
- Modify the structured output parser schema for different data structures
- Change the batch processing size in the Split in Batches node

Adding Quality Controls
- Insert validation nodes between workflow phases
- Add error handling and retry logic to critical nodes
- Implement content quality scoring mechanisms

Extending Functionality
- Add keyword research capabilities with additional APIs
- Include image optimization suggestions
- Integrate social media content generation
- Connect to CMS platforms for direct publishing

**Best Practices**

Setup and Testing
- Always test with small batches before processing large keyword lists
- Monitor API usage and costs across all services
- Regularly update system prompts based on output quality
- Maintain clean data in your Google Sheets template

Content Quality
- Review generated content before publishing
- Customize system prompts to match your brand voice
- Use descriptive node names for easier workflow maintenance
- Keep competitor analysis current by running the workflow regularly

Performance Optimization
- Process keywords in small batches to avoid timeouts
- Set appropriate retry policies for external API calls
- Monitor workflow execution times and optimize bottlenecks

**Troubleshooting**

API Errors
- Check credential configuration in n8n settings
- Verify API usage limits and billing status
- Ensure proper authentication for each service

Scraping Failures
- Firecrawl nodes have error handling enabled to continue on failures
- Some websites may block scraping; this is normal behavior
- Check that competitor URLs are accessible and valid

Empty Results
- Verify keyword formatting in Google Sheets
- Ensure competitor websites contain the expected content structure
- Check that meta tags are properly formatted in the system prompts

Sheet Update Errors
- Ensure proper column mapping in the final Code node
- Verify Google Sheets permissions and sharing settings
- Check that target sheet names match exactly

Processing Stops
- Review batch processing limits and timeout settings
- Check for errors in individual nodes using execution logs
- Verify all required fields are populated in the input data

**Template Structure**

Required sheets:
- Client Information: business details and configuration
- SEO: target keywords and page information
- Results Sheet: where generated content will be written

Expected columns:
- **Keywords**: Target search terms
- **Description**: Brief page description
- **Type de page**: Page category
- **Awareness level**: User familiarity level
- **title, meta-desc, h1, brief**: Generated output columns

This streamlined version provides all the essential SEO content generation capabilities while being easier to set up and maintain than the advanced version with vector database integration.
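A simple validation step for the generated meta elements can be added between phases, using the character limits stated above (65 / 165 / 70). This is a sketch; the output field names are assumptions, not the template's exact schema.

```javascript
// Flag generated fields that exceed the stated SEO character limits,
// so oversized output can be routed back for regeneration.
const LIMITS = { metaTitle: 65, metaDescription: 165, h1: 70 };

function checkLimits(output) {
  const violations = [];
  for (const [field, max] of Object.entries(LIMITS)) {
    if ((output[field] || "").length > max) violations.push(field);
  }
  return violations;
}

const violations = checkLimits({
  metaTitle: "Ergonomic Standing Desks for Home Offices",
  metaDescription: "x".repeat(200), // deliberately over the 165-char cap
  h1: "Find the Right Standing Desk",
});
```

Wiring this into an If node lets the workflow retry generation for just the offending field instead of silently writing truncation-prone content to the sheet.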
by Dr. Firas
**Generate AI Viral Videos with VEO3 and Auto-Publish to TikTok**

**Who is this for?**
This workflow is for content creators, marketers, and social media managers who want to consistently produce viral-style short videos and publish them automatically to TikTok, without manual editing or uploading.

**What problem is this workflow solving? / Use case**
Creating short-form video content that stands out takes time: ideation, scriptwriting, video generation, and publishing. This workflow automates the entire pipeline, from idea generation to TikTok upload, enabling you to scale your content strategy and focus on creativity rather than repetitive tasks.

**What this workflow does**
- **Generates viral video ideas** daily using GPT-5
- **Creates structured prompts** for before/after transformation videos
- **Renders cinematic vertical videos** with VEO3 (9:16 format)
- **Saves ideas and metadata** into Google Sheets for tracking
- **Uploads videos automatically to TikTok** via Blotato integration
- **Updates status in Google Sheets** once the video is live

The result: a fully automated daily viral video publishing system.

**Setup**
1. Google Sheets: Connect your Google Sheets account and create a sheet with columns for idea, caption, environment, sound, production, and final_output.
2. OpenAI: Add your OpenAI API credentials (for GPT-5 mini / GPT-4.1 mini).
3. VEO3 (Kie API): Set up your API key in the HTTP Request node (Generate Video with VEO3).
4. Blotato: Connect your Blotato account for TikTok publishing.
5. Schedule Trigger: Adjust the Start Daily Content Generation node to fit your preferred posting frequency.

**How to customize this workflow to your needs**
- **Platforms**: Extend publishing to YouTube Shorts or Instagram Reels by duplicating the TikTok step.
- **Frequency**: Change the Schedule Trigger to post multiple times per day or only a few times per week.
- **Creative Style**: Modify the system prompts to align with your brand's style (cinematic, minimalist, neon, etc.).
- **Tracking**: Enhance the Google Sheets logging with engagement metrics by pulling TikTok analytics via Blotato.

This workflow helps you build a hands-free, AI-powered content engine, turning raw ideas into published viral videos every day.

🎥 Watch This Tutorial: Step by Step
📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
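The structured prompt step can be sketched as assembling one VEO3 prompt string from the Google Sheets columns listed in the setup (idea, environment, sound, production). This is illustrative only; the template's actual prompt format is its own.

```javascript
// Build a VEO3 prompt from a Google Sheets row. Column names match the
// sheet described above; the prompt layout itself is a hypothetical sketch.
function buildVideoPrompt(row) {
  return [
    `Concept: ${row.idea}`,
    `Environment: ${row.environment}`,
    `Sound: ${row.sound}`,
    `Production style: ${row.production}`,
    "Format: vertical 9:16, cinematic",
  ].join("\n");
}

const prompt = buildVideoPrompt({
  idea: "Rusty bicycle restored to showroom condition",
  environment: "sunlit garage workshop",
  sound: "soft lo-fi with wrench clinks",
  production: "macro close-ups, time-lapse transitions",
});
```

Keeping each creative dimension in its own sheet column, then concatenating at render time, is what lets you tweak the brand style (cinematic, minimalist, neon) in one place without touching the workflow logic.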
by Avkash Kakdiya
**How it works**
This workflow enriches and personalizes your lead profiles by integrating HubSpot contact data, scraping social media information, and using AI to generate tailored outreach emails. It streamlines the process from contact capture to sending a personalized email, all automatically. The system fetches new or updated HubSpot contacts, verifies and enriches their Twitter/LinkedIn data via Phantombuster, merges the profile and engagement insights, and finally generates a customized email ready for outreach.

**Step-by-step**

1. Trigger & Input
- HubSpot Contact Webhook: fires when a contact is created or updated in HubSpot.
- Fetch Contact: pulls the full contact details (email, name, company, and social profiles).
- Update Google Sheet: logs Twitter/LinkedIn usernames and marks their tracking status.

2. Validation
- Validate Twitter/LinkedIn Exists: checks that the contact has a valid social profile before proceeding to scraping.

3. Social Media Scraping (via Phantombuster)
- Launch Profile Scraper & 🎯 Launch Tweet Scraper: triggers Phantombuster agents to fetch profile details and recent tweets.
- Wait nodes: ensure scraping completes (30–60 seconds).
- Fetch Profile/Tweet Results: retrieves output files from Phantombuster.
- Extract URL: parses the job output to extract the downloadable .json or .csv data file link.

4. Data Download & Parsing
- Download Profile/Tweet Data: downloads the scraped JSON files.
- Parse JSON: converts the raw file into structured data for processing.

5. Data Structuring & Merging
- Format Profile Fields: maps stats like bio, followers, verified status, likes, etc.
- Format Tweet Fields: captures tweet data and associates it with the lead's email.
- Merge Data Streams: combines the tweet and profile datasets.
- Combine All Data: produces a single, clean object containing all relevant lead details.

6. AI Email Generation & Delivery
- Generate Personalized Email: feeds the merged data into OpenAI GPT (via LangChain) to craft a custom HTML email using your brand details.
- Parse Email Content: cleans the AI output into structured subject and body fields.
- Sends Email: automatically delivers the personalized email to the lead via Gmail.

**Benefits**
- Automated Lead Enrichment: combines CRM and real-time social media data with zero manual research.
- Personalized Outreach at Scale: AI crafts unique, relevant emails for each contact.
- Improved Engagement Rates: targeted messages based on actual social activity and profile details.
- Seamless Integration: works directly with HubSpot, Google Sheets, Gmail, and Phantombuster.
- Time & Effort Savings: replaces hours of manual lookup and email drafting with an end-to-end automated flow.
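The merge in step 5 can be sketched as a join on the lead's email. This is a hypothetical illustration of the "Merge Data Streams" / "Combine All Data" logic; the actual field names from Phantombuster's exports will differ.

```javascript
// Join profile and tweet records on email so each lead ends up as one
// enriched object for the email-generation prompt.
const profiles = [
  { email: "lead@acme.com", followers: 1200, bio: "CTO at Acme", verified: false },
];
const tweets = [
  { email: "lead@acme.com", recentTweets: ["Shipping v2 next week"] },
];

const byEmail = new Map(profiles.map((p) => [p.email, { ...p }]));
for (const t of tweets) {
  // Tweets without a matching profile still produce a (sparser) lead record.
  const base = byEmail.get(t.email) || { email: t.email };
  byEmail.set(t.email, { ...base, ...t });
}
const leads = [...byEmail.values()];
```

Joining on email rather than username avoids mismatches when a contact's Twitter handle and LinkedIn slug differ.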
by Hugo Le Poole
**Generate AI voice receptionist agents for local businesses using VAPI**

Automate the creation of personalized AI phone receptionists for local businesses by scraping Google Maps, analyzing websites, and deploying voice agents to VAPI.

**Who is this for?**
- **Agencies** offering AI voice solutions to local businesses
- **Consultants** helping SMBs modernize their phone systems
- **Developers** building lead generation tools for voice AI services
- **Entrepreneurs** launching AI receptionist services at scale

**What this workflow does**
This workflow automates the entire process of creating customized AI voice agents:
1. Collects business criteria through a form (city, keywords, quantity)
2. Scrapes Google Maps for matching local businesses using Apify
3. Fetches and analyzes each business website
4. Generates tailored voice agent prompts using Claude AI
5. Automatically provisions voice assistants via the VAPI API
6. Logs all created agents to Google Sheets for tracking

The AI adapts prompts based on business type (salon, restaurant, dentist, spa) with the appropriate tone, services, and booking workflows.
## Setup requirements
- **Apify account** with Google Maps Scraper actor access
- **Anthropic API key** for prompt generation
- **OpenRouter API key** for website analysis
- **VAPI account** with API access
- **Google Sheets** connected via OAuth

## How to set up
1. Import the workflow template
2. Add your Apify credentials to the scraping node
3. Configure the Anthropic and OpenRouter API keys
4. Replace YOUR_VAPI_API_KEY in the HTTP Request node header
5. Connect your Google Sheets account
6. Create a Google Sheet with the columns: Business Name, Category, Address, Phone, Agent ID, Agent URL
7. Update the Sheet URL in both Google Sheets nodes
8. Activate the workflow and submit the form

## Customization options
- **Business templates**: Edit the prompt in "Generate Agent Messages" to add new business categories
- **Voice settings**: Modify the ElevenLabs voice parameters (stability, similarity boost)
- **LLM model**: Switch between GPT-4, Claude, or other models via OpenRouter
- **Output format**: Customize the results page HTML in the final Form node
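Step 4 above swaps YOUR_VAPI_API_KEY into an HTTP Request node. As a rough sketch, the request that node builds might look like the following; the endpoint and body fields follow VAPI's public assistant-creation API, but treat every field name here as an assumption and verify against the current VAPI reference before use:

```javascript
// Hedged sketch of the HTTP Request node that provisions a VAPI assistant.
// Endpoint and field names are assumptions based on VAPI's public API docs.
function buildVapiRequest(apiKey, agent) {
  return {
    method: 'POST',
    url: 'https://api.vapi.ai/assistant',
    headers: {
      Authorization: `Bearer ${apiKey}`, // replaces YOUR_VAPI_API_KEY
      'Content-Type': 'application/json',
    },
    body: {
      name: `${agent.businessName} Receptionist`,
      firstMessage: agent.greeting,
      model: {
        provider: 'anthropic',
        model: 'claude-3-5-sonnet-20241022', // example model id; pick your own
        messages: [{ role: 'system', content: agent.systemPrompt }],
      },
      voice: { provider: '11labs', voiceId: agent.voiceId },
    },
  };
}
```

The assistant id and URL returned by this call are what the Google Sheets node logs in the Agent ID and Agent URL columns.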
by TOMOMITSU ASANO
# Intelligent Invoice Processing with AI Classification and XML Export

## Summary
Automated invoice processing pipeline that extracts data from PDF invoices, uses an AI Agent for intelligent expense categorization, generates XML for accounting systems, and routes high-value invoices for approval.

## Detailed Description
A comprehensive accounts payable automation workflow that monitors for new PDF invoices, extracts text content, uses AI to classify expenses and detect anomalies, converts the results to XML for accounting system integration, and implements approval workflows for high-value or unusual invoices.

## Key Features
- **PDF Text Extraction**: The Extract from File node parses invoice PDFs automatically
- **AI-Powered Classification**: An AI Agent categorizes expenses, suggests GL codes, and detects anomalies
- **XML Export**: Converts structured data to an accounting-compatible XML format
- **Approval Workflow**: Routes invoices over $5,000 or with low confidence for human review
- **Multi-Trigger Support**: Google Drive monitoring or manual webhook upload
- **Comprehensive Logging**: Archives all processed invoices to Google Sheets

## Use Cases
- Accounts payable automation
- Expense report processing
- Vendor invoice management
- Financial document digitization
- Audit trail generation

## Required Credentials
- Google Drive OAuth (for the PDF source folder)
- OpenAI API key
- Slack Bot Token
- Gmail OAuth
- Google Sheets OAuth

**Node count**: 24 (19 functional + 5 sticky notes)

## Unique Aspects
- Uses the Extract from File node for PDF text extraction (rarely used)
- Uses the XML node for JSON-to-XML conversion (very rare)
- Uses the AI Agent node for intelligent classification
- Uses the Google Drive Trigger for file monitoring
- Implements an approval workflow with conditional routing
- **Webhook response** mode for API integration

## Workflow Architecture
```
[Google Drive Trigger]   [Manual Webhook]
          |                     |
          +----------+----------+
                     |
                     v
            [Filter PDF Files]
                     |
                     v
          [Download Invoice PDF]
                     |
                     v
            [Extract PDF Text]
                     |
                     v
        [Parse Invoice Data] (Code)
                     |
                     v
     [AI Invoice Classifier] <-- [OpenAI Chat Model]
                     |
                     v
        [Parse AI Classification]
                     |
                     v
            [Convert to XML]
                     |
                     v
           [Format XML Output]
                     |
                     v
          [Needs Approval?] (If)
              /           \
      Yes (>$5000)      No (Auto)
            |               |
    [Email Approval]  [Slack Notify]
            |               |
            +-------+-------+
                    |
                    v
       [Archive to Google Sheets]
                    |
                    v
         [Respond to Webhook]
```

## Configuration Guide
1. **Google Drive**: Set the folder ID to monitor in the Drive Trigger node
2. **Approval Threshold**: Defaults to $5,000; adjust in the "Needs Approval?" node
3. **Email Recipients**: Configure finance-approvers@example.com
4. **Slack Channel**: Set #finance-notifications for updates
5. **GL Codes**: The AI suggests codes; customize them in the AI prompt if needed
6. **Google Sheets**: Configure the document for the invoice archive
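The conditional routing in the "Needs Approval?" node is simple enough to express directly. A minimal sketch of the rule; the numeric confidence cutoff is an assumption, since the description only says "low confidence":

```javascript
// Sketch of the "Needs Approval?" routing rule.
const APPROVAL_THRESHOLD = 5000; // USD, per the Configuration Guide default
const MIN_CONFIDENCE = 0.8;      // assumed cutoff; tune to your risk tolerance

// true  -> route to [Email Approval] (human review)
// false -> route to [Slack Notify] (auto-post)
function needsApproval(invoice) {
  return invoice.total > APPROVAL_THRESHOLD ||
         invoice.confidence < MIN_CONFIDENCE;
}
```

Either condition alone is enough to escalate, so a small but ambiguous invoice still gets a human look.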
by giangxai
## Overview
Automatically generate viral short-form health videos using AI and publish them to social platforms with n8n and Veo 3. This workflow collects viral ideas, analyzes engagement patterns, generates AI video scripts, renders videos with Veo 3, and handles publishing and tracking fully automatically, with no manual editing.

## Who is this for?
This template is ideal for:
- Content creators building faceless health channels (Shorts, Reels, TikTok)
- Affiliate marketers promoting health products with video content
- AI marketers running high-volume short-form content funnels
- Automation builders combining LLMs, video AI, and n8n
- Teams that want a scalable, repeatable system for viral AI video production

If you want to create health-niche videos at scale without manually scripting, rendering, and uploading each video, this workflow is for you.

## What problem is this workflow solving?
Creating viral short-form health videos usually involves many manual steps and disconnected tools, such as:
- Manually collecting and validating viral content ideas
- Writing hooks and scripts for each video
- Switching between AI tools for analysis and video generation
- Waiting for videos to render and checking status manually
- Uploading videos and tracking what has been published

This workflow connects all of these steps into a single automated pipeline and removes the repetitive manual work.
## What this workflow does
This automated AI health video workflow:
1. Runs on a defined schedule
2. Collects viral health content ideas from external sources
3. Normalizes and stores ideas in Google Sheets
4. Loads pending viral ideas for processing
5. Analyzes each idea and generates AI-optimized video scripts
6. Creates AI videos automatically using the Veo 3 API
7. Waits for video rendering and checks completion status
8. Retrieves the final rendered videos
9. Optionally aggregates or merges video assets
10. Publishes videos to social platforms
11. Updates Google Sheets with processing and publishing results

The entire process runs end to end with minimal human intervention.

## Setup

### 1. Prepare Google Sheets
Create a Google Sheet to manage your content pipeline with columns such as:
- **idea / topic**: Viral idea or source content
- **analysis**: AI analysis or hook summary
- **script**: Generated video script
- **status**: pending / processing / completed / failed
- **video_url**: Final rendered video link
- **publish_result**: Publishing status or notes

Only rows marked as pending will be processed by the workflow.

### 2. Connect Google Sheets
- Authenticate your Google Sheets account in n8n
- Select the spreadsheet in the load and update nodes
- Ensure the workflow can write status updates back to the same sheet

### 3. Configure AI & Veo 3
- Add credentials for your AI model (e.g. Gemini or similar)
- Configure the prompt logic for health-niche content
- Add your Veo 3 API credentials
- Test video creation with a small number of ideas before scaling

### 4. Configure Publishing & Schedule
- Set up publishing credentials for your target social platforms
- Open the Schedule triggers and define how often the workflow runs
- The schedule controls how frequently new AI health videos are created and published

## How to customize this workflow to your needs
You can adapt this workflow without changing the core structure:
- Replace the viral idea sources with your own research or internal data
- Adjust AI prompts for different health sub-niches
- Add manual approval steps before video creation
- Disable publishing and use the workflow only for video generation
- Add retry logic for failed renders or API errors
- Extend the workflow with analytics or performance tracking

## Best practices
- Start with a small batch of test ideas
- Keep status values consistent in Google Sheets
- Focus on strong hooks for health-related content
- Monitor the rendering and publishing nodes during early runs
- Adjust the schedule frequency based on API limits

## Documentation
For a full walkthrough and advanced customization ideas, see the Video Guide.
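The wait-and-check rendering step is a poll-until-terminal loop. A hedged sketch of the status-handling decision; the status values ("completed"/"failed") and fields are assumptions, since the actual Veo 3 response shape depends on the API endpoint and version you use:

```javascript
// Hedged sketch of the render-polling decision for a Veo 3 job.
// Status names and fields are assumptions; check your Veo 3 API response.
function nextAction(job) {
  if (job.status === 'completed') {
    return { done: true, videoUrl: job.videoUrl }; // write to video_url column
  }
  if (job.status === 'failed') {
    return { done: true, error: job.error || 'render failed' }; // mark row failed
  }
  return { done: false, retryInMs: 30000 }; // loop back through a 30s Wait node
}
```

In n8n this maps to an IF node after the status check: the "not done" branch feeds a Wait node that loops back, while the terminal branches update the Google Sheets status column.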