by lin@davoy.tech
The YogiAI workflow automates sending daily yoga pose reminders and related information via Line Push Messages. This automation leverages data from a Google Sheets database containing yoga pose details such as names, image URLs, and links to ensure users receive personalized and engaging content every day.

**Purpose**
- Provide users with daily yoga pose suggestions tailored to their practice.
- Deliver visually appealing and informative content through Line's Flex Messages, including images and clickable links.
- Log user interactions and preferences back into Google Sheets to refine future recommendations.

**Key Features**
- **Automated Daily Reminders:** Sends a curated list of yoga poses at a scheduled time (21:30 Bangkok time).
- **Dynamic Content Generation:** Uses AI to rewrite and format messages in a user-friendly manner, complete with emojis and clear instructions.
- **Integration with Google Sheets:** Pulls data from a predefined Google Sheet and logs interactions for continuous improvement.
- **Customizable Messaging:** Ensures JSON outputs are properly formatted for Line's Flex Message API, allowing for interactive and visually rich content.

**Data Source: Google Sheets Structure**
The workflow relies on a Google Sheet structured as follows:
- **PoseName:** The name of the yoga pose.
- **uri:** The image URL representing the pose.
- **url:** A clickable link directing users to more information about the pose.

Sample Data Layout

| PoseName | uri | url |
| --- | --- | --- |
| Supine Angle | https://example.com/SupineAngle-tn146.png | https://example.com/pose/SupineAngle |
| Warrior II | https://example.com/WarriorII-tn146.png | https://example.com/pose/WarriorII |

*Note: Ensure that you update the Google Sheet with your own data. Refer to this sample sheet for reference.*

**Scheduled Trigger**
The workflow is triggered daily at 21:30 (9:30 PM) Bangkok time (Asia/Bangkok). This ensures timely delivery of reminders to users, keeping them engaged with their yoga practice.

**Workflow Process**
Data Retrieval
- Node: Get PoseName - Fetches yoga pose details from the specified range in the Google Sheet.
Content Generation
- Node: WritePosesToday - Utilizes Azure OpenAI to craft user-friendly text, complete with emojis and clear instructions.
- Node: RewritePosesToday - Formats the AI-generated text specifically for Line messaging, ensuring compatibility and visual appeal.
JSON Formatting
- Node: WriteJSONflex - Generates the JSON structures required for Line's Flex Messages, enabling carousel displays of yoga pose images and links (see the sketch at the end of this description).
- Node: Fix JSON - Ensures all JSON outputs are correctly formatted before being sent via Line.
Message Delivery
- Node: Line Push with Flex Bubble - Sends the final message, including both text and Flex Message carousels, directly to users via Line Push Messages.
Logging Interactions
- Nodes: YogaLog & YogaLog2 - Log each interaction back into Google Sheets to track which poses were sent and how often they appear, refining future recommendations.

**Setup Prerequisites**
- **Google Sheets Account:** Set up a Google Sheet with the required structure and populate it with your yoga pose data.
- **Line Developer Account:** Create a Line channel to obtain the necessary credentials for sending push messages.
- **Azure OpenAI Account:** Configure access to Azure OpenAI services for generating and formatting content.

**Intended Audience**
This workflow is ideal for:
- **Yoga Instructors:** Seeking to engage students with daily pose suggestions.
- **Fitness Enthusiasts:** Looking to maintain consistency in their yoga practice.
- **Content Creators:** Interested in automating personalized and visually appealing content distribution.
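For orientation, here is a minimal sketch, following the public Line Flex Message format, of the kind of carousel JSON the WriteJSONflex and Fix JSON nodes need to produce from the sheet rows. The code is illustrative only; the template's actual nodes may build this differently.

```javascript
// Illustrative Code-node sketch: build a Line Flex carousel from the sheet rows.
// Field names (PoseName, uri, url) come from the Google Sheet described above.
const poses = $input.all().map(item => item.json);

const bubbles = poses.map(pose => ({
  type: 'bubble',
  hero: {
    type: 'image',
    url: pose.uri,                              // pose thumbnail from the sheet
    size: 'full',
    aspectRatio: '20:13',
    aspectMode: 'cover',
    action: { type: 'uri', uri: pose.url },     // tapping opens the pose page
  },
  body: {
    type: 'box',
    layout: 'vertical',
    contents: [{ type: 'text', text: pose.PoseName, weight: 'bold', size: 'md' }],
  },
}));

// A Flex carousel accepts at most 12 bubbles per message.
return [{
  json: {
    type: 'flex',
    altText: 'Your yoga poses for today',
    contents: { type: 'carousel', contents: bubbles.slice(0, 12) },
  },
}];
```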
by Jez
Summary
This n8n workflow implements an AI-powered "Local Event Finder" agent. It takes user criteria (like event type, city, date, and interests), uses a suite of search tools (Brave Web Search, Brave Local Search, Google Gemini Search) and a web scraper (Jina AI) to find relevant events, and returns formatted details. The entire agent is exposed as a single, easy-to-use MCP (Model Context Protocol) tool, making it simple to integrate into other workflows or applications. This template cleverly combines the MCP server endpoint and the AI agent logic into a single n8n workflow file for ease of import and management.

**Key Features**
- **Intelligent Multi-Tool Search:** Dynamically utilizes web search, precise local search, and advanced Gemini semantic search to find events.
- **Detailed Information via Web Scraping:** Employs Jina AI to extract comprehensive details directly from event web pages.
- **Simplified MCP Tool Exposure:** Makes the complex event-finding logic available as a single, callable tool for other MCP-compatible clients (e.g., Roo Code, Cline, other n8n workflows).
- **Customizable AI Behavior:** The core AI agent's behavior, tool usage strategy, and output formatting can be tailored by modifying its System Prompt.
- **Modular Design:** Uses distinct nodes for the LLM, memory, and each external tool, allowing for easier modification or extension.

**Benefits**
- **Simplifies Client-Side Integration:** Offloads the complexity of event searching and data extraction from client applications.
- **Provides Richer Event Data:** Goes beyond simple search links to extract and format key event details.
- **Flexible & Adaptable:** Can be adjusted to various event search needs and can incorporate new tools or data sources.
- **Efficient Processing:** Leverages specialized tools for different aspects of the search process.

**Nodes Used**
- MCP Trigger
- Tool Workflow
- Execute Workflow Trigger
- AI Agent
- Google Gemini Chat Model (ChatGoogleGenerativeAI)
- Simple Memory (Window Buffer Memory)
- MCP Client (for Brave Search tools via Smithery)
- Google Gemini Search Tool
- Jina AI Tool

**Prerequisites**
- An active n8n instance.
- **Google AI API Key:** For the Gemini LLM (Google Gemini Chat Model node) and the Google Gemini Search Tool. Ensure your key is enabled for these services.
- **Jina AI API Key:** For the jina_ai_web_page_scraper node. A free tier is often available.
- **Access to a Brave Search MCP Provider (Optional but Recommended):** This template uses MCP Client nodes configured for Brave Search via a provider like Smithery. You'll need an account/API key for your chosen Brave Search MCP provider to configure the smithery brave search credential. Alternatively, you could adapt these to call the Brave Search API directly if you manage your own access, or replace them with other search tools.

**Setup Instructions**
1. Import Workflow: Download the JSON file for this template and import it into your n8n instance.
2. Configure Credentials:
   - Google Gemini LLM: Locate the Google Gemini Chat Model node. Select or create a "Google Gemini API" credential (named Google Gemini Context7 in the template) using your Google AI API Key.
   - Google Gemini Search Tool: Locate the google_gemini_event_search node. Select or create a "Gemini API" credential (named Gemini Credentials account in the template) using your Google AI API Key (ensure it's enabled for Search/Vertex AI).
   - Jina AI Web Scraper: Locate the jina_ai_web_page_scraper node. Select or create a "Jina AI API" credential (named Jina AI account in the template) using your Jina AI API Key.
   - Brave Search (via MCP): You'll need an MCP Client HTTP API credential to connect to your Brave Search MCP provider (e.g., Smithery). Create a new "MCP Client HTTP API" credential in n8n and name it, for example, smithery brave search. Configure it with the Base URL and any required authentication (e.g., an API key in the headers) for your Brave Search MCP provider. Locate the brave_web_search and brave_local_search MCP Client nodes in the workflow and assign the smithery brave search credential (or your named credential) to both of these nodes.
3. Activate Workflow: Ensure the workflow is active.
4. Note MCP Trigger Path: Locate the local_event_finder (MCP Trigger) node. The Path field (e.g., 0ca88864-ec0a-4c27-a7ec-e28c5a900697) combined with your n8n webhook base URL forms the endpoint for client calls (see the example payload below). Example Endpoint: YOUR_N8N_INSTANCE_URL/webhooks/PATH-TO-MCP-SERVER

**Customization**
- **AI Behavior:** Modify the "System Message" parameter within the event_finder_agent node to change the AI's persona, its strategy for using tools, or the desired output format.
- **LLM Model:** Swap the Google Gemini Chat Model node for another compatible LLM node (e.g., OpenAI Chat Model) if desired. You'll need to adjust credentials and potentially the system prompt.
- **Tools:** Add, remove, or replace tool nodes (e.g., use a different search provider, add a weather API tool) and update the event_finder_agent's system prompt and tool configuration accordingly.
- **Scraping Depth:** Be mindful of the jina_ai_web_page_scraper's usage due to potential timeouts. The system prompt already guides the LLM on this, but you can adjust its usage instructions.
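For reference, this is roughly the JSON-RPC payload an MCP client sends when it invokes the exposed tool. The tool name and argument keys are assumptions based on the description above, and transport details (SSE vs. plain HTTP) depend on the MCP client you use.

```javascript
// Illustrative only: the JSON-RPC request an MCP client sends to invoke the tool.
// Tool name and argument keys are assumptions; check the MCP Trigger and agent
// configuration for the real ones.
const payload = {
  jsonrpc: '2.0',
  id: 1,
  method: 'tools/call',
  params: {
    name: 'local_event_finder',
    arguments: {
      event_type: 'live music',
      city: 'Brisbane',
      date_range: 'this weekend',
      interests: 'jazz, small venues',
    },
  },
};

console.log(JSON.stringify(payload, null, 2));
```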
by Automate With Marc
This workflow contains community nodes that are only compatible with the self-hosted version of n8n. Automate your entire video content creation pipeline with this AI-powered, no-code workflow built in n8n. Watch Step-by-step video guide here: https://www.youtube.com/watch?v=x7nHpcggpX8&t=5s This template connects a suite of smart tools to help you generate scroll-stopping short video ideas based on daily trending topics and auto-deliver them via email—ready for production in Veo 3. 🔧 How it works: Scheduled Trigger (Daily) Kicks off the process each day at your chosen time. Tavily Agent (Web Search) Searches the latest trends, viral moments, or market news based on your e-commerce brand (e.g. “Sally’s Closet”). OpenAI GPT-4 Agent (Creative Brainstorming) Generates high-conversion marketing video ideas based on your brand’s tone and what’s trending. Prompt Formatter for Veo 3 Converts the idea into a cinematic-style prompt, optimized for Veo’s video generation engine (via FAL API). Send via Gmail The final Veo 3 prompt is emailed to you or your creative team for immediate use or manual refinement. Watch full step-by-step Tutorial Build Video: https://youtu.be/x7nHpcggpX8 🧠 Use Cases: E-commerce brands that need fresh daily content Marketing teams looking to automate creative ideation Solopreneurs building a lean video production engine Anyone experimenting with Veo 3 prompt-based storytelling 🛠️ Tools used: n8n Scheduled Trigger Tavily Node (for real-time web search) Langchain Agent (GPT-4 via OpenAI) FAL API (Veo 3 prompt delivery) Gmail Node (send final output) ⚡️ Ready-to-use. Fully editable. Zero coding required. 💡 Pro Tip: You can hook this up with the Veo 3 generation API (FAL) to complete the automation end-to-end!
by Adam Janes
This workflow gives you the ability to reply to a long email with a voice note, rather than having to type everything out. ChatGPT will format your audio response and create an email draft for you.

How it works
When a new email arrives in your inbox, the workflow checks whether it needs a response, and if it does, it sends a message to you on Telegram via a VoiceEmailer bot. When you reply to that message with an audio message, the second part of this workflow is triggered. It checks if the message is in the right format (see the sketch below), transcribes the audio, and creates a draft response that shows up in the same email thread.

Set up steps
- Add your credentials for Gmail and OpenAI.
- Create a Telegram bot following the instructions here.
- Connect your Telegram credentials so the workflow will use your bot.
- Turn on the workflow, and message the bot from your Telegram.
- Find the Chat ID from the Executions tab of your workflow, and enter it as a variable.
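A hedged sketch of the format check on the Telegram reply: continue only when the update is a voice note from your own chat. The MY_CHAT_ID constant stands in for the Chat ID you copy from the Executions tab; the template's actual nodes may implement this with an IF node instead.

```javascript
// Hedged sketch of the "right format" check on the Telegram trigger output.
const msg = $input.first().json.message ?? {};
const MY_CHAT_ID = 123456789; // your Chat ID from the Executions tab (placeholder)

const isVoiceReply = Boolean(msg.voice) && msg.chat?.id === MY_CHAT_ID;

return [{ json: { isVoiceReply, fileId: msg.voice?.file_id ?? null } }];
```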
by Floyd Mahou
🔍 How it works
This workflow turns WhatsApp into a smart email command center using AI. Users can speak or type instructions like:
- "Send a follow-up to Claire"
- "Write a draft email to Claire to confirm tomorrow's meeting at 5 PM"
- "What is the name of Claire's firm?"
The agent transcribes voice notes, extracts intent with GPT (see the example at the end of this description), interacts with Gmail (send, draft, search), and replies with a confirmation via WhatsApp — either as text or a voice message.

⚙️ Key Modules Used
- WhatsApp Business Webhook (Meta)
- OpenAI Whisper (voice transcription)
- GPT (intent + content generation)
- Gmail (search, draft, send)
- Airtable (contact lookup + memory logging)

🧠 Memory Layer (Optional)
The agent logs key fields in Airtable:
- Recipient email
- Company / job title
- And more...
This creates a lightweight "gut memory" so the agent feels context-aware.

🗺️ Setup Steps
- Connect the WhatsApp Business API (via the Meta Developer Console)
- Add OpenAI and Gmail credentials in n8n
- Link your Airtable base for contacts and logging

🧩 Best Use Cases
- Hands-free email replies while commuting
- Fast Gmail access for busy consultants / solopreneurs
- Custom business agents for service-based professionals

⏱️ Estimated Setup Time
30–60 minutes

✅ Requirements
- WhatsApp Business Cloud access
- OpenAI API Key
- Gmail or Google Workspace
- Airtable account (free plan OK)
- n8n instance (cloud or self-hosted with HTTPS)
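As a rough illustration of the "extracts intent with GPT" step, the agent can be prompted to return a small structured object that the Gmail and Airtable nodes then act on. Every field name below is an assumption for illustration, not the template's actual schema.

```javascript
// Illustrative only: a structured intent the GPT step could be asked to return
// before the Gmail and Airtable nodes act on it. All field names are assumptions.
const intent = {
  action: 'draft',                 // 'send' | 'draft' | 'search'
  contactName: 'Claire',           // resolved to an email address via Airtable
  subject: "Tomorrow's meeting",
  body: 'Hi Claire, just confirming our meeting tomorrow at 5 PM.',
  replyAs: 'voice',                // how the confirmation goes back on WhatsApp
};

return [{ json: intent }];
```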
by Incrementors
Google Maps Business Phone No Scraper with Bright Data & Sheets

Overview
This n8n workflow automates the process of scraping business phone numbers and information from Google Maps using the Bright Data API and saves the results to Google Sheets.

Workflow Components

1. Form Trigger - Submit Location and Keywords
- Type: Form Trigger
- Purpose: Start the workflow when a form is submitted
- Fields: Location (required), Keywords (required)
- Configuration: Form Title: "GMB"; Webhook ID: 8b72dcdf-25a1-4b63-bb44-f918f7095d5d

2. Bright Data API - Request Business Data
- Type: HTTP Request
- Purpose: Sends the scraping request to the Bright Data API
- Method: POST
- URL: https://api.brightdata.com/datasets/v3/trigger
- Query Parameters: dataset_id: gd_m8ebnr0q2qlklc02fz; include_errors: true; type: discover_new; discover_by: location; limit_per_input: 2
- Headers: Authorization: Bearer BRIGHT_DATA_API_KEY
- Request Body:

```json
{
  "input": [
    {
      "country": "{{ $json.Location }}",
      "keyword": "{{ $json.keywords }}",
      "lat": ""
    }
  ],
  "custom_output_fields": [
    "url", "country", "name", "address", "description", "open_hours",
    "reviews_count", "rating", "reviews", "services_provided",
    "open_website", "phone_number", "permanently_closed",
    "photos_and_videos", "people_also_search"
  ]
}
```

3. Check Scraping Status
- Type: HTTP Request
- Purpose: Check if data scraping is completed
- Method: GET
- URL: https://api.brightdata.com/datasets/v3/progress/{{ $json.snapshot_id }}
- Query Parameters: format: json
- Headers: Authorization: Bearer BRIGHT_DATA_API_KEY

4. Check If Status Ready
- Type: Conditional (IF)
- Purpose: Determine if scraping is ready or needs to wait
- Condition: {{ $json.status }} equals "ready"

5. Wait Before Retry
- Type: Wait
- Purpose: Pause 1 minute before checking the status again
- Duration: 1 minute
- Webhook ID: 7047efad-de41-4608-b95c-d3e0203ef620

6. Check Records Exist
- Type: Conditional (IF)
- Purpose: Proceed only if business records are found
- Condition: {{ $json.records }} not equals 0

7. Fetch Business Data
- Type: HTTP Request
- Purpose: Get business information including phone numbers
- Method: GET
- URL: https://api.brightdata.com/datasets/v3/snapshot/{{ $json.snapshot_id }}
- Query Parameters: format: json
- Headers: Authorization: Bearer BRIGHT_DATA_API_KEY

8. Save to Google Sheets
- Type: Google Sheets
- Purpose: Store business data in Google Sheets
- Operation: Append
- Document ID: YOUR_GOOGLE_SHEET_ID
- Sheet Name: GMB
- Column Mapping: **Name:** {{ $json.name }}; **Address:** {{ $json.address }}; **Rating:** {{ $json.rating }}; **Phone Number:** {{ $json.phone_number }}; **URL:** {{ $json.url }}

Workflow Flow (see the sketch at the end of this description)
1. Start: User submits the form with location and keywords
2. Request: Send the scraping request to the Bright Data API
3. Monitor: Check scraping status periodically
4. Wait Loop: If not ready, wait 1 minute and check again
5. Validate: Ensure records exist before proceeding
6. Fetch: Retrieve the scraped business data
7. Save: Store results in Google Sheets

Setup Requirements

API Keys & Credentials
- **Bright Data API Key:** Replace BRIGHT_DATA_API_KEY with your actual API key
- **Google Sheets OAuth2:** Configure with your Google Sheets credential ID
- **Google Sheet ID:** Replace YOUR_GOOGLE_SHEET_ID with your actual sheet ID

Google Sheets Setup
- Create a Google Sheet with a tab named "GMB"
- Ensure the following columns exist: Name, Address, Rating, Phone Number, URL

Workflow Status
- **Active:** No (currently inactive)
- **Execution Order:** v1
- **Version ID:** 0bed9bf1-00a3-4eb6-bf7c-cf07bee006a2
- **Workflow ID:** Hm7iTSgpu2of6gz4

Notes
- The workflow includes a retry mechanism with 1-minute waits
- Data validation ensures only successful scrapes are processed
- All business information is automatically saved to Google Sheets
- The form trigger allows easy initiation of scraping jobs

For any questions or support, please contact: info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
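For orientation, the status-check, wait, and fetch nodes together behave roughly like the standalone sketch below, which reuses the endpoints quoted above. Error handling and retry limits are omitted, and the BRIGHT_DATA_API_KEY environment variable is just a stand-in for the credential.

```javascript
// Standalone sketch of the poll-and-fetch loop (nodes 3 to 7), not the exact nodes.
const headers = { Authorization: `Bearer ${process.env.BRIGHT_DATA_API_KEY}` };

async function fetchWhenReady(snapshotId) {
  for (;;) {
    const progress = await fetch(
      `https://api.brightdata.com/datasets/v3/progress/${snapshotId}?format=json`,
      { headers },
    ).then(res => res.json());

    if (progress.status === 'ready') {
      if (!progress.records) throw new Error('No business records found'); // Check Records Exist
      break;
    }
    await new Promise(resolve => setTimeout(resolve, 60_000)); // Wait Before Retry: 1 minute
  }

  // Fetch Business Data: array of records including name, address, phone_number, url
  return fetch(
    `https://api.brightdata.com/datasets/v3/snapshot/${snapshotId}?format=json`,
    { headers },
  ).then(res => res.json());
}
```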
by David Ashby
Complete MCP server exposing all ProfitWell Tool operations to AI agents. Zero configuration needed - all 2 operations pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator... all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every ProfitWell Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders (see the example below)
• Native Integration: Uses the official n8n ProfitWell Tool node with full error handling

📋 Available Operations (2 total)
Every possible ProfitWell Tool operation is included:
🔧 Company (1 operation)
• Get settings for your company
🔧 Metric (1 operation)
• Get a metric

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options
Response Format: Native ProfitWell Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every ProfitWell Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
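The $fromAI() placeholders mentioned above are n8n expressions that the connected agent fills in at call time. A minimal illustration follows; the parameter key and description are made up for this example and are not the template's exact wiring.

```javascript
// Roughly how a pre-built tool parameter is wired: an n8n expression whose
// $fromAI() placeholder the connected agent fills in at call time.
// The key and description below are illustrative, not the template's exact wiring.
const parameterValue = "={{ $fromAI('metric_id', 'The ProfitWell metric the user asked for', 'string') }}";
console.log(parameterValue);
```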
by David Ashby
🛠️ Plivo Tool MCP Server

Complete MCP server exposing all Plivo Tool operations to AI agents. Zero configuration needed - all 3 operations pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator... all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Plivo Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Plivo Tool node with full error handling

📋 Available Operations (3 total)
Every possible Plivo Tool operation is included:
🔧 Call (1 operation)
• Make a call
🔧 Mms (1 operation)
• Send an MMS
🔧 Sms (1 operation)
• Send an SMS

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options
Response Format: Native Plivo Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every Plivo Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by Shiv Gupta
Pinterest Keyword-Based Content Scraper with AI Agent & BrightData Automation

Overview
This n8n workflow automates Pinterest content scraping based on user-provided keywords using BrightData's API and a Claude Sonnet 4 AI agent. The system intelligently processes keywords, initiates scraping jobs, monitors progress, and formats the extracted data into structured outputs.

Architecture Components

🧠 AI-Powered Controller
- **Claude Sonnet 4 Model**: Processes and understands keywords before initiating the scrape
- **AI Agent**: Acts as the intelligent controller coordinating all scraping steps

📥 Data Input
- **Form Trigger**: User-friendly keyword input interface
- **Keywords Field**: Required input field for Pinterest search terms

🚀 Scraping Pipeline
1. Launch Scraping Job: Sends keywords to the BrightData API
2. Status Monitoring: Continuously checks scraping progress
3. Data Retrieval: Downloads completed scraped content
4. Data Processing: Formats and structures the raw data
5. Storage: Saves results to Google Sheets

Workflow Nodes

1. Pinterest Keyword Input
- **Type**: Form Trigger
- **Purpose**: Entry point for user keyword submission
- **Configuration**: Form title: "Pinterest"; Required field: "Keywords"

2. Anthropic Chat Model
- **Type**: Language Model (Claude Sonnet 4)
- **Model**: claude-sonnet-4-20250514
- **Purpose**: AI-powered keyword processing and workflow orchestration

3. Keyword-based Scraping Agent
- **Type**: AI Agent
- **Purpose**: Orchestrates the entire scraping process
- **Instructions**: Initiates Pinterest scraping with the provided keywords; monitors scraping status until completion; downloads the final scraped data; presents the raw scraped data as output

4. BrightData Pinterest Scraping
- **Type**: HTTP Request Tool
- **Method**: POST
- **Endpoint**: https://api.brightdata.com/datasets/v3/trigger
- **Parameters**: dataset_id: gd_lk0sjs4d21kdr7cnlv; include_errors: true; type: discover_new; discover_by: keyword; limit_per_input: 2
- **Purpose**: Creates a new scraping snapshot based on keywords

5. Check Scraping Status
- **Type**: HTTP Request Tool
- **Method**: GET
- **Endpoint**: https://api.brightdata.com/datasets/v3/progress/{snapshot_id}
- **Purpose**: Monitors scraping job progress
- **Returns**: Status values like "running" or "ready"

6. Fetch Pinterest Snapshot Data
- **Type**: HTTP Request Tool
- **Method**: GET
- **Endpoint**: https://api.brightdata.com/datasets/v3/snapshot/{snapshot_id}
- **Purpose**: Downloads the completed scraped data
- **Trigger**: Executes when status is "ready"

7. Format & Extract Pinterest Content
- **Type**: Code Node (JavaScript); see the sketch at the end of this description
- **Purpose**: Parses and structures the raw scraped data
- **Extracted Fields**: URL, Post ID, Title, Content, Date Posted, User, Likes & Comments, Media, Image URL, Categories, Hashtags

8. Save Pinterest Data to Google Sheets
- **Type**: Google Sheets Node
- **Operation**: Append
- **Mapped Columns**: Post URL, Title, Content, Image URL

9. Wait for 1 Minute (Disabled)
- **Type**: Code Tool
- **Purpose**: Adds a delay between status checks (currently disabled)
- **Duration**: 60 seconds

Setup Requirements

Required Credentials
1. Anthropic API - Credential ID: ANTHROPIC_CREDENTIAL_ID; required for Claude Sonnet 4 access
2. BrightData API - API Key: BRIGHT_DATA_API_KEY; required for the Pinterest scraping service
3. Google Sheets OAuth2 - Credential ID: GOOGLE_SHEETS_CREDENTIAL_ID; required for data storage

Configuration Placeholders
Replace the following placeholders with actual values:
- WEBHOOK_ID_PLACEHOLDER: Form trigger webhook ID
- GOOGLE_SHEET_ID_PLACEHOLDER: Target Google Sheets document ID
- WORKFLOW_VERSION_ID: n8n workflow version
- INSTANCE_ID_PLACEHOLDER: n8n instance identifier
- WORKFLOW_ID_PLACEHOLDER: Unique workflow identifier

Data Flow
User Input (Keywords) → AI Agent Processing (Claude) → BrightData Scraping Job Creation → Status Monitoring Loop → Data Retrieval (when ready) → Content Formatting & Extraction → Google Sheets Storage

Output Data Structure
Each scraped Pinterest pin contains:
- **URL**: Direct link to the Pinterest pin
- **Post ID**: Unique Pinterest identifier
- **Title**: Pin title/heading
- **Content**: Pin description text
- **Date Posted**: Publication timestamp
- **User**: Pinterest username
- **Engagement**: Likes and comments count
- **Media**: Media type information
- **Image URL**: Direct image link
- **Categories**: Pin categorization tags
- **Hashtags**: Associated hashtags
- **Comments**: User comments text

Usage Instructions
1. Initial Setup: Configure all required API credentials, replace placeholder values with actual IDs, and create the target Google Sheets document.
2. Running the Workflow: Access the form trigger URL, enter the desired Pinterest keywords, and submit the form to initiate scraping.
3. Monitoring Progress: The AI agent will automatically handle status monitoring; no manual intervention is required during scraping.
4. Accessing Results: Structured data will be automatically saved to Google Sheets; each run appends new data to the existing sheet.

Technical Notes
- **Rate Limiting**: The BrightData API has built-in rate limiting
- **Data Limits**: The current configuration limits results to 2 pins per keyword
- **Status Polling**: Automatic status checking until completion
- **Error Handling**: Includes error capture in scraping requests
- **Async Processing**: Supports long-running scraping jobs

Customization Options
- **Adjust Data Limits**: Modify the limit_per_input parameter
- **Enable Wait Timer**: Activate the disabled wait node for longer jobs
- **Custom Data Fields**: Modify the formatting code for additional fields
- **Alternative Storage**: Replace Google Sheets with other storage options

Sample Google Sheets Template
Create a copy of the sample sheet structure: https://docs.google.com/spreadsheets/d/SAMPLE_SHEET_ID/edit
Required columns: Post URL, Title, Content, Image URL

Troubleshooting
- **Authentication Errors**: Verify all API credentials are correctly configured
- **Scraping Failures**: Check BrightData API status and rate limits
- **Data Formatting Issues**: Review the JavaScript formatting code for parsing errors
- **Google Sheets Errors**: Ensure proper OAuth2 permissions and sheet access

For any questions or support, please contact: Email or fill out this form
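A hedged sketch of what the "Format & Extract Pinterest Content" Code node could look like: map the raw BrightData records onto the columns written to Google Sheets. The source field names are assumptions inferred from the field list above, not the exact keys BrightData returns.

```javascript
// Hedged sketch of the "Format & Extract Pinterest Content" Code node.
// Source field names are assumptions inferred from the field list above.
const records = $input.all().map(item => item.json);

return records.map(pin => ({
  json: {
    'Post URL': pin.url ?? '',
    'Title': pin.title ?? '',
    'Content': pin.content ?? pin.description ?? '',
    'Image URL': pin.image_url ?? '',
    'Date Posted': pin.date_posted ?? '',
    'User': pin.user_name ?? '',
    'Likes': pin.likes ?? 0,
    'Comments': pin.comments_count ?? 0,
    'Hashtags': Array.isArray(pin.hashtags) ? pin.hashtags.join(', ') : '',
  },
}));
```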
by Shahrear
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Automatically transform resume submissions into comprehensive candidate profiles with AI-powered parsing, GitHub analysis, and instant team notifications.

What this workflow does
- Monitors Gmail for incoming resume attachments
- Extracts structured data using VLM Run AI document parsing
- Analyzes GitHub profiles with deep repository intelligence (30+ frameworks detected); see the sketch at the end of this description
- Creates comprehensive candidate profiles with technical skills assessment
- Delivers multi-channel notifications via Google Sheets, Slack, and candidate emails

Setup
Prerequisites: Gmail account, VLM Run API credentials, Google Sheets access, Slack workspace, self-hosted n8n. You need to install the VLM Run community node.

Quick Setup:
1. Configure Gmail OAuth2 for resume monitoring
2. Add VLM Run API credentials for document parsing
3. Create a Google Sheets candidate database
4. Set up Slack integration for team notifications
5. Update spreadsheet/channel IDs in workflow nodes
6. Test with a sample resume and activate

Perfect for
- HR departments and technical recruiting teams
- Startup hiring and talent acquisition agencies
- Developer assessment and skill evaluation
- Remote team hiring and candidate screening
- Any organization seeking data-driven hiring decisions

Key Benefits
- **Eliminates manual data entry** - AI extracts all contact info, skills, and experience
- **GitHub intelligence engine** - Analyzes repositories, calculates experience, detects technologies
- **Comprehensive skill assessment** - Identifies programming languages, frameworks, and project metrics
- **Professional candidate experience** - Automated acknowledgment emails with personalized touches
- **Instant team collaboration** - Rich Slack notifications with GitHub profile highlights
- **Structured data storage** - Searchable candidate database with 20+ data columns
- **Saves hours per candidate** - Transforms 30-minute manual reviews into instant insights

How to customize
Extend by adding:
- Integration with ATS systems (Greenhouse, Lever, BambooHR)
- LinkedIn profile analysis and social media insights
- Automated interview scheduling based on qualifications
- Skills-based candidate scoring and ranking algorithms
- Integration with code assessment platforms
- Multi-language resume support and translation
- Custom evaluation criteria and filtering rules
- Advanced GitHub metrics (code quality, contribution patterns)

This workflow revolutionizes technical hiring by combining AI-powered resume parsing with deep GitHub analysis, delivering comprehensive candidate intelligence that empowers data-driven hiring decisions while maintaining a professional candidate experience.
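As one possible shape for the GitHub analysis step, the sketch below pulls a candidate's public repositories via the GitHub REST API and tallies primary languages. This is an assumption about the approach, not the template's actual implementation, which also detects frameworks and estimates experience.

```javascript
// One possible shape for the GitHub analysis: tally primary languages across
// a candidate's public repositories using the GitHub REST API.
async function summarizeGithub(username) {
  const repos = await fetch(
    `https://api.github.com/users/${username}/repos?per_page=100&sort=updated`,
    { headers: { Accept: 'application/vnd.github+json' } },
  ).then(res => res.json());

  if (!Array.isArray(repos)) return { repoCount: 0, languages: {} }; // user not found, rate limit, etc.

  const languages = {};
  for (const repo of repos) {
    if (repo.language) languages[repo.language] = (languages[repo.language] ?? 0) + 1;
  }

  return { repoCount: repos.length, languages };
}
```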
by Davide
This workflow automates the process of creating a complete SEO-optimized blog post, including generating content, titles, images, and meta tags, and publishing it on WordPress. It leverages AI models (like DeepSeek and OpenRouter) for content generation and SEO optimization, and integrates with Google Sheets, WordPress, and OpenAI for image generation. This is a powerful tool for automating the creation and optimization of blog posts, saving time and ensuring high-quality, SEO-friendly content. It integrates multiple tools and AI models to deliver a seamless content creation experience. Below is a breakdown of the workflow:

1. How It Works
The workflow is designed to streamline the creation of SEO-friendly blog posts. Here's how it works:
- Trigger: The workflow starts with a Manual Trigger node, which initiates the process when the user clicks "Test workflow."
- Fetch Context: The Google Sheets node retrieves the blog post context (e.g., topic, keywords) from a predefined Google Sheet.
- Generate Content: The Generate Article node uses an AI model (DeepSeek) to create an SEO-friendly blog post based on the fetched context. The Generate Title node creates a compelling, keyword-rich title for the blog post.
- Publish to WordPress: The Add Draft to WP node creates a draft post in WordPress with the generated content and title (see the sketch at the end of this description).
- Generate and Upload Image: The Generate Image node uses OpenAI to create a realistic, blog-appropriate image. The Upload Image node uploads the image to the WordPress media library. The Set Image node associates the uploaded image as the featured image for the blog post.
- SEO Optimization: The SEO Expert node analyzes the blog post and generates optimized meta titles and descriptions using an AI model (OpenRouter). The Set Metatag node updates the WordPress post with the generated meta tags.
- Update Google Sheets: The Update Sheet and Finish Work nodes update the Google Sheet with the post's details, including the URL, title, and metadata.

2. Set Up Steps
To set up and use this workflow in n8n, follow these steps:
- Google Sheets Setup: Create a Google Sheet with columns for ID POST, PROMPT, TITLE, URL, METATITLE, and METADESCRIPTION. Link the Google Sheet to the Get Context node by providing the Document ID and Sheet Name.
- WordPress Integration: Set up WordPress credentials in n8n for the Add Draft to WP, Upload Image, and Set Image nodes. Ensure the WordPress site is accessible via its REST API.
- AI Model Configuration: Configure the DeepSeek and OpenRouter credentials in n8n for the Generate Article, Generate Title, and SEO Expert nodes. Ensure the AI models are correctly set up to generate content and meta tags.
- Image Generation: Set up OpenAI credentials for the Generate Image node to create blog post images.
- Meta Tag Optimization: The SEO Expert node uses OpenRouter to generate meta titles and descriptions. Ensure the node is configured to analyze the blog post content and produce SEO-friendly tags.
- Workflow Execution: Click the "Test workflow" button to trigger the workflow. The workflow will: fetch the blog post context from Google Sheets; generate the article, title, and image; publish the draft to WordPress; upload and set the featured image; generate and apply meta tags; and update the Google Sheet with the post details.
- Optional Customization: Modify the workflow to include additional SEO optimizations, such as internal linking, keyword density analysis, or social media sharing.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
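Under the hood, the "Add Draft to WP" step amounts to a call like the following against the standard WordPress REST API. This is a hedged Code-node-style sketch with a placeholder site URL and application password; the real node handles this for you once credentials are configured.

```javascript
// Hedged sketch of creating the WordPress draft via the standard REST API.
// Replace the site URL and application password with your own values.
const auth = Buffer.from('wp_user:application_password').toString('base64');

const draft = await fetch('https://example.com/wp-json/wp/v2/posts', {
  method: 'POST',
  headers: { Authorization: `Basic ${auth}`, 'Content-Type': 'application/json' },
  body: JSON.stringify({
    title: 'AI-generated, keyword-rich title',      // from the Generate Title node
    content: '<p>SEO-friendly article body...</p>', // from the Generate Article node
    status: 'draft',
    // featured_media: <media ID returned by the image upload step>,
  }),
}).then(res => res.json());

return [{ json: { postId: draft.id, link: draft.link } }];
```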
by Karam Ghazzi
Description 📄
Turn your Slack workspace into a smart AI-powered HelpDesk using this workflow. This automation listens to Slack messages and uses an AI assistant (powered by OpenAI or any other LLM) to respond to employee questions about HR, IT, or internal policies by referencing your internal documentation (such as the Policy Handbook). If the answer isn't available, it can optionally email the relevant department (HR or IT) and ask them to update the handbook. It remembers recent messages per user, cleans up intermediate responses to keep Slack threads tidy, and ensures your team gets consistent and helpful answers—without manually searching docs or escalating simple questions. Perfect for growing teams who want to streamline internal support using n8n, Slack, and AI.

How it works 🛠️
This workflow turns n8n into a Slack-based HelpDesk assistant powered by AI. It listens to Slack messages using the Events API, detects whether a real user is asking a question, and responds using OpenAI (or another LLM of your choice). Here's how it works step-by-step:
1. Webhook Trigger: The workflow starts when a message is posted in Slack via the Events API. It filters out any messages from bots to avoid loops (see the sketch at the end of this description).
2. Identify the User: It fetches the full Slack profile of the user who posted the message and stores their name.
3. Send Receipt Message: An initial message is sent to the user saying, "I'm on it!", confirming their request is being processed.
4. AI Response Handling: The message is processed using the OpenAI Chat model (GPT-4o by default). Before responding, it checks if the query matches any HR or IT policy from the Policy Handbook. If the question can't be answered based on internal data, it can optionally alert the HR or IT department via Gmail (after user confirmation).
5. Memory Retention: It keeps track of the last 5 interactions per user using Simple Memory, so it remembers previous context in a Slack conversation.
6. Cleanup and Final Reply: It deletes the initial receipt message and sends a final, clean response to the user.

How to use 🚀
1. Clone the Workflow: Download or import the JSON workflow into your n8n instance.
2. Connect Your Credentials: Slack API (for messaging); Google Sheets API (for department contact info); Google Docs API (for the Policy Handbook); Gmail API (optional, for notifying departments); OpenAI or another AI model.
3. Slack Setup: Set up a Slack App and enable the Events API. Subscribe to message events and point them to the Webhook URL generated by the workflow.
4. Customize Responses: Edit the initial and final Slack message nodes if you want to personalize the wording. Swap out the LLM (ChatGPT) with your preferred model in the AI Agent node.
5. Adjust AI Behavior: Tune the prompt logic in the "AI Agent" node if you want the AI to behave differently or access different data sources.
6. Expand Memory or Integrations: Use external databases to store longer histories. Integrate with tools like Asana, Notion, or CRM platforms for further automation.

Requirements 📋
- n8n (self-hosted or cloud)
- Slack Developer Account & App
- OpenAI (or any LLM provider)
- Google Sheets with department contact details
- Google Docs containing the Policy Handbook
- Gmail account (optional, for email alerts)
- Knowledge of Slack Events API setup
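A hedged sketch of the webhook's first filtering step: answer Slack's URL-verification handshake and ignore bot-authored messages so the assistant never replies to itself. The exact node wiring in the template may differ.

```javascript
// Hedged sketch of the first filtering step on the incoming Slack event.
const body = $input.first().json.body ?? $input.first().json;

// Echo the challenge so Slack accepts the Events API endpoint.
if (body.type === 'url_verification') {
  return [{ json: { challenge: body.challenge } }];
}

const event = body.event ?? {};
// Ignore bot posts and message edits/deletions so the assistant never answers itself.
const isHumanMessage = event.type === 'message' && !event.bot_id && !event.subtype;

return [{ json: { isHumanMessage, user: event.user, text: event.text } }];
```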