by Jez
Summary

This n8n workflow implements an AI-powered "Local Event Finder" agent. It takes user criteria (such as event type, city, date, and interests), uses a suite of search tools (Brave Web Search, Brave Local Search, Google Gemini Search) and a web scraper (Jina AI) to find relevant events, and returns formatted details. The entire agent is exposed as a single, easy-to-use MCP (Model Context Protocol) tool, making it simple to integrate into other workflows or applications. The template combines the MCP server endpoint and the AI agent logic into a single n8n workflow file for ease of import and management.

Key Features

- **Intelligent Multi-Tool Search:** Dynamically combines web search, precise local search, and Gemini semantic search to find events.
- **Detailed Information via Web Scraping:** Employs Jina AI to extract comprehensive details directly from event web pages.
- **Simplified MCP Tool Exposure:** Makes the event-finding logic available as a single, callable tool for other MCP-compatible clients (e.g., Roo Code, Cline, other n8n workflows).
- **Customizable AI Behavior:** The agent's behavior, tool-usage strategy, and output formatting can be tailored by modifying its system prompt.
- **Modular Design:** Uses distinct nodes for the LLM, memory, and each external tool, allowing easier modification or extension.

Benefits

- **Simplifies Client-Side Integration:** Offloads the complexity of event searching and data extraction from client applications.
- **Provides Richer Event Data:** Goes beyond simple search links to extract and format key event details.
- **Flexible & Adaptable:** Can be adjusted to various event-search needs and can incorporate new tools or data sources.
- **Efficient Processing:** Leverages specialized tools for different aspects of the search process.

Nodes Used

- MCP Trigger
- Tool Workflow
- Execute Workflow Trigger
- AI Agent
- Google Gemini Chat Model (ChatGoogleGenerativeAI)
- Simple Memory (Window Buffer Memory)
- MCP Client (for Brave Search tools via Smithery)
- Google Gemini Search Tool
- Jina AI Tool

Prerequisites

- An active n8n instance.
- **Google AI API Key:** For the Gemini LLM (Google Gemini Chat Model node) and the Google Gemini Search Tool. Ensure your key is enabled for these services.
- **Jina AI API Key:** For the jina_ai_web_page_scraper node. A free tier is often available.
- **Access to a Brave Search MCP Provider (optional but recommended):** This template uses MCP Client nodes configured for Brave Search via a provider such as Smithery. You'll need an account/API key for your chosen Brave Search MCP provider to configure the "smithery brave search" credential. Alternatively, you can adapt these nodes to call the Brave Search API directly if you manage your own access, or replace them with other search tools.

Setup Instructions

1. **Import Workflow:** Download the JSON file for this template and import it into your n8n instance.
2. **Configure Credentials:**
   - **Google Gemini LLM:** Locate the Google Gemini Chat Model node. Select or create a "Google Gemini API" credential (named "Google Gemini Context7" in the template) using your Google AI API Key.
   - **Google Gemini Search Tool:** Locate the google_gemini_event_search node. Select or create a "Gemini API" credential (named "Gemini Credentials account" in the template) using your Google AI API Key (ensure it is enabled for Search/Vertex AI).
   - **Jina AI Web Scraper:** Locate the jina_ai_web_page_scraper node. Select or create a "Jina AI API" credential (named "Jina AI account" in the template) using your Jina AI API Key.
   - **Brave Search (via MCP):** You'll need an MCP Client HTTP API credential to connect to your Brave Search MCP provider (e.g., Smithery). Create a new "MCP Client HTTP API" credential in n8n and name it, for example, "smithery brave search". Configure it with the Base URL and any required authentication (e.g., an API key in headers) for your Brave Search MCP provider. Locate the brave_web_search and brave_local_search MCP Client nodes in the workflow and assign this credential to both of them.
3. **Activate Workflow:** Ensure the workflow is active.
4. **Note the MCP Trigger Path:** Locate the local_event_finder (MCP Trigger) node. The Path field (e.g., 0ca88864-ec0a-4c27-a7ec-e28c5a900697) combined with your n8n webhook base URL forms the endpoint for client calls. Example endpoint: YOUR_N8N_INSTANCE_URL/webhooks/PATH-TO-MCP-SERVER (see the client sketch below).

Customization

- **AI Behavior:** Modify the "System Message" parameter within the event_finder_agent node to change the AI's persona, its strategy for using tools, or the desired output format.
- **LLM Model:** Swap the Google Gemini Chat Model node for another compatible LLM node (e.g., OpenAI Chat Model) if desired. You'll need to adjust credentials and possibly the system prompt.
- **Tools:** Add, remove, or replace tool nodes (e.g., use a different search provider, add a weather API tool) and update the event_finder_agent's system prompt and tool configuration accordingly.
- **Scraping Depth:** Be mindful of the jina_ai_web_page_scraper's usage due to potential timeouts. The system prompt already guides the LLM on this, but you can adjust its usage instructions.
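For illustration, here is a minimal, hedged sketch of how an external client could call the exposed local_event_finder tool. It assumes the MCP Python SDK's SSE client and a single free-text query argument; the real transport, tool name, and argument schema are defined by the MCP Trigger and agent nodes in your imported workflow, so treat every name below as an assumption and verify against your instance.

```python
# Hedged sketch: assumes the MCP Python SDK's SSE client and that the n8n MCP
# Trigger exposes a tool named "local_event_finder" accepting a free-text query.
# Verify the endpoint URL, tool name, and argument schema against your workflow.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

MCP_URL = "https://YOUR_N8N_INSTANCE_URL/webhooks/PATH-TO-MCP-SERVER"  # from the MCP Trigger node

async def main() -> None:
    async with sse_client(MCP_URL) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()          # confirm the exposed tool name
            print([tool.name for tool in tools.tools])
            result = await session.call_tool(
                "local_event_finder",
                {"query": "live jazz events in Bristol this weekend"},
            )
            print(result)

if __name__ == "__main__":
    asyncio.run(main())
```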
by Lakshit Ukani
Who is this for?

Content creators, social media managers, digital marketers, and businesses looking to automate video production without expensive equipment or technical expertise.

What problem is this workflow solving?

Traditional video creation requires cameras, editing software, voice-recording equipment, and hours of post-production work. This workflow removes those barriers by automatically generating professional videos with audio from nothing more than text prompts.

What this workflow does

This automated workflow takes video ideas from Google Sheets, generates optimized prompts using AI, creates videos through Google's Veo 3 model via Fal AI, monitors generation progress, and writes the final video URLs back to your spreadsheet for easy access and management.

Setup

1. Sign up for a Fal AI account and obtain an API key.
2. Create a Google Sheet with video-idea and status columns.
3. Configure n8n with the required credentials (Google Sheets, Fal AI API).
4. Import the workflow template.
5. Set up authentication for all connected services.
6. Test with a sample video idea.

How to customize this workflow to your needs

Modify the AI prompts to match your brand voice, adjust video styles and camera movements, change the polling interval for video-generation status (see the sketch below), customize the Google Sheet column mappings, and add further processing steps such as thumbnail generation or social-media posting.
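The "change polling intervals" customization refers to the submit-then-poll pattern the workflow uses while a video renders. Below is a minimal, hedged Python sketch of that pattern only; the endpoint URLs, header format, and response fields are placeholders (the template's HTTP Request nodes hold the real ones), so treat all of them as assumptions.

```python
# Hedged sketch of the submit -> poll -> fetch pattern used for video generation.
# All URLs and field names below are illustrative placeholders, not a real API.
import time
import requests

API_KEY = "YOUR_FAL_API_KEY"
HEADERS = {"Authorization": f"Key {API_KEY}"}  # header format is an assumption

POLL_INTERVAL_SECONDS = 30  # the value you would tune in the workflow's wait step

def generate_video(prompt: str) -> str:
    # 1. Submit the generation request (placeholder endpoint).
    job = requests.post(
        "https://example-video-api.invalid/jobs", headers=HEADERS, json={"prompt": prompt}
    ).json()

    # 2. Poll the status endpoint until the job is finished.
    while True:
        status = requests.get(
            f"https://example-video-api.invalid/jobs/{job['id']}", headers=HEADERS
        ).json()
        if status.get("state") == "completed":
            return status["video_url"]  # 3. URL written back to the Google Sheet
        time.sleep(POLL_INTERVAL_SECONDS)

print(generate_video("A drone shot of a mountain lake at sunrise"))
```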
by Incrementors
Google Maps Business Phone No Scraper with Bright Data & Sheets

Overview

This n8n workflow automates the process of scraping business phone numbers and information from Google Maps using the Bright Data API and saves the results to Google Sheets.

Workflow Components

1. Form Trigger - Submit Location and Keywords
   - Type: Form Trigger
   - Purpose: Start the workflow when a form is submitted
   - Fields: Location (required), Keywords (required)
   - Configuration: Form Title "GMB"; Webhook ID 8b72dcdf-25a1-4b63-bb44-f918f7095d5d

2. Bright Data API - Request Business Data
   - Type: HTTP Request
   - Purpose: Sends the scraping request to the Bright Data API
   - Method: POST
   - URL: https://api.brightdata.com/datasets/v3/trigger
   - Query Parameters: dataset_id: gd_m8ebnr0q2qlklc02fz, include_errors: true, type: discover_new, discover_by: location, limit_per_input: 2
   - Headers: Authorization: Bearer BRIGHT_DATA_API_KEY
   - Request Body:

        {
          "input": [
            {
              "country": "{{ $json.Location }}",
              "keyword": "{{ $json.keywords }}",
              "lat": ""
            }
          ],
          "custom_output_fields": [
            "url", "country", "name", "address", "description", "open_hours",
            "reviews_count", "rating", "reviews", "services_provided",
            "open_website", "phone_number", "permanently_closed",
            "photos_and_videos", "people_also_search"
          ]
        }

3. Check Scraping Status
   - Type: HTTP Request
   - Purpose: Check whether data scraping is completed
   - Method: GET
   - URL: https://api.brightdata.com/datasets/v3/progress/{{ $json.snapshot_id }}
   - Query Parameters: format: json
   - Headers: Authorization: Bearer BRIGHT_DATA_API_KEY

4. Check If Status Ready
   - Type: Conditional (IF)
   - Purpose: Determine whether scraping is ready or needs to wait
   - Condition: {{ $json.status }} equals "ready"

5. Wait Before Retry
   - Type: Wait
   - Purpose: Pause 1 minute before checking the status again
   - Duration: 1 minute
   - Webhook ID: 7047efad-de41-4608-b95c-d3e0203ef620

6. Check Records Exist
   - Type: Conditional (IF)
   - Purpose: Proceed only if business records are found
   - Condition: {{ $json.records }} not equals 0

7. Fetch Business Data
   - Type: HTTP Request
   - Purpose: Get business information, including phone numbers
   - Method: GET
   - URL: https://api.brightdata.com/datasets/v3/snapshot/{{ $json.snapshot_id }}
   - Query Parameters: format: json
   - Headers: Authorization: Bearer BRIGHT_DATA_API_KEY
8. Save to Google Sheets
   - Type: Google Sheets
   - Purpose: Store business data in Google Sheets
   - Operation: Append
   - Document ID: YOUR_GOOGLE_SHEET_ID
   - Sheet Name: GMB
   - Column Mapping: Name: {{ $json.name }}, Address: {{ $json.address }}, Rating: {{ $json.rating }}, Phone Number: {{ $json.phone_number }}, URL: {{ $json.url }}

Workflow Flow

1. Start: User submits the form with a location and keywords
2. Request: Send the scraping request to the Bright Data API
3. Monitor: Check the scraping status periodically
4. Wait Loop: If not ready, wait 1 minute and check again
5. Validate: Ensure records exist before proceeding
6. Fetch: Retrieve the scraped business data
7. Save: Store the results in Google Sheets

(A standalone Python sketch of this sequence appears after this section.)

Setup Requirements

API Keys & Credentials
- **Bright Data API Key:** Replace BRIGHT_DATA_API_KEY with your actual API key
- **Google Sheets OAuth2:** Configure with your Google Sheets credential ID
- **Google Sheet ID:** Replace YOUR_GOOGLE_SHEET_ID with your actual sheet ID

Google Sheets Setup
- Create a Google Sheet with a tab named "GMB"
- Ensure the following columns exist: Name, Address, Rating, Phone Number, URL

Workflow Status
- **Active:** No (currently inactive)
- **Execution Order:** v1
- **Version ID:** 0bed9bf1-00a3-4eb6-bf7c-cf07bee006a2
- **Workflow ID:** Hm7iTSgpu2of6gz4

Notes
- The workflow includes a retry mechanism with 1-minute waits
- Data validation ensures only successful scrapes are processed
- All business information is automatically saved to Google Sheets
- The form trigger allows easy initiation of scraping jobs

For any questions or support, please contact info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
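For reference, the trigger/poll/fetch sequence described above can be reproduced outside n8n with plain HTTP calls. The sketch below mirrors the endpoints, query parameters, and status checks listed in the node descriptions; the shape of Bright Data's responses beyond snapshot_id, status, and records is an assumption, so verify it against your dataset before relying on it.

```python
# Sketch of the Bright Data trigger -> progress -> snapshot flow described above.
# Endpoints and parameters mirror the HTTP Request nodes; response fields beyond
# snapshot_id / status / records are assumptions.
import time
import requests

API_KEY = "BRIGHT_DATA_API_KEY"
HEADERS = {"Authorization": f"Bearer {API_KEY}"}
BASE = "https://api.brightdata.com/datasets/v3"

def scrape_businesses(location: str, keywords: str) -> list[dict]:
    # 1. Trigger a new discovery job (node 2).
    trigger = requests.post(
        f"{BASE}/trigger",
        headers=HEADERS,
        params={
            "dataset_id": "gd_m8ebnr0q2qlklc02fz",
            "include_errors": "true",
            "type": "discover_new",
            "discover_by": "location",
            "limit_per_input": 2,
        },
        json={
            "input": [{"country": location, "keyword": keywords, "lat": ""}],
            "custom_output_fields": ["url", "name", "address", "rating", "phone_number"],
        },
    ).json()
    snapshot_id = trigger["snapshot_id"]

    # 2. Poll progress until the snapshot is ready (nodes 3-5).
    while True:
        progress = requests.get(
            f"{BASE}/progress/{snapshot_id}", headers=HEADERS, params={"format": "json"}
        ).json()
        if progress.get("status") == "ready":
            break
        time.sleep(60)  # mirrors the 1-minute Wait node

    # 3. Skip empty runs, then fetch the records (nodes 6-7).
    if progress.get("records") == 0:
        return []
    return requests.get(
        f"{BASE}/snapshot/{snapshot_id}", headers=HEADERS, params={"format": "json"}
    ).json()

rows = scrape_businesses("US", "plumbers in Austin")
```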
by David Ashby
🛠️ Twilio Tool MCP Server

Complete MCP server exposing all Twilio Tool operations to AI agents. Zero configuration needed - all 2 operations pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Twilio Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Twilio Tool node with full error handling

📋 Available Operations (2 total)

Every possible Twilio Tool operation is included:

🔧 Call (1 operation)
• Make a call

🔧 Sms (1 operation)
• Send an SMS/MMS/WhatsApp message

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Twilio Tool API responses with the full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every Twilio Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by David Ashby
Complete MCP server exposing all ProfitWell Tool operations to AI agents. Zero configuration needed - all 2 operations pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every ProfitWell Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n ProfitWell Tool node with full error handling

📋 Available Operations (2 total)

Every possible ProfitWell Tool operation is included:

🔧 Company (1 operation)
• Get settings for your company

🔧 Metric (1 operation)
• Get a metric

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native ProfitWell Tool API responses with the full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every ProfitWell Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by David Ashby
🛠️ Plivo Tool MCP Server

Complete MCP server exposing all Plivo Tool operations to AI agents. Zero configuration needed - all 3 operations pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Plivo Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Plivo Tool node with full error handling

📋 Available Operations (3 total)

Every possible Plivo Tool operation is included:

🔧 Call (1 operation)
• Make a call

🔧 Mms (1 operation)
• Send an MMS

🔧 Sms (1 operation)
• Send an SMS

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Plivo Tool API responses with the full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every Plivo Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by David Ashby
🛠️ Philips Hue Tool MCP Server

Complete MCP server exposing all Philips Hue Tool operations to AI agents. Zero configuration needed - all 4 operations pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Philips Hue Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Philips Hue Tool node with full error handling

📋 Available Operations (4 total)

Every possible Philips Hue Tool operation is included:

🔧 Light (4 operations)
• Delete a light
• Get a light
• Get many lights
• Update a light

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Philips Hue Tool API responses with the full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every Philips Hue Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by David Ashby
Complete MCP server exposing all Mocean Tool operations to AI agents. Zero configuration needed - all 2 operations pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Mocean Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Mocean Tool node with full error handling

📋 Available Operations (2 total)

Every possible Mocean Tool operation is included:

🔧 Sms (1 operation)
• Send an SMS

🔧 Voice (1 operation)
• Send a voice message

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Mocean Tool API responses with the full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every Mocean Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by Shiv Gupta
Pinterest Keyword-Based Content Scraper with AI Agent & BrightData Automation

Overview

This n8n workflow automates Pinterest content scraping based on user-provided keywords using BrightData's API and a Claude Sonnet 4 AI agent. The system intelligently processes keywords, initiates scraping jobs, monitors progress, and formats the extracted data into structured outputs.

Architecture Components

🧠 AI-Powered Controller
- **Claude Sonnet 4 Model**: Processes and understands keywords before initiating the scrape
- **AI Agent**: Acts as the intelligent controller coordinating all scraping steps

📥 Data Input
- **Form Trigger**: User-friendly keyword input interface
- **Keywords Field**: Required input field for Pinterest search terms

🚀 Scraping Pipeline
1. Launch Scraping Job: Sends keywords to the BrightData API
2. Status Monitoring: Continuously checks scraping progress
3. Data Retrieval: Downloads the completed scraped content
4. Data Processing: Formats and structures the raw data
5. Storage: Saves results to Google Sheets

Workflow Nodes

1. Pinterest Keyword Input
   - **Type**: Form Trigger
   - **Purpose**: Entry point for user keyword submission
   - **Configuration**: Form title "Pinterest"; required field "Keywords"

2. Anthropic Chat Model
   - **Type**: Language Model (Claude Sonnet 4)
   - **Model**: claude-sonnet-4-20250514
   - **Purpose**: AI-powered keyword processing and workflow orchestration

3. Keyword-based Scraping Agent
   - **Type**: AI Agent
   - **Purpose**: Orchestrates the entire scraping process
   - **Instructions**: Initiates Pinterest scraping with the provided keywords, monitors scraping status until completion, downloads the final scraped data, and presents the raw scraped data as output

4. BrightData Pinterest Scraping
   - **Type**: HTTP Request Tool
   - **Method**: POST
   - **Endpoint**: https://api.brightdata.com/datasets/v3/trigger
   - **Parameters**: dataset_id: gd_lk0sjs4d21kdr7cnlv, include_errors: true, type: discover_new, discover_by: keyword, limit_per_input: 2
   - **Purpose**: Creates a new scraping snapshot based on the keywords

5. Check Scraping Status
   - **Type**: HTTP Request Tool
   - **Method**: GET
   - **Endpoint**: https://api.brightdata.com/datasets/v3/progress/{snapshot_id}
   - **Purpose**: Monitors scraping job progress
   - **Returns**: Status values such as "running" or "ready"

6. Fetch Pinterest Snapshot Data
   - **Type**: HTTP Request Tool
   - **Method**: GET
   - **Endpoint**: https://api.brightdata.com/datasets/v3/snapshot/{snapshot_id}
   - **Purpose**: Downloads the completed scraped data
   - **Trigger**: Executes when the status is "ready"

7. Format & Extract Pinterest Content
   - **Type**: Code Node (JavaScript)
   - **Purpose**: Parses and structures the raw scraped data
   - **Extracted Fields**: URL, Post ID, Title, Content, Date Posted, User, Likes & Comments, Media, Image URL, Categories, Hashtags

8. Save Pinterest Data to Google Sheets
   - **Type**: Google Sheets Node
   - **Operation**: Append
   - **Mapped Columns**: Post URL, Title, Content, Image URL
9. Wait for 1 Minute (Disabled)
   - **Type**: Code Tool
   - **Purpose**: Adds a delay between status checks (currently disabled)
   - **Duration**: 60 seconds

Setup Requirements

Required Credentials
- **Anthropic API**: Credential ID ANTHROPIC_CREDENTIAL_ID; required for Claude Sonnet 4 access
- **BrightData API**: API Key BRIGHT_DATA_API_KEY; required for the Pinterest scraping service
- **Google Sheets OAuth2**: Credential ID GOOGLE_SHEETS_CREDENTIAL_ID; required for data storage

Configuration Placeholders

Replace the following placeholders with actual values:
- WEBHOOK_ID_PLACEHOLDER: Form trigger webhook ID
- GOOGLE_SHEET_ID_PLACEHOLDER: Target Google Sheets document ID
- WORKFLOW_VERSION_ID: n8n workflow version
- INSTANCE_ID_PLACEHOLDER: n8n instance identifier
- WORKFLOW_ID_PLACEHOLDER: Unique workflow identifier

Data Flow

User Input (Keywords) → AI Agent Processing (Claude) → BrightData Scraping Job Creation → Status Monitoring Loop → Data Retrieval (when ready) → Content Formatting & Extraction → Google Sheets Storage

Output Data Structure

Each scraped Pinterest pin contains:
- **URL**: Direct link to the Pinterest pin
- **Post ID**: Unique Pinterest identifier
- **Title**: Pin title/heading
- **Content**: Pin description text
- **Date Posted**: Publication timestamp
- **User**: Pinterest username
- **Engagement**: Likes and comments count
- **Media**: Media type information
- **Image URL**: Direct image link
- **Categories**: Pin categorization tags
- **Hashtags**: Associated hashtags
- **Comments**: User comment text

(A hedged Python sketch of this field extraction appears after this section.)

Usage Instructions

1. Initial Setup: Configure all required API credentials, replace placeholder values with actual IDs, and create the target Google Sheets document.
2. Running the Workflow: Access the form trigger URL, enter the desired Pinterest keywords, and submit the form to initiate scraping.
3. Monitoring Progress: The AI agent automatically handles status monitoring; no manual intervention is required during scraping.
4. Accessing Results: Structured data is automatically saved to Google Sheets; each run appends new data to the existing sheet.

Technical Notes

- **Rate Limiting**: The BrightData API has built-in rate limiting
- **Data Limits**: The current configuration limits results to 2 pins per keyword
- **Status Polling**: Automatic status checking until completion
- **Error Handling**: Includes error capture in scraping requests
- **Async Processing**: Supports long-running scraping jobs

Customization Options

- **Adjust Data Limits**: Modify the limit_per_input parameter
- **Enable Wait Timer**: Activate the disabled wait node for longer jobs
- **Custom Data Fields**: Modify the formatting code for additional fields
- **Alternative Storage**: Replace Google Sheets with other storage options

Sample Google Sheets Template

Create a copy of the sample sheet structure: https://docs.google.com/spreadsheets/d/SAMPLE_SHEET_ID/edit
Required columns: Post URL, Title, Content, Image URL

Troubleshooting

- **Authentication Errors**: Verify all API credentials are correctly configured
- **Scraping Failures**: Check BrightData API status and rate limits
- **Data Formatting Issues**: Review the JavaScript formatting code for parsing errors
- **Google Sheets Errors**: Ensure proper OAuth2 permissions and sheet access

For any questions or support, please contact: Email or fill out this form
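As a rough guide to what the "Format & Extract Pinterest Content" step does, here is a hedged Python sketch of mapping raw snapshot records to the output fields listed above. The workflow's actual node is JavaScript, and the raw BrightData field names used below (post_id, user_name, likes, and so on) are assumptions; adapt them to the snapshot you actually receive.

```python
# Hedged sketch of the field mapping performed by the "Format & Extract Pinterest
# Content" code node (the real node is JavaScript). Raw field names are assumptions.
def format_pin(raw: dict) -> dict:
    return {
        "Post URL": raw.get("url"),
        "Post ID": raw.get("post_id"),
        "Title": raw.get("title"),
        "Content": raw.get("content") or raw.get("description"),
        "Date Posted": raw.get("date_posted"),
        "User": raw.get("user_name"),
        "Likes": raw.get("likes"),
        "Comments Count": raw.get("comments_count"),
        "Media": raw.get("media_type"),
        "Image URL": raw.get("image_url"),
        "Categories": ", ".join(raw.get("categories") or []),
        "Hashtags": ", ".join(raw.get("hashtags") or []),
    }

# Example: snapshot_records stands in for the list returned by the snapshot endpoint.
snapshot_records = [{"url": "https://www.pinterest.com/pin/123/", "title": "Demo pin"}]
rows = [format_pin(record) for record in snapshot_records]
```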
by David Ashby
🛠️ Gotify Tool MCP Server

Complete MCP server exposing all Gotify Tool operations to AI agents. Zero configuration needed - all 3 operations pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Gotify Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Gotify Tool node with full error handling

📋 Available Operations (3 total)

Every possible Gotify Tool operation is included:

💬 Message (3 operations)
• Create a message
• Delete a message
• Get many messages

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Gotify Tool API responses with the full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every Gotify Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by lin@davoy.tech
The YogiAI workflow automates sending daily yoga pose reminders and related information via Line Push Messages. This automation leverages data from a Google Sheets database containing yoga pose details (names, image URLs, and links) to ensure users receive personalized and engaging content every day.

Purpose

- Provide users with daily yoga pose suggestions tailored to their practice.
- Deliver visually appealing and informative content through Line's Flex Messages, including images and clickable links.
- Log user interactions and preferences back into Google Sheets to refine future recommendations.

Key Features

- **Automated Daily Reminders**: Sends a curated list of yoga poses at a scheduled time (21:30 Bangkok time).
- **Dynamic Content Generation**: Uses AI to rewrite and format messages in a user-friendly manner, complete with emojis and clear instructions.
- **Integration with Google Sheets**: Pulls data from a predefined Google Sheet and logs interactions for continuous improvement.
- **Customizable Messaging**: Ensures JSON outputs are properly formatted for Line's Flex Message API, allowing for interactive and visually rich content.

Data Source

Google Sheets Structure: the workflow relies on a Google Sheet structured as follows:
- **PoseName**: The name of the yoga pose.
- **uri**: The image URL representing the pose.
- **url**: A clickable link directing users to more information about the pose.

Sample Data Layout

| PoseName | uri | url |
| --- | --- | --- |
| Supine Angle | https://example.com/SupineAngle-tn146.png | https://example.com/pose/SupineAngle |
| Warrior II | https://example.com/WarriorII-tn146.png | https://example.com/pose/WarriorII |

*Note: Ensure that you update the Google Sheet with your own data. Refer to the sample sheet for reference.*

Scheduled Trigger

The workflow is triggered daily at 21:30 (9:30 PM) Bangkok time (Asia/Bangkok). This ensures timely delivery of reminders to users, keeping them engaged with their yoga practice.

Workflow Process

1. Data Retrieval (node: Get PoseName): fetches yoga pose details from the specified range in the Google Sheet.
2. Content Generation (nodes: WritePosesToday, RewritePosesToday): WritePosesToday uses Azure OpenAI to craft user-friendly text, complete with emojis and clear instructions; RewritePosesToday formats the AI-generated text specifically for Line messaging, ensuring compatibility and visual appeal.
3. JSON Formatting (nodes: WriteJSONflex, Fix JSON): WriteJSONflex generates the JSON structures required for Line's Flex Messages, enabling carousel displays of yoga pose images and links; Fix JSON ensures all JSON outputs are correctly formatted before being sent via Line.
4. Message Delivery (node: Line Push with Flex Bubble): sends the final message, including both text and a Flex Message carousel, directly to users via Line Push Messages (see the sketch below for an example payload).
5. Logging Interactions (nodes: YogaLog & YogaLog2): logs each interaction back into Google Sheets to track which poses were sent and how often they appear, refining future recommendations.

Setup Prerequisites

- **Google Sheets Account**: Set up a Google Sheet with the required structure and populate it with your yoga pose data.
- **Line Developer Account**: Create a Line channel to obtain the credentials needed for sending push messages.
- **Azure OpenAI Account**: Configure access to Azure OpenAI services for generating and formatting content.

Intended Audience

This workflow is ideal for:
- **Yoga Instructors**: Seeking to engage students with daily pose suggestions.
- **Fitness Enthusiasts**: Looking to maintain consistency in their yoga practice.
- **Content Creators**: Interested in automating personalized and visually appealing content distribution.
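To make the "Line Push with Flex Bubble" step concrete, here is a hedged Python sketch that builds a small Flex carousel from the sheet's PoseName/uri/url columns and sends it through the Line Messaging API push endpoint. The bubble layout is deliberately minimal and the channel token and user ID are placeholders; the workflow's WriteJSONflex node produces a richer structure, so treat this only as an illustration of the payload shape.

```python
# Hedged sketch: builds a minimal Flex Message carousel from sheet rows and pushes
# it via the Line Messaging API. The bubble layout here is simplified compared to
# the WriteJSONflex node's output; the token and user ID are placeholders.
import requests

CHANNEL_ACCESS_TOKEN = "YOUR_LINE_CHANNEL_ACCESS_TOKEN"
USER_ID = "TARGET_LINE_USER_ID"

poses = [  # rows from the Google Sheet: PoseName, uri (image), url (details link)
    {"PoseName": "Supine Angle",
     "uri": "https://example.com/SupineAngle-tn146.png",
     "url": "https://example.com/pose/SupineAngle"},
]

def bubble(pose: dict) -> dict:
    # One carousel card per pose: hero image, pose name, and a link button.
    return {
        "type": "bubble",
        "hero": {"type": "image", "url": pose["uri"], "size": "full"},
        "body": {
            "type": "box",
            "layout": "vertical",
            "contents": [{"type": "text", "text": pose["PoseName"], "weight": "bold"}],
        },
        "footer": {
            "type": "box",
            "layout": "vertical",
            "contents": [{
                "type": "button",
                "action": {"type": "uri", "label": "How to do it", "uri": pose["url"]},
            }],
        },
    }

payload = {
    "to": USER_ID,
    "messages": [{
        "type": "flex",
        "altText": "Your yoga poses for today",
        "contents": {"type": "carousel", "contents": [bubble(p) for p in poses]},
    }],
}

requests.post(
    "https://api.line.me/v2/bot/message/push",
    headers={"Authorization": f"Bearer {CHANNEL_ACCESS_TOKEN}"},
    json=payload,
)
```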
by Dr. Firas
Google Maps Data Extraction Workflow for Lead Generation

This workflow is ideal for sales teams, marketers, entrepreneurs, and researchers looking to efficiently gather detailed business information from Google Maps for:
- Lead generation
- Market analysis
- Competitive research

Who Is This Workflow For?

- **Sales professionals** aiming to build targeted contact lists
- **Marketers** looking for localized business data
- **Researchers** needing organized, comprehensive business information

Problem This Workflow Solves

Manually gathering business contact details from Google Maps is tedious, error-prone, and time-consuming. This workflow automates data extraction to increase efficiency, accuracy, and productivity.

What This Workflow Does

- Automates extraction of business data (name, address, phone, email, website) from Google Maps
- Crawls and extracts additional website content
- Integrates OpenAI to enhance data processing
- Stores structured results in Google Sheets for easy access and analysis
- Uses the Google Search API (via SerpAPI) to fill in missing information

Setup

1. Import the provided n8n workflow JSON into your n8n instance.
2. Set your OpenAI and Google Sheets API credentials.
3. Provide your Google Maps Scraper and Website Content Crawler API keys.
4. Ensure SerpAPI is configured to enhance data completeness (see the sketch below).

Customizing This Workflow to Your Needs

- Adjust scraping parameters: location, business category, country code
- Customize the Google Sheets output format to fit your current data structure
- Integrate additional AI processing steps or APIs for richer data enrichment

Final Notes

This structured approach ensures:
- **Accurate and compliant data extraction** from Google Maps
- Streamlined lead generation
- Actionable and well-organized data ready for business use

📄 Documentation: Notion Guide
🎥 Demo Video: watch the full tutorial here: YouTube Demo
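As an illustration of the "fill in missing information" step, the hedged sketch below queries SerpAPI's Google engine for a business whose website was not found on Google Maps. The query wording and the way the first organic result is picked are assumptions; the workflow's own SerpAPI node may map parameters differently.

```python
# Hedged sketch: uses SerpAPI's Google search to recover a missing website for a
# business scraped from Google Maps. The result parsing below is a simple assumption.
import requests

SERPAPI_KEY = "YOUR_SERPAPI_KEY"

def find_missing_website(business_name: str, city: str) -> str | None:
    response = requests.get(
        "https://serpapi.com/search",
        params={
            "engine": "google",
            "q": f"{business_name} {city} official website",
            "api_key": SERPAPI_KEY,
        },
    ).json()
    organic = response.get("organic_results") or []
    return organic[0]["link"] if organic else None

print(find_missing_website("Joe's Plumbing", "Lyon"))
```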