by Rahul Joshi
**Description**

Keep your CRM pipeline clean and actionable by automatically archiving inactive deals, logging results to Google Sheets, and sending Slack summary reports. This workflow ensures your sales team focuses on active opportunities while maintaining full audit visibility. 🚀📈

**What This Template Does**

- Triggers daily at 9 AM to check all GoHighLevel CRM opportunities. ⏰
- Filters deals that have been inactive for 10+ days using the last activity or update date (see the sketch after the setup steps). 🔍
- Automatically archives inactive deals to keep pipelines clutter-free. 📦
- Formats and logs deal details into Google Sheets for record-keeping. 📊
- Sends a Slack summary report with the total archived count, value, and deal names. 💬

**Key Benefits**

- ✅ Keeps pipelines organized by removing stale opportunities.
- ✅ Saves time through fully automated archiving and reporting.
- ✅ Maintains a transparent audit trail in Google Sheets.
- ✅ Improves sales visibility with automated Slack summaries.
- ✅ Easily adjustable inactivity threshold and scheduling.

**Features**

- Daily scheduled trigger (9 AM) with an adjustable cron expression.
- GoHighLevel CRM integration for fetching and updating opportunities.
- Conditional logic to detect inactivity periods.
- Google Sheets logging with automatic updates.
- Slack integration for real-time reporting and team visibility.

**Requirements**

- GoHighLevel API credentials (OAuth2) with opportunity access.
- Google Sheets OAuth2 credentials with edit permissions.
- Slack bot token with the chat:write permission.
- A connected n8n instance (cloud or self-hosted).

**Target Audience**

- Sales and operations teams managing CRM hygiene.
- Business owners wanting automated inactive-deal cleanup.
- Agencies monitoring client pipelines across teams.
- CRM administrators ensuring data accuracy and accountability.

**Step-by-Step Setup Instructions**

1. Connect your GoHighLevel OAuth2 credentials in n8n. 🔑
2. Link your Google Sheets document and replace the Sheet ID. 📋
3. Configure Slack credentials and specify your target channel. 💬
4. Adjust the inactivity threshold (default: 10 days) as needed. ⚙️
5. Update the cron schedule (default: 9 AM daily). ⏰
6. Test the workflow manually to verify end-to-end automation. ✅
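As a reference for the conditional logic, here is a minimal sketch of what the inactivity check can look like in an n8n Code node. It assumes each opportunity item exposes a `lastActivityDate` or `updatedAt` field — the actual field names depend on your GoHighLevel payload — and mirrors the template's 10-day default:

```javascript
// Minimal inactivity filter for an n8n Code node ("Run Once for All Items").
// Field names are assumptions — adjust them to your GoHighLevel payload.
const INACTIVE_DAYS = 10; // template default; change to tune the threshold
const cutoff = Date.now() - INACTIVE_DAYS * 24 * 60 * 60 * 1000;

return $input.all().filter((item) => {
  const lastTouch = item.json.lastActivityDate || item.json.updatedAt;
  // Keep only deals whose last recorded activity is older than the cutoff
  return lastTouch && new Date(lastTouch).getTime() < cutoff;
});
```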
by Daniel Lianes
Auto-scrape Twitter accounts to WhatsApp groups

This workflow provides automated access to real-time Twitter/X content through intelligent scraping and AI processing. It keeps you at the cutting edge of breaking news, emerging trends, and industry developments by eliminating the need to manually check multiple social media accounts and delivering curated updates directly to your communication channels.

**Overview**

This workflow automatically handles the complete Twitter monitoring process using advanced scraping techniques and AI analysis. It manages API authentication, multi-source data collection, intelligent content filtering, and message delivery with built-in error handling and rate limiting for reliable automation.

Core Function: Real-time social media monitoring that transforms Twitter noise into actionable intelligence, ensuring you're always first to know about the latest trends, product launches, and industry shifts that shape your field.

**Key Capabilities**

- **Real-time trend detection** - Catch breaking news and emerging topics as they happen on X/Twitter
- **Multi-source Twitter monitoring** - Track specific accounts AND trending keyword searches simultaneously
- **AI-powered trend analysis** - Gemini 2.5 Pro filters noise and surfaces only the latest developments that matter
- **Stay ahead of the curve** - Identify emerging technologies, viral discussions, and industry shifts before they go mainstream
- **Flexible delivery options** - Pre-configured for WhatsApp, but easily adaptable for Telegram, Slack, Discord, or even blog content generation
- **Rate limit protection** - Built-in delays and error handling using TwitterAPI.io's reliable, cost-effective infrastructure

**Tools Used**

- **n8n**: The automation platform orchestrating the entire workflow
- **TwitterAPI.io**: Reliable access to Twitter/X data without API complexities
- **OpenRouter**: Gateway to advanced AI models for content processing
- **Gemini 2.5 Pro**: Google's latest AI for intelligent content analysis and formatting
- **Evolution API**: WhatsApp Business API integration for message delivery
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

**How to Install**

IMPORTANT: Before importing this workflow, you need to install the Evolution API community node.

1. Install the community node first: go to Settings > Community Nodes in your n8n instance
2. Add Evolution API: install the n8n-nodes-evolution-api package
3. Restart n8n: allow the new nodes to load properly
4. Import the workflow: download the .json file and import it into your n8n instance
5. Configure Twitter access: set up TwitterAPI.io credentials and add target accounts/keywords
6. Set up AI processing: add your OpenRouter API key for Gemini 2.5 Pro access
7. Configure WhatsApp: set up Evolution API and add your target group ID
8. Test & deploy: run a test execution and schedule it for daily operation

**Use Cases**

- **Stay Ahead of Breaking News**: Be the first to know about industry announcements, product launches, and major developments the moment they hit X/Twitter
- **Spot Trends Before They Explode**: Identify emerging technologies, viral topics, and shifting conversations while they're still building momentum
- **Competitive Intelligence**: Monitor what industry leaders, competitors, and influencers are discussing in real time
- **Brand Surveillance**: Track mentions, discussions, and sentiment around your brand as conversations develop
- **Content Creation Pipeline**: Gather trending topics, viral discussions, and timely content ideas for blogs, newsletters, or social media strategy
- **Market Research**: Collect real-time social sentiment and emerging market signals from X/Twitter conversations
- **Multi-platform Distribution**: While configured for WhatsApp, the structured output can easily feed Telegram bots, Slack channels, Discord servers, or automated blog generation systems

**Find Your WhatsApp Groups**

The workflow includes a helper node to easily find your WhatsApp group IDs (a lookup sketch appears at the end of this description):

1. Use the Fetch Groups node: the workflow includes a dedicated node that fetches all your available WhatsApp groups
2. Run the helper: execute just that node to see a list of all groups with their IDs
3. Copy the group ID: find your target group in the list and copy its ID
4. Update the delivery node: paste the group ID into the final WhatsApp sending node

Group ID format: always ends with @g.us (example: 120363419788967600@g.us)

Pro tip: Test with a small private group first before deploying to your main team channels.

**Connect with Me**

- **LinkedIn**: https://www.linkedin.com/in/daniel-lianes/
- **Discovery Call**: https://cal.com/averis/asesoria
- **Consulting Session**: https://cal.com/averis/consultoria-personalizada

**Was this helpful? Let me know!**

I truly hope this was helpful. Your feedback is very valuable and helps me create better resources.

**Want to take automation to the next level?**

If you're looking to optimize your business processes or need expert help with a project, here's how I can assist you:

- Advisory (Discovery Call): Do you have a process in your business that you'd like to automate but don't know where to start? In this initial call, we'll explore your needs and see if automation is the ideal solution for you. Schedule a Discovery Call
- Personalized Consulting (Paid Session): If you already have a specific problem, an integration challenge, or need hands-on help building a custom workflow, this session is for you. Together, we'll find a powerful solution for your case. Book Your Consulting Session

**Stay Up to Date**

For more tricks, ideas, and news about automation and AI, let's connect on LinkedIn! Follow me on LinkedIn

#n8n #automation #twitter #whatsapp #ai #socialmedia #monitoring #intelligence #gemini #scraping #workflow #nocode #businessautomation #socialmonitoring #contentcuration #teamcommunication #brandmonitoring #trendanalysis #marketresearch #productivity
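As a companion to the Fetch Groups helper described above, the snippet below sketches how a Code node could pick the right group ID out of the fetched list. The `subject` and `id` field names are assumptions based on typical Evolution API group payloads — check the actual output of your Fetch Groups node and adjust accordingly:

```javascript
// Sketch: find a WhatsApp group ID by name in the Fetch Groups output.
// `subject` and `id` are assumed field names — verify them against the
// actual JSON your Evolution API instance returns.
const TARGET_NAME = 'My Team Channel'; // hypothetical group name

const match = $input.all().find(
  (item) => item.json.subject === TARGET_NAME
);

if (!match) {
  throw new Error(`No WhatsApp group named "${TARGET_NAME}" found`);
}

// Group IDs always end with @g.us, e.g. 120363419788967600@g.us
return [{ json: { groupId: match.json.id } }];
```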
by Vitorio Magalhães
🎯 What this workflow does

This workflow automatically monitors Reddit subreddits for new image posts and downloads them to Google Drive. It's perfect for content creators, meme collectors, or anyone who wants to automatically archive images from their favorite subreddits without manual work. The workflow intelligently prevents duplicate downloads by checking existing files in Google Drive and sends you Telegram notifications about the download status, so you always know when new content has been saved.

🚀 Key Features

- **Multi-subreddit monitoring**: Configure multiple subreddits to monitor simultaneously
- **Smart duplicate detection**: Never downloads the same image twice
- **Automated scheduling**: Runs on a customizable cron schedule
- **Real-time notifications**: Get instant Telegram updates about download activity
- **Rate limit friendly**: Built-in delays to respect Reddit's API limits
- **Cloud storage integration**: Direct upload to organized Google Drive folders

📋 Prerequisites

Before using this workflow, you'll need:

- **Reddit developer account**: Create an app at reddit.com/prefs/apps
- **Google Cloud project**: With the Drive API enabled and OAuth2 credentials
- **Telegram bot**: Created via @BotFather, with your chat ID
- **Basic n8n knowledge**: Understanding of credentials and node configuration

⚙️ Setup Instructions

1. Configure Reddit API access
- Visit reddit.com/prefs/apps and create a new "script" type application
- Note your Client ID and Client Secret
- Add Reddit OAuth2 credentials in n8n

2. Set up Google Drive integration
- Enable the Google Drive API in Google Cloud Console
- Create OAuth2 credentials with appropriate scopes
- Configure Google Drive OAuth2 credentials in n8n
- Update the folder ID in the workflow to your desired destination

3. Configure Telegram notifications
- Create a bot via @BotFather on Telegram
- Get your chat ID (message @userinfobot)
- Add Telegram API credentials in n8n

4. Customize your settings — update the Settings node with:
- Your Telegram chat ID
- List of subreddits to monitor (e.g., ['memes', 'funny', 'pics'])
- Optional: adjust the wait time between requests
- Optional: modify the cron schedule

🔄 How it works

1. Scheduled Trigger: the workflow starts automatically based on your cron configuration
2. Random Selection: picks a random subreddit from your configured list
3. Fetch Posts: retrieves the latest 30 posts from the subreddit's "new" section
4. Image Filtering: keeps only posts with i.redd.it image URLs (see the sketch at the end of this description)
5. Duplicate Check: searches Google Drive to avoid re-downloading existing images
6. Download & Upload: downloads new images and uploads them to your Drive folder
7. Notification: sends a Telegram message with the download summary

🛠️ Customization Options

Scheduling
- Modify the cron trigger to run hourly, daily, or at custom intervals
- Add timezone considerations for your location

Content Filtering
- Add upvote threshold filters to get only popular content
- Filter by image dimensions or file size
- Implement NSFW content filtering

Storage & Organization
- Create subfolders by subreddit
- Add date-based folder organization
- Implement file naming conventions

Notifications & Monitoring
- Add Discord webhook notifications
- Create download statistics tracking
- Log failed downloads for debugging

📊 Use Cases

- **Content creators**: Automatically collect memes and trending images for social media
- **Digital marketers**: Monitor visual trends across different communities
- **Researchers**: Archive visual content from specific subreddits for analysis
- **Personal use**: Build a curated collection of images from your favorite subreddits

🎯 Best Practices

- **Respect rate limits**: Keep the wait time between requests to avoid being blocked
- **Monitor storage**: Regularly check Google Drive storage usage
- **Subreddit selection**: Choose active subreddits with regular image posts
- **Credential security**: Use n8n's credential system and never hardcode API keys

🚨 Important Notes

- This workflow only downloads images from i.redd.it (Reddit's image host)
- Some subreddits may have bot restrictions
- Reddit's API has rate limits (~60 requests per minute)
- Ensure your Google Drive has sufficient storage space
- Always comply with Reddit's Terms of Service and content policies
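For reference, the image-filtering step can be expressed as a short Code-node snippet like the one below — a sketch assuming the Reddit node outputs one item per post with a `url` field, which is how Reddit's listing JSON exposes post links:

```javascript
// Sketch: keep only posts that link directly to Reddit-hosted images.
// Assumes each incoming item is one Reddit post with a `url` field.
return $input.all().filter((item) => {
  const url = item.json.url || '';
  // i.redd.it is Reddit's native image host; this skips galleries,
  // videos, and external links
  return url.startsWith('https://i.redd.it/');
});
```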
by Ranjan Dailata
This workflow automates company research and intelligence extraction from Glassdoor, using the Decodo API for data retrieval and Google Gemini for AI-powered summarization.

**Who this is for**

This workflow is ideal for:

- Recruiters, analysts, and market researchers looking for structured insights from company profiles.
- HR tech developers and AI research teams needing a reliable way to extract and summarize Glassdoor data automatically.
- Venture analysts or due diligence teams conducting company research combining structured and unstructured content.
- Anyone who wants instant summaries and insights from Glassdoor company pages without manual scraping.

**What problem this workflow solves**

- **Manual data extraction**: Glassdoor company details and reviews are often scattered and inconsistent, requiring time-consuming copy-paste efforts.
- **Unstructured insights**: Raw reviews contain valuable opinions but are not organized for analytical use.
- **Fragmented company data**: Key metrics like ratings, pros/cons, and FAQs are mixed with irrelevant data.
- **Need for AI summarization**: Business users need a concise, executive-level summary that combines employee sentiment, culture, and overall performance metrics.

This workflow automates data mining, summarization, and structuring, transforming Glassdoor data into ready-to-use JSON and Markdown summaries.

**What this workflow does**

The workflow automates the end-to-end pipeline for Glassdoor company research:

1. Trigger: start manually by clicking "Execute Workflow."
2. Set Input Fields: define company_url (e.g., a Glassdoor company profile link) and geo (country).
3. Extract Raw Data from Glassdoor (Decodo node): uses the Decodo API to fetch company data, including overview, ratings, reviews, and frequently asked questions.
4. Generate Structured Data (Google Gemini + Output Parser): the Structured Data Extractor node (powered by Gemini AI) processes raw data into well-defined fields — company overview (name, size, website, type), ratings breakdown, review snippets (pros, cons, roles), FAQs, and key takeaways.
5. Summarize the Insights (Gemini AI Summarizer): produces a detailed summary highlighting company reputation, work culture, employee sentiment trends, strengths and weaknesses, and hiring recommendations.
6. Merge and Format: combines the structured data and summary into a unified object for output.
7. Export and Save: converts the final report into JSON and writes it to disk as C:\{{CompanyName}}.json.
8. Binary Encoding for File Handling: prepares data in base64 for easy integration with APIs or downloadable reports (see the sketch at the end of this description).

**Setup**

Prerequisites:

- **n8n instance** (cloud or self-hosted)
- **Decodo API credentials** (added as decodoApi)
- **Google Gemini (PaLM) API credentials**
- Access to the Glassdoor company URLs

Make sure to install the Decodo community node.

Steps:

1. Import this workflow JSON file into your n8n instance.
2. Configure your credentials for the Decodo API and the Google Gemini (PaLM) API.
3. Open the Set the Input Fields node and replace: company_url → with the Glassdoor URL; geo → with the region (e.g., India, US, etc.).
4. Execute the workflow.
5. Check your output folder (C:\) for the exported JSON report.

**How to customize this workflow**

You can easily adapt this template to your needs:

- **Add sentiment analysis**: Include another Gemini or OpenAI node to rate sentiment (positive/negative/neutral) per review.
- **Export to Notion or Google Sheets**: Replace the file node with a Notion or Sheets integration for live dashboarding.
- **Multi-company batch mode**: Convert the manual trigger to a spreadsheet or webhook trigger for bulk research automation.
- **Add a visualization layer**: Connect the output to Looker Studio or Power BI for analytical dashboards.
- **Change the output format**: Modify the final write node to generate Markdown or PDF summaries, for example with the pypandoc or reportlab modules.

**Summary**

This n8n workflow combines Decodo web scraping with Google Gemini's reasoning and summarization power to build a fully automated Glassdoor research engine. With a single execution, it:

- Extracts structured company details
- Summarizes thousands of employee reviews
- Delivers insights in an easy-to-consume format

Ideal for: recruitment intelligence, market research, employer branding, and competitive HR analysis.
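A minimal sketch of the binary-encoding step, assuming a preceding node has produced the final report object as the item's JSON — it converts the report to base64 so a downstream file-write or HTTP node can treat it as a file (the property names are illustrative):

```javascript
// Sketch: package the merged report as a base64-encoded binary file
// so downstream file/HTTP nodes can consume it. Field names are
// illustrative — align them with your actual node output.
const report = $input.first().json;
const fileName = `${report.company_name || 'company'}.json`; // assumed field

const buffer = Buffer.from(JSON.stringify(report, null, 2), 'utf8');

return [{
  json: { fileName },
  binary: {
    data: {
      data: buffer.toString('base64'),
      mimeType: 'application/json',
      fileName,
    },
  },
}];
```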
by Gegenfeld
AI Background Removal Workflow

This workflow automatically removes backgrounds from images stored in Airtable using the APImage API 🡥, then downloads and saves the processed images to Google Drive. Perfect for batch processing product photos, portraits, or any images that need clean, transparent backgrounds. The source (Airtable) and the storage (Google Drive) can be changed to any service or database you want/use.

🧩 Nodes Overview

1. Remove Background (Manual Trigger)

This manual trigger starts the background removal process when clicked.

Customization options:
- Replace with a Schedule Trigger for automatic daily/weekly processing
- Replace with a Webhook Trigger to start via API calls
- Replace with a File Trigger to process when new files are added

2. Get a Record (Airtable)

Retrieves media files from your Airtable "Creatives Library" database.
- Connects to the "Media Files" table in your Airtable base
- Fetches records containing image thumbnails for processing
- Returns all matching records with their thumbnail URLs and metadata

Required Airtable structure:
- Table with an image/attachment field (currently expects a "Thumbnail" field)
- Optional fields: File Name, Media Type, Upload Date, File Size

Customization options:
- Replace with Google Sheets, Notion, or any database node
- Add filters to process only specific records
- Change to different tables with image URLs

3. Code (JavaScript Processing)

Processes Airtable records and prepares thumbnail data for background removal.
- Extracts thumbnail URLs from each record
- Chooses the best-quality thumbnail (large > full > original)
- Creates clean filenames by removing special characters
- Adds processing metadata and timestamps

Key features:

```javascript
// Selects best thumbnail quality
if (thumbnail.thumbnails?.large?.url) {
  thumbnailUrl = thumbnail.thumbnails.large.url;
}

// Creates clean filename (replaces special characters with underscores)
cleanFileName: (record.fields['File Name'] || 'unknown')
  .replace(/[^a-zA-Z0-9]/g, '_')
  .toLowerCase()
```

Easy customization for different databases:
- **Product database**: Change field mappings to 'Product Name', 'SKU', 'Category'
- **Portfolio database**: Use 'Project Name', 'Client', 'Tags'
- **Employee database**: Use 'Full Name', 'Department', 'Position'

4. Split Out

Converts the array of thumbnails into individual items for parallel processing.
- Enables processing multiple images simultaneously
- Each item contains all thumbnail metadata for downstream nodes

5. APImage API (HTTP Request)

Calls the APImage service to remove backgrounds from images (a standalone request sketch appears at the end of this description).

API endpoint: POST https://apimage.org/api/ai-remove-background

Request configuration:
- **Header**: Authorization: Bearer YOUR_API_KEY
- **Body**: image_url: {{ $json.originalThumbnailUrl }}

✅ Setup required:
- Replace YOUR_API_KEY with your actual API key
- Get your key from the APImage Dashboard 🡥

6. Download (HTTP Request)

Downloads the processed image from APImage's servers using the returned URL.
- Fetches the background-removed image file
- Prepares image data for upload to storage

7. Upload File (Google Drive)

Saves processed images to your Google Drive in a "bg_removal" folder.

Customization options:
- Replace with Dropbox, OneDrive, AWS S3, or FTP upload
- Create date-based folder structures
- Use dynamic filenames with metadata
- Upload to multiple destinations simultaneously

✨ How To Get Started

1. Set up the APImage API:
- Double-click the APImage API node
- Replace YOUR_API_KEY with your actual API key
- Keep the Bearer prefix

2. Configure Airtable:
- Ensure your Airtable has a table with image attachments
- Update field names in the Code node if they differ from the defaults

3. Test the workflow:
- Click the Remove Background trigger node
- Verify images are processed and uploaded successfully

🔗 Get your API Key 🡥

🔧 How to Customize

Input customization (left section) — replace the Airtable integration with any data source containing image URLs:
- **Google Sheets** with product catalogs
- **Notion** databases with image galleries
- **Webhooks** from external systems
- **File system** monitoring for new uploads
- **Database** queries for image records

Output customization (right section) — modify where processed images are stored:
- **Multiple storage**: Upload to Google Drive + Dropbox simultaneously
- **Database updates**: Update original records with processed image URLs
- **Email/Slack**: Send processed images via communication tools
- **Website integration**: Upload directly to WordPress, Shopify, etc.

Processing customization:
- **Batch processing**: Limit concurrent API calls
- **Quality control**: Add image validation before/after processing
- **Format conversion**: Use a Sharp node for resizing or format changes
- **Metadata preservation**: Extract and maintain EXIF data

📋 Workflow Connections

Remove Background → Get a Record → Code → Split Out → APImage API → Download → Upload File

🎯 Perfect For

- **E-commerce**: Batch process product photos for clean, professional listings
- **Marketing teams**: Remove backgrounds from brand assets and imagery
- **Photographers**: Automate background removal for portrait sessions
- **Content creators**: Prepare images for presentations and social media
- **Design agencies**: Streamline asset preparation workflows

📚 Resources

- APImage API Documentation 🡥
- Airtable API Reference 🡥
- n8n Documentation 🡥

⚡ Processing speed: handles multiple images in parallel for fast batch processing
🔒 Secure: API keys stored safely in n8n credentials
🔄 Reliable: built-in error handling and retry mechanisms
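If you want to test the APImage call outside the workflow, the documented endpoint and parameters above translate into a request like this sketch (Node.js `fetch`; the shape of the response is not documented here, so the snippet just logs it for inspection):

```javascript
// Sketch: call the documented APImage background-removal endpoint.
// Only the endpoint, header, and image_url parameter come from the
// template — inspect the logged response to find the result URL field.
const response = await fetch('https://apimage.org/api/ai-remove-background', {
  method: 'POST',
  headers: {
    Authorization: 'Bearer YOUR_API_KEY', // same key as the HTTP Request node
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    image_url: 'https://example.com/photo.jpg', // your source image URL
  }),
});

const data = await response.json();
console.log(data); // locate the processed-image URL in this payload
```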
by Guillaume Duvernay
Move beyond generic AI-generated content and create articles that are high-quality, factually reliable, and aligned with your unique expertise. This template orchestrates a sophisticated "research-first" content creation process. Instead of simply asking an AI to write an article from scratch, it first uses an AI planner to break your topic down into logical sub-questions. It then queries a Lookio assistant—which you've connected to your own trusted knowledge base of uploaded documents—to build a comprehensive research brief. Only then is this fact-checked brief handed to a powerful AI writer to compose the final article, complete with source links. This is the ultimate workflow for scaling expert-level content creation.

**Who is this for?**

- **Content marketers & SEO specialists:** Scale the creation of authoritative, expert-level blog posts that are grounded in factual, source-based information.
- **Technical writers & subject matter experts:** Transform your complex internal documentation into accessible public-facing articles, tutorials, and guides.
- **Marketing agencies:** Quickly generate high-quality, well-researched drafts for clients by connecting the workflow to their provided brand and product materials.

**What problem does this solve?**

- **Reduces AI "hallucinations":** By grounding the entire writing process in your own trusted knowledge base, the AI generates content based on facts you provide, not on potentially incorrect information from its general training data.
- **Ensures comprehensive topic coverage:** The initial AI-powered "topic breakdown" step acts like an expert outliner, ensuring the final article is well-structured and covers all key sub-topics.
- **Automates source citation:** The workflow is designed to preserve and integrate source URLs from your knowledge base directly into the final article as hyperlinks, boosting credibility and saving you manual effort.
- **Scales expert content creation:** It effectively mimics the workflow of a human expert (outline, research, consolidate, write) but in an automated, scalable, and incredibly fast way.

**How it works**

This workflow follows a sophisticated, multi-step process to ensure the highest quality output:

1. Decomposition: You provide an article title and guidelines via the built-in form. An initial AI call then acts as a "planner," breaking down the main topic into an array of 5-8 logical sub-questions.
2. Fact-based research (RAG): The workflow loops through each of these sub-questions and queries your Lookio assistant. This assistant, which you have pre-configured by uploading your own documents, finds the relevant information and source links for each point.
3. Consolidation: All the retrieved question-and-answer pairs are compiled into a single, comprehensive research brief.
4. Final article generation: This complete, fact-checked brief is handed to a final, powerful AI writer (e.g., GPT-4o). Its instructions are clear: write a high-quality article using only the provided information and integrate the source links as hyperlinks where appropriate.

**Building your own RAG pipeline vs. using Lookio or alternative tools**

Building a RAG system natively within n8n offers deep customization, but it requires managing a toolchain for data processing, text chunking, and retrieval optimization. An alternative is to use a managed service like Lookio, which provides RAG functionality through an API. This approach abstracts the backend infrastructure for document ingestion and querying, trading the granular control of a native build for a reduction in development and maintenance tasks.

**Implementing the template**

1. Set up your Lookio assistant (prerequisite): Lookio is a platform for building intelligent assistants that leverage your organization's documents as a dedicated knowledge base.
- First, sign up at Lookio. You'll get 50 free credits to get started.
- Upload the documents you want to use as your knowledge base.
- Create a new assistant and then generate an API key.
- Copy your Assistant ID and your API Key for the next step.

2. Configure the workflow:
- Connect your AI provider (e.g., OpenAI) credentials to the two Language Model nodes.
- In the Query Lookio Assistant (HTTP Request) node, paste your Assistant ID in the body and add your Lookio API Key for authentication (we recommend using a Bearer Token credential). A hedged sketch of what that request can look like appears at the end of this description.

3. Activate the workflow: Toggle the workflow to "Active" and use the built-in form to generate your first fact-checked article!

**Taking it further**

- **Automate publishing:** Connect the final **Article result** node to a **Webflow** or **WordPress** node to automatically create a draft post in your CMS.
- **Generate content in bulk:** Replace the **Form Trigger** with an **Airtable** or **Google Sheet** trigger to automatically generate a whole batch of articles from your content calendar.
- **Customize the writing style:** Tweak the system prompt in the final **New content - Generate the AI output** node to match your brand's specific tone of voice, add SEO keywords, or include specific calls-to-action.
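For orientation only, here is a sketch of the Query Lookio Assistant request as plain Node.js. The endpoint URL and body field names (`assistant_id`, `query`) are assumptions, not confirmed API details — mirror whatever the template's HTTP Request node already contains:

```javascript
// Sketch only: the endpoint and body fields below are hypothetical —
// copy the real values from the template's HTTP Request node or from
// Lookio's own documentation.
const response = await fetch('https://api.lookio.example/query', { // hypothetical URL
  method: 'POST',
  headers: {
    Authorization: 'Bearer YOUR_LOOKIO_API_KEY',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    assistant_id: 'YOUR_ASSISTANT_ID', // assumed field name
    query: 'One of the planner sub-questions goes here',
  }),
});

console.log(await response.json()); // answer text plus source links
```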
by Jay Emp0
🐱 MemeCoin Art Generator - using Gemini Flash NanoBanana & upload to Twitter

Automatically generates memecoin art and posts it to Twitter (X), powered by Google Gemini, NanoBanana image generation, and n8n automation.

🧩 Overview

This workflow creates viral-style memecoin images (like Popcat) and posts them directly to Twitter with a witty, Gen Z style tweet. It combines text-to-image AI, scheduled triggers, and social publishing, all in one seamless flow.

Workflow flow:

1. Define your memecoin mascot (name, description, and base image URL).
2. Generate an AI image prompt and a meme tweet.
3. Feed the base mascot image into the Gemini Image Generation API.
4. Render a futuristic memecoin artwork using NanoBanana.
5. Upload the final image and tweet automatically to Twitter.

🧠 Workflow Diagram

⚙️ Key Components

| Node | Function |
|------|----------|
| Schedule Trigger | Runs automatically at chosen intervals to start meme generation. |
| Define Memecoin | Defines mascot name, description, and base image URL. |
| AI Agent | Generates tweet text and a creative image prompt using Google Gemini. |
| Google Gemini Chat Model | Provides trending topic context and meme phrasing. |
| Get Source Image | Fetches the original mascot image (e.g., Popcat). |
| Convert Source Image to Base64 | Prepares the image for AI-based remixing. |
| Generate Image using NanoBanana | Sends the prompt and base image to the Gemini Image API for art generation. |
| Convert Base64 to PNG | Converts the AI output to an image file (sketch at the end of this description). |
| Upload to Twitter | Uploads the generated image to Twitter via the media upload API. |
| Create Tweet | Publishes the tweet with the attached image. |

🪄 How It Works

1️⃣ Schedule Trigger - starts the automation (e.g., hourly or daily).

2️⃣ Define Memecoin - stores your mascot metadata:

memecoin_name: popcat
mascot_description: cat with open mouth
mascot_image: https://i.pinimg.com/736x/9d/05/6b/9d056b5b97c0513a4fc9d9cd93304a05.jpg

3️⃣ AI Agent - prompts Gemini to:
- Write a short 100-character tweet in Gen Z slang.
- Create an image generation prompt inspired by current meme trends.

4️⃣ NanoBanana API - applies your base image + AI prompt to create art.

5️⃣ Upload & Tweet - the final image gets uploaded and posted automatically.

🧠 Example Output

Base source image → generated image (AI remix) → published tweet.

Example tweet text:

> Popcat's about to go absolutely wild, gonna moon harder than my last test score! 🚀📈 We up! #Popcat #Memecoin

🧩 Setup Tutorial

1️⃣ Prerequisites

| Tool | Purpose |
|------|---------|
| n8n (Cloud or self-hosted) | Workflow automation platform |
| Google Gemini API key | For generating tweet and image prompts |
| Twitter (X) API OAuth1 + OAuth2 | For uploading and posting tweets |

2️⃣ Import the Workflow

1. Download memecoin art generator.json.
2. In n8n, click Import Workflow → From File.
3. Set up and connect credentials: Google Gemini API and Twitter OAuth.
4. (Optional) Adjust the Schedule Trigger frequency to your desired posting interval.

3️⃣ Customize Your MemeCoin

In the Define Memecoin node, edit these fields to change your meme theme:

memecoin_name: "doggo"
mascot_description: "shiba inu in astronaut suit"
mascot_image: "https://example.com/shiba.jpg"

That's it - the next cycle will generate your new meme and post it.

4️⃣ API Notes

- **Gemini Image Generation API docs:** https://ai.google.dev/gemini-api/docs/image-generation#gemini-image-editing
- **API key portal:** https://aistudio.google.com/api-keys
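The Convert Base64 to PNG step maps naturally onto an n8n Code node like the sketch below. It assumes the Gemini response exposes the image as base64 under `candidates[0].content.parts[*].inlineData`, the shape used by Gemini's generateContent image responses — verify it against your actual API output:

```javascript
// Sketch: turn Gemini's base64 image output into an n8n binary PNG.
// The response path below follows Gemini's generateContent shape —
// confirm it against the actual JSON your HTTP Request node returns.
const resp = $input.first().json;
const parts = resp.candidates?.[0]?.content?.parts || [];
const imagePart = parts.find((p) => p.inlineData?.data);

if (!imagePart) {
  throw new Error('No inline image data found in the Gemini response');
}

return [{
  json: { mimeType: imagePart.inlineData.mimeType },
  binary: {
    data: {
      data: imagePart.inlineData.data, // already base64-encoded
      mimeType: imagePart.inlineData.mimeType || 'image/png',
      fileName: 'memecoin.png',
    },
  },
}];
```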
by Meak
Auto-Call Leads from Google Sheets with VAPI → Log Results + Book Calendar

This workflow calls new leads from a Google Sheet using VAPI, saves the call results, and (if there's a booking request) creates a Google Calendar event automatically.

**Benefits**

- Auto-call each new lead from your call list
- Save full call outcomes back to Google Sheets
- Parse "today/tomorrow + time" into a real datetime (IST)
- Auto-create calendar events for bookings/deliveries
- Batch-friendly to avoid rate limits

**How It Works**

1. Trigger: new row in Google Sheets (call_list).
2. Prepare: normalize the phone number (adds +), then process in batches.
3. Call: send the number to VAPI (/call) with your assistantId + phoneNumberId.
4. Receive: VAPI posts results to your webhook.
5. Store: append/update the Google Sheet with name, role, company, phone, email, interest level, objections, next step, notes, etc.
6. Parse time: convert today/tomorrow + HH:MM AM/PM to start/end in IST (+1 hour) — see the sketch at the end of this description.
7. Book: create a Google Calendar event with the parsed times.
8. Respond: send a response back to VAPI to complete the cycle.

**Who Is This For**

- Real estate / local service teams running outbound calls
- Agencies doing voice outreach and appointment setting
- Ops teams that want call logs + auto-booking in one place

**Setup**

- **Google Sheets Trigger:** select your spreadsheet Vapi_real-estate and tab call_list.
- **VAPI Call:** set assistantId, phoneNumberId, and add a Bearer token.
- **Webhook:** copy the n8n webhook URL into VAPI so results post back.
- **Google Calendar:** set the calendar ID (e.g., you@domain.com).
- **Timezone:** the booking parser formats times to **Asia/Kolkata (IST)**.
- **Batching:** adjust the SplitInBatches size to control pace.

**ROI & Monetization**

- Save 2–4 hours/week on manual dialing + data entry
- Faster follow-ups with instant booking creation
- Package as an "AI Caller + Auto-Booking" service ($1k–$3k/month)

**Strategy Insights**

In the full walkthrough, I show how to:

- Map VAPI tool-call JSON safely into Sheets fields
- Handle missing/invalid times and default to safe slots
- Add no-answer / retry logic and opt-out handling
- Extend to send Slack/email alerts for hot leads

**Check Out My Channel**

For more voice automation workflows that turn leads into booked calls, check out my YouTube channel, where I share the exact setups I use to win clients and scale to $20k+ monthly revenue.
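The time-parsing step can be sketched as below — a Code-node snippet using Luxon's DateTime (which n8n makes available in Code nodes) to turn a relative phrase like `tomorrow 3:30 PM` into an IST start/end window one hour apart. The `booking_time` input field is an assumption; map it to whatever your VAPI webhook payload actually provides:

```javascript
// Sketch: parse "today/tomorrow + HH:MM AM/PM" into an IST event window.
// `booking_time` is an assumed input field — adjust to your VAPI payload.
const { DateTime } = require('luxon');

const raw = $input.first().json.booking_time || ''; // e.g. "tomorrow 3:30 PM"
const match = raw.match(/(today|tomorrow)\s+(\d{1,2}):(\d{2})\s*(AM|PM)/i);
if (!match) throw new Error(`Unparseable booking time: "${raw}"`);

const [, day, hh, mm, meridiem] = match;
let hour = parseInt(hh, 10) % 12;
if (meridiem.toUpperCase() === 'PM') hour += 12;

const start = DateTime.now()
  .setZone('Asia/Kolkata')
  .plus({ days: day.toLowerCase() === 'tomorrow' ? 1 : 0 })
  .set({ hour, minute: parseInt(mm, 10), second: 0, millisecond: 0 });

const end = start.plus({ hours: 1 }); // the template books one-hour slots

return [{ json: { start: start.toISO(), end: end.toISO() } }];
```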
by Palak Rathor
This template transforms uploaded brand assets into AI-generated influencer-style posts — complete with captions, images, and videos — using n8n, OpenAI, and your preferred image/video generation APIs.

🧠 Who it's for

Marketers, creators, or brand teams who want to speed up content ideation and visual generation. Perfect for social-media teams looking to turn product photos and brand visuals into ready-to-review creative posts.

⚙️ How it works

1. Upload your brand assets — a form trigger collects up to three files: product, background, and prop.
2. AI analysis & content creation — an OpenAI LLM analyzes your brand tone and generates post titles, captions, and visual prompts.
3. Media generation — connected image/video generation workflows create the corresponding visuals.
4. Result storage — all captions, image URLs, and video URLs are automatically written to a Google Sheet for review or publishing.

🧩 How to set it up

1. Replace all placeholders in nodes:
- <<YOUR_SHEET_ID>>
- <<FILE_UPLOAD_BASE>>
- <<YOUR_API_KEY>>
- <<YOUR_N8N_DOMAIN>>/form/<<FORM_ID>>
2. Add your own credentials in the Google Sheets, HTTP Request, and AI/LLM nodes.
3. Execute the workflow or trigger it via the form.
4. Check your connected Google Sheet for generated posts and media links.

🛠️ Requirements

| Tool | Purpose |
|------|---------|
| OpenAI / compatible LLM key | Caption & idea generation |
| Image/video generation API | Creating visuals |
| Google Sheets credentials | Storing results |
| (Optional) n8n Cloud / self-hosted | To run the workflow |

🧠 Notes

- The workflow uses modular sub-workflows for image and video creation; you can connect your own generation nodes.
- All credentials and private URLs have been removed.
- Works seamlessly with both n8n Cloud and self-hosted setups.
- Output is meant for creative inspiration — review before posting publicly.

🧩 Why it's useful

- Speeds up campaign ideation and content creation.
- Provides structured, reusable results in Google Sheets.
- Fully visual, modular, and customizable for any brand or industry.

🧠 Example Use Cases

- Influencer campaign planning
- Product launch creatives
- E-commerce catalog posts
- Fashion, lifestyle, or tech brand content

✅ Security & best practices

- No hardcoded keys or credentials included.
- All private URLs replaced with placeholders.
- Static data removed from the public JSON.
- Follows n8n's template structure, node naming, and sticky-note annotation guidelines.

📦 Template info

- Name: AI-Powered Influencer Post Generator with Google Sheets and Image/Video APIs
- Category: AI / Marketing Automation / Content Generation
- Author: Palak Rathor
- Version: 1.0 (Public Release — October 2025)
by Yasir
🧠 Workflow Overview — AI-Powered Jobs Scraper & Relevancy Evaluator

This workflow automates the process of finding highly relevant job listings based on a user's resume, career preferences, and custom filters. It scrapes fresh job data, evaluates relevance using OpenAI GPT models, and automatically appends the results to your Google Sheet tracker — while skipping any jobs already in your sheet, so you don't have to worry about duplicates. Perfect for recruiters, job seekers, or virtual assistants who want to automate job research and filtering.

⚙️ What the Workflow Does

1. Takes user input through a form — including resume, preferences, target score, and Google Sheet link.
2. Fetches job listings via an Apify LinkedIn Jobs API actor.
3. Filters and deduplicates results (removes duplicates and blacklisted companies).
4. Evaluates job relevancy using GPT-4o-mini, scoring each job (0–100) against the user's resume & preferences.
5. Applies a relevancy threshold to keep only top-matching jobs (see the sketch at the end of this description).
6. Checks your Google Sheet for existing jobs and prevents duplicates.
7. Appends new, relevant jobs directly into your provided Google Sheet.

📋 What You'll Get

- A personal Job Scraper Form (public URL you can share or embed).
- Automatic job collection & filtering based on your inputs.
- **Relevance scoring** (0–100) for each job using your resume and preferences.
- A real-time job-tracking Google Sheet that includes: job title, company name & profile, job URLs, location, salary, HR contact (if available), and relevancy score.

🪄 Setup Instructions

1. Required accounts

You'll need:
- ✅ n8n account (self-hosted or Cloud)
- ✅ Google account (for Sheets integration)
- ✅ OpenAI account (for GPT API access)
- ✅ Apify account (to fetch job data)

2. Connect credentials

In your n8n instance, go to Credentials → Add New:
- Google Sheets OAuth2 API: connect your Google account.
- OpenAI API: add your OpenAI API key.
- Apify API: replace <your_apify_api> with your Apify API key.

Set up the Apify API:
- Get your Apify API key: visit https://console.apify.com/settings/integrations and copy your API key.
- Rent the required Apify actor before running this workflow: go to https://console.apify.com/actors/BHzefUZlZRKWxkTck/input and click "Rent Actor". Once rented, it can be used by your Apify account to fetch job listings.

3. Set up your Google Sheet

- Make a copy of this template: 📄 Google Sheet Template
- Enable edit access for anyone with the link.
- Copy your sheet's URL — you'll provide this when submitting the workflow form.

4. Deploy & run

1. Import this workflow (jobs_scraper.json) into your n8n workspace.
2. Activate the workflow.
3. Visit your form trigger endpoint (e.g., https://your-n8n-domain/webhook/jobs-scraper).
4. Fill out the form with: job title(s); location; contract type, experience level, working mode, date posted; target relevancy score; Google Sheet link; resume text; and job preferences or ranking criteria.
5. Submit — within minutes, new high-relevance job listings will appear in your Google Sheet automatically.

🧩 Example Use Cases

- Automate daily job scraping for clients or yourself.
- Filter jobs by AI-based relevance instead of keywords.
- Build a smart job board or job alert system.
- Support a career agency offering done-for-you job search services.

💡 Tips

- Adjust the "Target Relevancy Score" (e.g., 70–85) to control how strict the filtering is.
- You can add your own blacklisted companies in the Filter & Dedup Jobs node.
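For illustration, the threshold-and-dedup logic can look like this Code-node sketch. It assumes each job item already carries a numeric `relevancy_score` from the GPT evaluation and a `job_url`, and that the existing sheet rows were fetched earlier in the run — all field and node names here are assumptions to adapt:

```javascript
// Sketch: keep only jobs at or above the target score and not already
// in the sheet. Field names (`relevancy_score`, `job_url`) are assumed —
// align them with your GPT output and sheet columns.
const TARGET_SCORE = 75; // e.g. a value between 70–85, per the tips above

// URLs already present in the Google Sheet, fetched by an earlier node
const existing = new Set(
  $('Get Sheet Rows').all().map((row) => row.json.job_url) // hypothetical node name
);

return $input.all().filter((item) => {
  const { relevancy_score, job_url } = item.json;
  return Number(relevancy_score) >= TARGET_SCORE && !existing.has(job_url);
});
```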
by Aadarsh Jain
Document Analyzer and Q&A Workflow

AI-powered document and web page analysis using n8n and a GPT model. Ask questions about any local file or web URL and get intelligent, formatted answers.

**Who's it for**

Perfect for researchers, developers, content analysts, students, and anyone who needs quick insights from documents or web pages without uploading files to external services.

**What it does**

- **Analyzes local files**: PDF, Markdown, text, JSON, YAML, Word docs
- **Fetches web content**: Documentation sites, blogs, articles
- **Answers questions**: Using a GPT model with structured, well-formatted responses

Input format: path_or_url | your_question (a parsing sketch appears at the end of this description)

Examples:

/Users/docs/readme.md | What are the installation steps?
https://n8n.io | What is n8n?

**Setup**

1. Import the workflow into n8n
2. Add your OpenAI API key to credentials
3. Link the credential to the "OpenAI Document Analyzer" node
4. Activate the workflow
5. Start chatting!

**Customize**

- Change AI model → edit the "OpenAI Document Analyzer" node (switch to gpt-4o-mini for cost savings)
- Adjust content length → modify maxLength in the "Process Document Content" node (default: 15000 chars)
- Add file types → update the supportedTypes array in the "Parse Document & Question" node
- Increase timeout → change the timeout value in the "Fetch Web Content" node (default: 30s)
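The `path_or_url | your_question` convention implies a small parsing step like the sketch below — a minimal version of what the "Parse Document & Question" node might do, assuming the chat input arrives as `chatInput` (the exact input field and the extension list are assumptions):

```javascript
// Sketch: split "path_or_url | question" into its two parts and decide
// whether the source is a URL or a local file. `chatInput` and the
// extension list are assumptions — match them to the actual node config.
const raw = $input.first().json.chatInput || '';
const [source, ...rest] = raw.split('|').map((s) => s.trim());
const question = rest.join(' | '); // tolerate '|' inside the question itself

if (!source || !question) {
  throw new Error('Expected input in the form: path_or_url | your_question');
}

const isUrl = /^https?:\/\//i.test(source);
const supportedTypes = ['pdf', 'md', 'txt', 'json', 'yaml', 'yml', 'docx'];
const ext = source.split('.').pop()?.toLowerCase();

return [{
  json: {
    source,
    question,
    isUrl,
    supported: isUrl || supportedTypes.includes(ext),
  },
}];
```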
by n8n Automation Expert | Template Creator | 2+ Years Experience
Description

🎯 Overview

An advanced automated trading bot that implements ICT (Inner Circle Trader) methodology and Smart Money Concepts for cryptocurrency trading. This workflow combines AI-powered market analysis with automated trade execution through the Coinbase Advanced Trading API.

⚡ Key Features

📊 ICT Trading Strategy Implementation

- **Kill Zone Detection**: Automatically identifies optimal trading sessions (Asian, London, New York kill zones)
- **Smart Money Concepts**: Analyzes market structure breaks, liquidity grabs, fair value gaps, and order blocks
- **Session Validation**: Real-time GMT time tracking with session strength calculations
- **Structure Analysis**: Detects BOS (Break of Structure) and CHOCH (Change of Character) patterns

🤖 AI-Powered Analysis

- **GPT-4 Integration**: Advanced market analysis using OpenAI's latest model
- **Confidence Scoring**: AI generates confidence scores (0-100) for each trading signal
- **Risk Assessment**: Automated risk level evaluation (LOW/MEDIUM/HIGH)
- **ICT-Specific Prompts**: Custom prompts designed for Inner Circle Trader methodology

🔄 Automated Trading Flow

1. Signal Reception: receives trading signals via Telegram webhook
2. Data Extraction: parses symbol, action, price, and technical indicators
3. Session Validation: verifies the current kill zone and trading session strength
4. Market Data: fetches real-time data from the Coinbase Advanced Trading API
5. AI Analysis: processes signals through GPT-4 with ICT-specific analysis
6. Quality Filter: multi-condition filtering based on confidence, session, and structure
7. Trade Execution: automated order placement through the Coinbase API
8. Documentation: records all trades and rejections in Notion databases

📱 Multi-Platform Integration

- **Telegram Bot**: Receives signals and sends formatted notifications
- **Coinbase Advanced**: Real-time market data and trade execution
- **Notion Database**: Comprehensive trade logging and analysis tracking
- **Webhook Support**: External system integration capabilities

🛠️ Setup Requirements

API credentials needed:

- **Coinbase Advanced Trading API** (API key, secret, passphrase)
- **OpenAI API key** (GPT-4 access)
- **Telegram bot token** and chat ID
- **Notion integration** (database IDs for trade records)

Environment variables:

TELEGRAM_CHAT_ID=your_chat_id
NOTION_TRADING_DB_ID=your_trading_database_id
NOTION_REJECTED_DB_ID=your_rejected_signals_database_id
WEBHOOK_URL=your_external_webhook_url

📈 Trading Logic

Kill zone priority system (see the session-validator sketch at the end of this description):

- **London & New York sessions**: HIGH priority (0.9 strength)
- **Asian & London Close**: MEDIUM priority (0.6 strength)
- **Off hours**: LOW priority (0.1 strength)

Signal validation criteria:

- Signal quality must not be "LOW"
- Confidence score ≥ 60%
- Active kill zone session required
- ICT structure alignment confirmed

🎛️ Workflow Components

1. Extract ICT Signal Data: parses incoming Telegram messages for trading signals
2. ICT Session Validator: determines the current kill zone and session strength
3. Get Coinbase Market Data: fetches real-time cryptocurrency data
4. ICT AI Analysis: GPT-4 powered analysis with ICT methodology
5. Parse ICT AI Analysis: processes the AI response with fallback mechanisms
6. ICT Quality & Session Filter: multi-condition signal validation
7. Execute ICT Trade: automated trade execution via the Coinbase API
8. Create ICT Trading Record: logs successful trades to Notion
9. Generate ICT Notification: creates formatted Telegram alerts
10. Log ICT Rejected Signal: records filtered signals for analysis

🚀 Use Cases

- Automated ICT-based cryptocurrency trading
- Smart Money Concepts implementation
- Kill zone session trading
- AI-enhanced market structure analysis
- Professional trading documentation and tracking

⚠️ Risk Management

- Built-in session validation prevents off-hours trading
- AI confidence scoring filters low-quality signals
- Comprehensive logging for performance analysis
- Automated stop-loss and take-profit calculations

This workflow is perfect for traders familiar with ICT methodology who want to automate their Smart Money Concepts trading strategy with AI-enhanced decision making.
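To make the session logic concrete, here is a sketch of a kill-zone validator as an n8n Code node. The strength values mirror the priority system above; the specific GMT hour ranges are assumptions based on commonly cited ICT kill zones, so adjust them to match the template's actual ICT Session Validator node:

```javascript
// Sketch: map the current GMT hour to a kill zone and session strength.
// Strength values follow the priority system above; the hour ranges are
// assumed (commonly cited ICT windows) — tune them to your own rules.
const hour = new Date().getUTCHours();

// [start, end) in GMT hours → session name and strength
const KILL_ZONES = [
  { name: 'Asian',        start: 0,  end: 4,  strength: 0.6 },
  { name: 'London',       start: 7,  end: 10, strength: 0.9 },
  { name: 'New York',     start: 12, end: 15, strength: 0.9 },
  { name: 'London Close', start: 15, end: 17, strength: 0.6 },
];

const zone = KILL_ZONES.find((z) => hour >= z.start && hour < z.end);

return [{
  json: {
    session: zone ? zone.name : 'Off Hours',
    strength: zone ? zone.strength : 0.1,
    activeKillZone: Boolean(zone),
    gmtHour: hour,
  },
}];
```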