by Luan Correia
**Overview**

This template uses Firecrawl's /search API to perform AI-powered web scraping and screenshots - no code required. Just type natural language prompts, and an AI Agent will convert them into precise Firecrawl queries.

**Setup**

- Get your Firecrawl API Key from https://firecrawl.dev
- Add it to n8n using HTTP Header Auth: Key: Authorization, Value: Bearer YOUR_API_KEY

**What It Does**

- Turns natural language into smart search queries
- Scrapes web data and captures full-page screenshots
- Returns titles, links, content, and images

**Example**

Input:

> Find AI automation pages on YouTube (exclude Shorts)

Result:

```json
{ "query": "intitle:AI automation site:youtube.com -shorts", "limit": 5 }
```
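As an illustration, the request body the agent hands to Firecrawl's /search endpoint can be sketched as below. `buildSearchBody` and the `scrapeOptions` shape are assumptions for illustration only; check Firecrawl's API docs for the exact field names:

```javascript
// Build the body for a Firecrawl /search call from the agent's output.
// buildSearchBody is an illustrative helper, not part of Firecrawl's SDK.
function buildSearchBody(query, limit = 5, screenshot = true) {
  const body = { query, limit };
  if (screenshot) {
    // Assumed option shape: ask Firecrawl to also scrape each result
    // and capture a screenshot alongside the markdown content.
    body.scrapeOptions = { formats: ["markdown", "screenshot"] };
  }
  return body;
}

buildSearchBody("intitle:AI automation site:youtube.com -shorts", 5);
// → matches the example result above, plus the assumed scrapeOptions
```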
by Rahul Joshi
**Description**

Keep your CRM and task system perfectly in sync - automatically. This workflow monitors a Google Sheet for lead reply updates, instantly updates the corresponding contact in GoHighLevel (GHL), and creates a ClickUp follow-up task when a response is detected. It ensures your sales team never misses a warm lead and every reply is logged, tracked, and acted upon.

**What This Template Does**

- Watches a Google Sheet for reply status changes ("Yes" in the Replied column).
- Filters only rows where a lead has replied.
- Updates the corresponding contact record in GoHighLevel (GHL).
- Automatically creates a follow-up task in ClickUp with lead details.
- Merges updates from both GHL and ClickUp into a single result.
- Logs sync status and timestamps in a tracking sheet (Sheet2) for auditing.

**Key Benefits**

- Instant sync: updates GHL and ClickUp the moment a lead replies.
- Zero manual work: automated updates and follow-up task creation.
- Full traceability: audit trail stored in a second Google Sheet.
- Increased sales responsiveness: teams act immediately on new replies.
- Multi-app harmony: connects Google Sheets, GHL, and ClickUp seamlessly.

**Features**

- Google Sheets trigger runs every minute to check for changes.
- Conditional logic (If node) processes only "Replied = Yes" leads.
- GHL contact update node for CRM synchronization.
- ClickUp task creation node with customizable priority and naming.
- Merge node to unify both paths before the final sheet update.
- Timestamp and action tracking for audit clarity.
- Visual sticky notes explaining each stage.

**Requirements**

- n8n instance (cloud or self-hosted).
- Google Sheet with headers: Name, GHL_ID, Replied, Email.
- Connected credentials for: Google Sheets API (trigger + update access), GoHighLevel OAuth2 API, ClickUp API.
- Separate "log" sheet (Sheet2) for sync tracking.

**Target Audience**

- Sales teams using GoHighLevel for CRM and ClickUp for task management.
- Agencies managing multiple client pipelines.
- Business development reps who track lead replies manually.
- Founders automating lead follow-up and CRM hygiene.

**Step-by-Step Setup Instructions**

1. Prepare a Google Sheet with columns: Name, GHL_ID, Email, Replied.
2. Add a second sheet ("Sheet2") for tracking updates.
3. Connect credentials for Google Sheets, GoHighLevel, and ClickUp in n8n.
4. Import and open this workflow. Update GHL contact field mappings if your CRM structure differs.
5. Adjust ClickUp team, space, and list IDs to match your setup.
6. Execute manually once for testing, then enable it to run every minute.

**Security Best Practices**

- Share your Google Sheet only with the n8n Google account (Editor).
- Keep API credentials securely stored in n8n's credential manager.
- Log timestamps for traceability and rollback.
- Periodically archive older sync logs to keep the sheet lightweight.
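The "Replied = Yes" filter can be sketched as an n8n Code-node snippet. This is illustrative only; the column names follow the sheet headers listed above:

```javascript
// Keep only rows where the lead has replied, and attach an audit timestamp
// that the workflow would write to the tracking sheet (Sheet2).
function filterRepliedLeads(rows) {
  return rows
    .filter((row) => String(row.Replied).trim().toLowerCase() === "yes")
    .map((row) => ({
      ...row,
      syncedAt: new Date().toISOString(), // logged for auditing
    }));
}

const leads = [
  { Name: "Ada", GHL_ID: "g1", Email: "ada@example.com", Replied: "Yes" },
  { Name: "Bob", GHL_ID: "g2", Email: "bob@example.com", Replied: "No" },
];
filterRepliedLeads(leads); // keeps only Ada's row, with syncedAt added
```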
by Oneclick AI Squad
This n8n workflow automates AWS S3 bucket and file operations (create, delete, upload, download, copy, list) by parsing simple email commands and sending back success or error confirmations.

**Good to Know**

- The workflow processes email requests via a Start Workflow (GET Request) node.
- Data extraction from emails identifies S3 operation commands.
- Error handling is included for invalid or missing email data.
- Responses are sent via email for each action performed.

**How It Works**

- **Start Workflow (GET Request)** - Captures incoming email requests.
- **Extract Data from Email** - Parses email content to extract S3 operation commands.
- **Check Task Type** - Validates the type of task (e.g., create bucket, delete file).
- **Create a Bucket** - Creates a new S3 bucket.
- **Delete a Bucket** - Deletes an existing S3 bucket.
- **Copy a File** - Copies a file within S3.
- **Delete a File** - Deletes a file from S3.
- **Download a File** - Downloads a file from S3.
- **Upload a File** - Uploads a file to S3.
- **Get Many Files** - Lists multiple files in a bucket.
- **Check Success or Fail** - Determines the outcome of the operation.
- **Send Success Email** - Sends a success confirmation email.
- **Send Failed Email** - Sends a failure notification email.

**How to Use**

1. Import the workflow into n8n.
2. Configure the Start Workflow (GET Request) node to receive email commands.
3. Test the workflow with sample email commands (e.g., "create bucket: my-bucket", "upload file: document.pdf").
4. Monitor email responses and adjust command parsing if needed.

**Example Email for Testing**

List files from the bucket json-test in Mumbai region.

**Requirements**

- AWS S3 credentials configured in n8n.
- Email service integration (e.g., SMTP settings).
- n8n environment with workflow execution permissions.

**Customizing This Workflow**

- Adjust the Extract Data from Email node to support additional command formats.
- Modify the Send Success Email or Send Failed Email nodes to customize messages.
- Update the S3 nodes to include additional bucket or file attributes.
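A minimal sketch of how the Extract Data from Email step might parse commands like the test examples above. `parseS3Command` is a hypothetical helper, not the node's actual code:

```javascript
// Parse commands like "create bucket: my-bucket" or "upload file: document.pdf"
// into an action identifier plus a target name.
function parseS3Command(text) {
  const match = text.match(
    /^\s*(create|delete|copy|download|upload|list)\s+(bucket|file|files)\s*:?\s*(.*)$/i
  );
  if (!match) return { action: null, error: "Unrecognized command" };
  const [, verb, noun, target] = match;
  return {
    action: `${verb.toLowerCase()}_${noun.toLowerCase()}`,
    target: target.trim() || null,
  };
}

parseS3Command("create bucket: my-bucket");   // { action: "create_bucket", target: "my-bucket" }
parseS3Command("upload file: document.pdf");  // { action: "upload_file", target: "document.pdf" }
```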
by Dr. Firas
**Generate AI videos and carousels with Blotato and publish to Instagram & TikTok**

Documentation: Notion Guide

**Who is this for?**

This workflow is designed for content creators, marketers, solopreneurs, and automation enthusiasts who want to generate and publish short-form content on Instagram and TikTok automatically. It is ideal for users looking to combine AI-generated videos and carousels with Blotato and orchestrate everything using n8n.

**What this workflow does**

This workflow provides a complete end-to-end automation pipeline:

1. Receives a message from Telegram containing a public URL and a publishing instruction.
2. Creates a content source from the URL using Blotato.
3. Retrieves and validates the extracted text content.
4. Generates either an AI tweet-card carousel for Instagram, or an AI-generated video for TikTok.
5. Continuously checks the visual generation status until it is fully completed.
6. Publishes the final media automatically to Instagram or TikTok.
7. Sends a confirmation message back to Telegram once the post is successfully published.

**Setup**

To use this workflow, you will need:

- An active n8n instance
- A Blotato account with API access
- Instagram and/or TikTok accounts connected in Blotato
- A Telegram Bot for triggering the workflow and receiving notifications

Setup steps:

1. Import the workflow JSON into n8n.
2. Add your Blotato API credentials.
3. Configure the Telegram Trigger with your bot token.
4. Select your Instagram and TikTok accounts in the Blotato post nodes.
5. Activate the workflow.

**How to customize this workflow to your needs**

- Changing the visual templates used in Blotato.
- Adjusting AI prompts to control tone, format, or content style.
- Adding additional publishing platforms after the posting step.
- Modifying polling behavior or adding timeouts for long visual renders.
- Replacing Telegram with another trigger such as Webhooks or Slack.
The workflow is modular and easy to extend, making it suitable for a wide range of content automation use cases.

Watch This Tutorial

Need help or want to customize this? Contact: LinkedIn | YouTube: @DRFIRASS | Workshops: Mes Ateliers n8n
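The status-polling step can be sketched as below. `getStatus` stands in for the Blotato render-status request; the real endpoint, field names, and intervals may differ:

```javascript
// Poll a render-status endpoint until it reports "completed", failing on a
// "failed" status or after maxAttempts checks (the timeout the customization
// section suggests adding for long visual renders).
async function waitForRender(getStatus, { intervalMs = 5000, maxAttempts = 60 } = {}) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const { status } = await getStatus();
    if (status === "completed") return true;
    if (status === "failed") throw new Error("Render failed");
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Timed out waiting for render");
}
```

In n8n this is typically built from a Wait node looping back to an HTTP Request node rather than a single code block; the sketch only shows the control flow.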
by Rahul Joshi
## Description

This workflow acts as a real-time emergency alert system designed for personal safety scenarios. It receives distress signals via webhook, enriches the data with a live Google Maps link and timestamp, generates a clear AI-formatted alert message, instantly notifies a Telegram group, and logs the incident in Google Sheets for tracking and audit.

**Step-by-Step Flow**

1. **Emergency Webhook (Trigger)** - Receives a POST request containing name, phone number, and latitude & longitude. Acts as the entry point for emergency alerts.
2. **Generate Maps Link (Function Node)** - Converts latitude & longitude into a Google Maps URL so responders can access the live location instantly.
3. **Create Emergency Message (Function Node)** - Adds a timestamp (IST), structures the raw alert data, and prepares the base emergency message context.
4. **AI Agent (OpenAI GPT-4o-mini)** - Formats the alert into a clean, urgent, human-readable message; ensures clarity, visibility, and consistency; adds emojis and a structured layout for quick comprehension.
5. **Telegram Group Alert (Telegram Node)** - Sends a real-time emergency notification to a predefined group for immediate responder visibility.
6. **Log to Google Sheets (Google Sheets Node)** - Stores alert data for records. Fields logged: name, phone, maps link, message, timestamp. Creates an audit trail for safety tracking.

**Prerequisites**

- Telegram Bot credentials + chat ID
- OpenAI API (GPT-4o-mini)
- Google Sheets OAuth connection
- Webhook endpoint exposed publicly

**Key Benefits**

- Instant emergency alert delivery
- Live location sharing via Google Maps
- AI-enhanced message clarity (no ambiguity)
- Real-time group notification for faster response
- Persistent logging for audit and follow-up

**Perfect For**

- Women's safety applications
- SOS mobile apps or panic button systems
- Security teams and emergency response workflows
- Community safety networks and alert systems
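A minimal sketch of the Generate Maps Link function node; the URL format is a standard Google Maps query link built from the webhook's coordinates:

```javascript
// Build a Google Maps link from the latitude/longitude received by the
// emergency webhook, so responders can open a pin at the exact location.
function generateMapsLink(lat, lng) {
  return `https://www.google.com/maps?q=${lat},${lng}`;
}

generateMapsLink(28.6139, 77.209);
// → "https://www.google.com/maps?q=28.6139,77.209"
```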
by Sakar Dahal
This AI agent, powered by Gemini, creates events in Google Calendar and retrieves events from the calendar based on the prompt you provide.
by Alex Berman
**Who is this for**

This workflow is built for real estate investors, wholesalers, and skip tracers who need to find contact details - phone numbers, emails, and addresses - for property owners at scale. It automates the entire lookup process using the ScraperCity People Finder API and stores clean results in Airtable for follow-up.

**How it works**

1. A manual trigger starts the workflow.
2. A configuration node lets you define the list of property owner names (or phones/emails) to look up.
3. The workflow submits a skip trace job to the ScraperCity People Finder API, which returns a runId for async tracking.
4. An async polling loop checks the job status every 60 seconds until the result is marked SUCCEEDED.
5. Once complete, the workflow downloads the results CSV and parses each contact record using a code node.
6. Duplicate records are removed, and each unique contact is synced into an Airtable base as a new row with name, phone, email, and address fields.

**How to set up**

1. Create a ScraperCity API credential in n8n (HTTP Header Auth, header name Authorization, value Bearer YOUR_KEY).
2. Update the Configure Search Inputs node with your target names, phones, or emails.
3. Connect your Airtable credential and set your Base ID and Table name in the Sync Contacts to Airtable node.

**Requirements**

- ScraperCity account with People Finder access (scrapercity.com)
- Airtable account with a base set up to receive contact data

**How to customize the workflow**

- Change max_results in Configure Search Inputs to return more contacts per person.
- Swap the Airtable node for a Google Sheets node if preferred.
- Add a filter node after parsing to keep only records that have a verified phone number.
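The deduplication step can be sketched as below. `dedupeContacts` is illustrative; the real code node may key on different fields:

```javascript
// Deduplicate parsed contact records before syncing to Airtable. Records are
// treated as duplicates when normalized name + phone + email all match.
function dedupeContacts(records) {
  const seen = new Set();
  return records.filter((record) => {
    const key = [record.name, record.phone, record.email]
      .map((value) => String(value ?? "").trim().toLowerCase())
      .join("|");
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}
```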
by osama goda
**What this workflow does**

This AI agent researches any product across e-commerce marketplaces and generates a full market analysis report from a single chat message. Tell it what you're looking for, your budget, and optionally the region - it handles the rest.

The agent will:

- Search across marketplaces (Amazon, Noon, Jumia, AliExpress, and more)
- Scrape the top product pages for real pricing, ratings, and reviews
- Analyze the data and return a structured report including market overview, top products with links, buying insights, common complaints, and a recommendation with market gap analysis for sellers

**Key features**

- **Multi-marketplace support**: Amazon (.com, .eg, .sa, .ae), Noon, Jumia, AliExpress, eBay
- **Regional awareness**: Automatically detects Egypt, Saudi Arabia, UAE, or defaults to global search
- **Bilingual**: Works in English and Arabic (Egyptian dialect)
- **Currency-aware**: Uses EGP, SAR, AED, or USD based on user input
- **Market gap analysis**: Identifies selling opportunities, not just buying recommendations

**Example prompts**

- "Research the best wireless earbuds under $30"
- "Find me the best coffee machine in Egypt under 2,500 EGP" (in Egyptian Arabic)
- "Find the best robot vacuum on Amazon under $200"
- "Find me the best power bank in Saudi Arabia under 100 SAR" (in Egyptian Arabic)

**How it works**

1. User sends a product research request via the chat trigger.
2. The AI Agent plans a search strategy based on product, budget, and region.
3. Firecrawl Search finds relevant product and review pages across the web.
4. Firecrawl Scrape extracts detailed product data from the top results.
5. The AI analyzes everything and generates a structured report with actionable insights.

**Set up steps (takes 2 minutes)**

1. Firecrawl API key: sign up free at firecrawl.dev and add your API key to both Firecrawl nodes.
2. LLM API key: add your OpenAI or OpenRouter API key to the Chat Model node. Recommended model: any model with a large context window (e.g., Gemini 2.5 Flash, GPT-4o).
3. Start researching: click "Open Chat" and type your product research query.

**Who is this for**

- E-commerce sellers doing product research
- Dropshippers finding winning products
- Affiliate marketers comparing products
- Anyone who wants to shop smarter

**Built with**

- n8n AI Agent node
- Firecrawl (Search + Scrape)
- Compatible with OpenAI, OpenRouter, Google Gemini, and other LLM providers
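The regional-awareness behavior can be sketched as a simple keyword check. This is a simplification for illustration; in the actual workflow the LLM prompt handles region and currency detection, including Arabic input:

```javascript
// Infer region and currency from an English-language research request,
// defaulting to a global USD search when no region keyword is found.
function detectRegion(text) {
  const t = text.toLowerCase();
  if (t.includes("egypt") || t.includes("egp")) return { region: "EG", currency: "EGP" };
  if (t.includes("saudi") || t.includes("sar")) return { region: "SA", currency: "SAR" };
  if (t.includes("uae") || t.includes("aed") || t.includes("emirates")) return { region: "AE", currency: "AED" };
  return { region: "GLOBAL", currency: "USD" };
}

detectRegion("best power bank in Saudi Arabia under 100 SAR");
// → { region: "SA", currency: "SAR" }
```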
by Oneclick AI Squad
This enterprise-grade n8n workflow automates the Instagram complaint handling process - from detection to resolution - using Claude AI, dynamic ticket assignment, and SLA enforcement. It converts customer complaints in comments into actionable support tickets with auto-assignment, escalation alerts, and full audit trails, ensuring timely responses and improved customer satisfaction with zero manual intervention.

**Key Features**

- **Real-time Instagram polling** for new comments
- **AI-powered complaint detection** using **Claude 3.5 Sonnet** for sentiment and issue classification
- **Automatic ticket creation** in Google Sheets (or integrable with Zendesk/Jira)
- **Round-robin assignment** to team members from a dynamic roster
- **SLA timer and monitoring** (e.g., 24-hour response window with escalation at 80% elapsed)
- **Escalation engine** notifies managers via Slack if near breach
- **Multi-channel notifications:** Slack for assignees and escalations
- **Audit-ready:** logs ticket details, assignments, and actions
- **Scalable triggers:** webhook or scheduled polling

**Workflow Process**

| Step | Node | Description |
| ---- | ---- | ----------- |
| 1 | Schedule Trigger | Runs every 15 minutes or via webhook (/complaint-handler) |
| 2 | Get Instagram Posts | Fetches recent posts from Instagram Graph API |
| 3 | Get Comments | Retrieves comments for the latest post |
| 4 | Loop Over Comments | Processes each comment individually to avoid rate limits |
| 5 | Detect Complaint (Claude AI) | Uses AI to classify if complaint, extract issue/severity |
| 6 | IF Complaint | Branches: proceed if yes, end if no |
| 7 | Get Team Members | Loads team roster from TeamMembers sheet |
| 8 | Assign Ticket | Sets assignee via round-robin logic |
| 9 | Create Ticket (Google Sheet) | Appends new ticket with details and SLA due date |
| 10 | Notify Assignee (Slack) | Alerts assigned team member |
| 11 | Wait for SLA Check | Delays to near-SLA-breach point (e.g., 20 hours) |
| 12 | Check Ticket Status | Looks up ticket status in sheet |
| 13 | IF SLA Breach Near | Checks if unresolved; escalates if yes |
| 14 | Escalate to Manager (Slack) | Notifies manager for urgent action |
| 15 | End (Non-Complaint Path) | Terminates non-complaint branches |

**Setup Instructions**

1. **Import Workflow** - Open n8n → Workflows → Import from Clipboard and paste the JSON workflow.

2. **Configure Credentials**

| Integration | Details |
| ----------- | ------- |
| Instagram API | Access token from Facebook Developer Portal |
| Claude AI | Anthropic API key for claude-3-5-sonnet-20241022 |
| Google Sheets | Service account with spreadsheet access |
| Slack | Webhook or OAuth app |

3. **Update Spreadsheet IDs** - Ensure your Google Sheets include: SupportTickets, TeamMembers.

4. **Set Triggers** - Webhook: /webhook/complaint-handler (for real-time Instagram notifications if set up). Schedule: every 15 minutes.

5. **Run a Test** - Use manual execution to confirm: ticket creation in the sheet, Slack notifications, and the SLA wait and escalation logic (simulate by shortening the wait time).

**Google Sheets Structure**

SupportTickets

| ticketId | commentText | user | createdAt | assignedTo | status | slaDue | issueType | severity |
| -------- | ----------- | ---- | --------- | ---------- | ------ | ------ | --------- | -------- |
| TKT-12345678 | Sample complaint text | user123 | 2023-10-01T12:00:00Z | john@team.com | Open | 2023-10-02T12:00:00Z | Product Issue | Medium |

TeamMembers

| name | email |
| ---- | ----- |
| John Doe | john@team.com |
| Jane Smith | jane@team.com |

**System Requirements**

| Requirement | Version/Access |
| ----------- | -------------- |
| n8n | v1.50+ (AI integrations supported) |
| Claude AI API | claude-3-5-sonnet-20241022 |
| Instagram Graph API | Business account access token |
| Google Sheets API | https://www.googleapis.com/auth/spreadsheets |
| Slack Webhook | Required for notifications |

**Optional Enhancements**

- Integrate Zendesk/Jira for professional ticketing instead of Google Sheets
- Add email notifications to customers acknowledging complaints
- Use sentiment thresholds for prioritizing high-severity tickets
- Connect Twilio for SMS escalations
- Enable multi-platform support (e.g., Twitter/Facebook comments)
- Add a reporting dashboard via Google Data Studio
- Implement auto-resolution for simple complaints using AI responses

**Result:** a single automated system that detects, tickets, assigns, and enforces SLAs on Instagram complaints - with full AI intelligence and zero manual work.

Explore more AI workflows: get in touch with us for custom n8n automation!
by Matthew
**Multi-Channel Cold Email Generator (LinkedIn + Website Fallback)**

**Description**

This workflow automates the generation of hyper-personalized cold emails. It intelligently switches between two data sources: LinkedIn activity and the company website. If the lead has recent LinkedIn posts, the AI generates an icebreaker referencing their specific thoughts or news. If no posts are found, the workflow falls back to scraping their company website and generating an angle based on their business proposition.

**How it Works**

1. **Fetch Data:** Pulls a list of leads from a Google Sheet.
2. **Scrape LinkedIn:** Uses Apify to attempt to scrape recent posts for the lead.
3. **Conditional Logic:**
   - **Path A (Posts Found):** Aggregates the posts, analyzes the context using GPT-4, and writes an email referencing the content.
   - **Path B (No Posts):** Scrapes the URL provided in companyWebsite, converts the HTML to Markdown, analyzes the company value prop, and writes an email based on that.
4. **Save Results:** Writes the generated Icebreaker, Intro, and CompanyType back to the original Google Sheet.

**Requirements**

- **n8n:** Self-hosted or Cloud.
- **Google Sheets account:** A sheet containing columns for email_final, linkedin_url, and companyWebsite.
- **Apify account:** You must have the LinkedIn Scraper actor (ID: A3cAPGpwBEG8RJwse or similar) configured and an API token.
- **OpenAI API key:** Access to the GPT-4 model is recommended for best quality.

**Setup Instructions**

1. **Import the JSON:** Copy the provided JSON template and paste it into your n8n canvas.
2. **Configure credentials:** Set up your Google Sheets and OpenAI credentials in n8n.
3. **Apify token:** Locate the Apify LinkedIn Scraper node (HTTP Request). In Header Parameters > Authorization, replace YOUR_APIFY_API_TOKEN with your actual Apify Bearer token.
4. **Google Sheet configuration:** Open the Fetch Leads node and select your Sheet and specific workbook. Open both Update Row nodes (there are two: one for the Website branch, one for the LinkedIn branch) and ensure they point to the same Sheet/Workbook.
5. **Customize AI prompts:** Open the two Write Email Copy nodes. In the system prompt, look for [YOUR_BUSINESS_TYPE] and [YOUR_COMPANY_NAME]. Replace these with your actual business details to ensure the AI generates relevant outreach.

**Customization**

- **Model selection:** You can switch the OpenAI model to gpt-3.5-turbo to save costs, though the quality of the icebreakers may decrease.
- **Output columns:** The workflow currently outputs Icebreaker, intro, and companyType. You can modify the **Update Row** nodes to map these to different column headers in your sheet if needed.
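The conditional logic in step 3 can be sketched as below, using the sheet's `companyWebsite` column. `choosePersonalizationSource` is a hypothetical helper, not a node from the template:

```javascript
// Decide which personalization path a lead takes: recent LinkedIn posts when
// the Apify scraper returned any, otherwise the company-website fallback.
function choosePersonalizationSource(lead, posts) {
  if (Array.isArray(posts) && posts.length > 0) {
    return { path: "linkedin", context: posts.map((p) => p.text).join("\n\n") };
  }
  return { path: "website", context: lead.companyWebsite };
}

choosePersonalizationSource(
  { email_final: "a@b.com", companyWebsite: "https://example.com" },
  []
); // → { path: "website", context: "https://example.com" }
```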
by Alex Berman
**Who is this for**

This workflow is built for real estate investors, private investigators, recruiters, and sales teams who need to skip trace individuals - finding contact details, addresses, and phone numbers from a name, email, or phone number - and store the enriched records automatically in Notion.

**How it works**

1. A user fills out an n8n form with one or more search inputs (name, email, or phone number).
2. The workflow submits that data to the ScraperCity People Finder API, which begins an async enrichment job.
3. The workflow then polls the job status every 60 seconds until it completes.
4. Once the scrape succeeds, the results are downloaded, parsed, deduplicated, and each enriched person record is written as a new page in a Notion database.

**How to set up**

1. Create a ScraperCity account at scrapercity.com and copy your API key.
2. In n8n, create an HTTP Header Auth credential named "ScraperCity API Key" with the key Authorization and value Bearer YOUR_KEY.
3. Create a Notion integration and share your target database with it. Create a Notion credential in n8n.
4. Open the Configure Search Defaults node and set your preferred max_results value.
5. Open the Save Person Record to Notion node and set your Notion Database ID.

**Requirements**

- ScraperCity account (scrapercity.com) with People Finder access
- n8n instance (cloud or self-hosted)
- Notion workspace with a database for storing people records

**How to customize the workflow**

- Change the form fields to accept bulk CSV input instead of a single name.
- Add a Filter node after parsing to only save records that include a valid phone number.
- Route results to Google Sheets or HubSpot instead of Notion by swapping the final node.
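The suggested phone-number filter can be sketched as below. The 10/11-digit check is a simplification for US-style numbers, not a full phone validator:

```javascript
// Keep only records whose phone field looks like a usable US-style number
// (10 digits, or 11 digits with a leading country code of 1).
function hasValidPhone(record) {
  const digits = String(record.phone ?? "").replace(/\D/g, "");
  return digits.length === 10 || (digits.length === 11 && digits.startsWith("1"));
}

[{ phone: "(555) 123-4567" }, { phone: "" }, { phone: "+1 555 123 4567" }].filter(hasValidPhone);
// keeps the first and third records
```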
by Rahul Joshi
**Description**

Keep your internal knowledge base fresh and reliable with this automated FAQ freshness monitoring system. This workflow tracks FAQ update dates in Notion, calculates SLA compliance, logs results in Google Sheets, and sends Slack alerts for outdated items. Perfect for documentation teams ensuring content accuracy and operational visibility across platforms.

**What This Template Does**

1. Triggers every Monday at 10:00 AM to start freshness checks.
2. Fetches FAQ entries from your Notion database.
3. Computes SLA status based on the last edited date (30-day threshold).
4. Updates a Google Sheet with current FAQ details and freshness status.
5. Filters out overdue FAQs that need review.
6. Aggregates all overdue items into one report.
7. Sends a consolidated Slack alert with direct Notion links and priority tags.

**Key Benefits**

- Maintains documentation freshness across systems.
- Reduces support friction from outdated FAQs.
- Centralizes visibility with Google Sheets reporting.
- Notifies your team in real time via Slack.
- Enables SLA-based documentation governance.

**Features**

- Weekly automated schedule (every Monday at 10 AM).
- Notion database integration for FAQ retrieval.
- SLA computation and overdue filtering logic.
- Google Sheets sync for audit logging.
- Slack notification for overdue FAQ alerts.
- Fully configurable thresholds and alerting logic.

**Requirements**

- Notion API credentials with database read access.
- Google Sheets OAuth2 credentials with edit access.
- Slack Bot Token with chat:write permission.
- Environment variables: NOTION_FAQ_DATABASE_ID, GOOGLE_SHEET_FAQ_ID, SLACK_FAQ_ALERT_CHANNEL_ID

**Target Audience**

- Knowledge management and documentation teams
- SaaS product teams maintaining FAQ accuracy
- Support operations and customer success teams
- QA and compliance teams monitoring SLA adherence

**Step-by-Step Setup Instructions**

1. Connect Notion credentials and set your FAQ database ID.
2. Create a Google Sheet with the required headers (Title, lastEdited, slaStatus, etc.).
3. Add your Slack credentials and specify the alert channel ID.
4. Configure the cron schedule (0 10 * * 1) for Monday 10:00 AM checks.
5. Run once manually to verify credentials and mappings.
6. Activate for ongoing weekly freshness monitoring.
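The SLA computation can be sketched as below, using the 30-day threshold described above; the `slaStatus` labels are illustrative and should match whatever your sheet headers expect:

```javascript
// Compute an FAQ entry's age and freshness status from its Notion
// last-edited date, flagging entries older than the threshold as overdue.
function computeSlaStatus(lastEditedIso, now = new Date(), thresholdDays = 30) {
  const ageMs = now.getTime() - new Date(lastEditedIso).getTime();
  const ageDays = Math.floor(ageMs / (24 * 3600 * 1000));
  return {
    ageDays,
    slaStatus: ageDays > thresholdDays ? "Overdue" : "Fresh",
  };
}

computeSlaStatus("2024-01-01T00:00:00Z", new Date("2024-02-15T00:00:00Z"));
// → { ageDays: 45, slaStatus: "Overdue" }
```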