by VipinW
Apply to jobs automatically from Google Sheets with status tracking

## Who's it for

Job seekers who want to streamline their application process, save time on repetitive tasks, and never miss following up on applications. Perfect for anyone managing multiple job applications across different platforms.

## What it does

This workflow automatically applies to jobs listed in a Google Sheet, tracks application status, and keeps you updated with notifications. It handles the entire application lifecycle, from submission to status monitoring.

Key features:

- Reads job listings from Google Sheets, filtering by priority and status
- Automatically applies to jobs on LinkedIn, Indeed, and other platforms
- Updates application status in real time
- Checks application status every 2 days and notifies you of changes
- Sends email notifications for successful applications and status updates
- Prevents duplicate applications and manages rate limiting

## How it works

The workflow runs on two main schedules:

**Daily application process (9 AM, weekdays):**

1. Reads your job list from Google Sheets
2. Filters for jobs marked "Not Applied" with Medium or High priority
3. Processes each job individually to prevent rate limiting
4. Applies to jobs using platform-specific APIs (LinkedIn, Indeed, etc.)
5. Updates the sheet with the application status and a reference ID
6. Sends a confirmation email for each application

**Status monitoring (every 2 days at 10 AM):**

1. Checks all jobs with "Applied" status
2. Queries job platforms for application status updates
3. Updates the sheet if the status has changed
4. Sends notification emails for status changes (interviews, rejections, etc.)

## Requirements

- Google account with Google Sheets access
- Gmail account for notifications
- Resume stored online (Google Drive, Dropbox, etc.)
- API access to job platforms (LinkedIn, Indeed); optional for the basic version
- n8n instance (self-hosted or cloud)

## How to set up

### Step 1: Create your job tracking sheet

Create a Google Sheet with these exact column headers:

| Job_ID | Company | Position | Status | Applied_Date | Last_Checked | Application_ID | Notes | Job_URL | Priority |
|--------|---------|----------|--------|--------------|--------------|----------------|-------|---------|----------|
| JOB001 | Google | Software Engineer | Not Applied | | | | | https://careers.google.com/jobs/123 | High |
| JOB002 | Microsoft | Product Manager | Not Applied | | | | | https://careers.microsoft.com/jobs/456 | Medium |

Column explanations:

- **Job_ID**: Unique identifier (JOB001, JOB002, etc.)
- **Company**: Company name
- **Position**: Job title
- **Status**: Not Applied, Applied, Under Review, Interview Scheduled, Rejected, Offer
- **Applied_Date**: Auto-filled when the application is submitted
- **Last_Checked**: Auto-updated during status checks
- **Application_ID**: Platform reference ID (auto-generated)
- **Notes**: Additional information or application notes
- **Job_URL**: Direct link to the job posting
- **Priority**: High, Medium, Low (Low-priority jobs are skipped)

### Step 2: Configure Google Sheets access

1. In n8n, go to Credentials → Add Credential
2. Select Google Sheets OAuth2 API
3. Follow the OAuth setup process to authorize n8n
4. Test the connection with your job tracking sheet

### Step 3: Set up Gmail notifications

1. Add another credential for Gmail OAuth2 API
2. Authorize n8n to send emails from your Gmail account
3. Test by sending a sample email

### Step 4: Update the workflow configuration

In the "Set Configuration" node, update these values:

- **spreadsheetId**: Your Google Sheet ID (found in the URL)
- **resumeUrl**: Direct link to your resume (make sure it's publicly accessible)
- **yourEmail**: Your email address for notifications
- **coverLetterTemplate**: Customize your cover letter template

### Step 5: Customize the application logic

For the basic version (no API access): the workflow includes placeholder HTTP requests that you can replace with actual job platform integrations.

For the advanced version (with API access):

- Replace the LinkedIn/Indeed HTTP nodes with actual API calls
- Add your API credentials to n8n's credential store
- Update the platform detection logic for additional job boards

### Step 6: Test and activate

1. Add 1-2 test jobs to your sheet with "Not Applied" status
2. Run the workflow manually to test
3. Check that the sheet gets updated and you receive notifications
4. Activate the workflow to run automatically

## How to customize the workflow

### Adding new job platforms

1. **Update platform detection**: Modify the "Check Platform Type" node to recognize new job board URLs (a sketch appears at the end of this template)
2. **Add new application nodes**: Create HTTP Request nodes for the new platforms
3. **Update status checking**: Add status check logic for the new platform

### Customizing the application strategy

- **Rate limiting**: Add Wait nodes between applications (recommended: 5-10 minutes)
- **Application timing**: Modify the cron schedule to apply during optimal hours
- **Priority filtering**: Adjust the filter conditions to match your criteria
- **Multiple resumes**: Use conditional logic to select different resumes based on job type

### Enhanced notifications

- **Slack integration**: Replace the Gmail nodes with Slack for team notifications
- **Discord webhooks**: Send updates to Discord channels
- **SMS notifications**: Use Twilio for urgent status updates
- **Dashboard updates**: Connect to Notion, Airtable, or other productivity tools

### Advanced features

- **AI-powered personalization**: Use OpenAI to generate custom cover letters
- **Job scoring**: Implement scoring logic based on job requirements vs. your skills
- **Interview scheduling**: Auto-schedule interviews when the status changes
- **Follow-up automation**: Send follow-up emails after specific time periods

## Important notes

### Platform compliance

- Always respect rate limits to avoid being blocked
- Follow each platform's Terms of Service
- Use official APIs when available instead of web scraping
- Don't spam job boards with excessive applications

### Data privacy

- Store credentials securely using n8n's credential store
- Don't hardcode API keys or personal information in nodes
- Regularly review and clean up old application data
- Ensure your resume link is secure but accessible

### Quality control

- Start with a small number of jobs to test the workflow
- Review application success rates and adjust your strategy
- Monitor for errors and set up proper error handling
- Keep your job list updated and remove expired postings

This workflow transforms job searching from a manual, time-consuming process into an automated system that maximizes your application efficiency while maintaining quality and compliance.
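As a concrete illustration of the platform detection logic mentioned above, a minimal n8n Code-node sketch might look like this. The field names (Job_URL, platform) mirror the sheet columns; the node wiring in your copy of the workflow may differ.

```javascript
// Sketch of the "Check Platform Type" step: tag each job with the
// platform detected from its Job_URL so later nodes can branch on it.
const platformPatterns = {
  linkedin: /linkedin\.com/i,
  indeed: /indeed\.com/i,
  // Add new job boards here, e.g. glassdoor: /glassdoor\.com/i
};

const results = [];
for (const item of $input.all()) {
  const url = item.json.Job_URL || '';
  let platform = 'other'; // falls through to the generic HTTP branch
  for (const [name, pattern] of Object.entries(platformPatterns)) {
    if (pattern.test(url)) {
      platform = name;
      break;
    }
  }
  results.push({ json: { ...item.json, platform } });
}
return results;
```

A Switch node keyed on the resulting platform field can then route each job to its platform-specific application branch.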
by IvanCore
Disclaimer: This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Important distinction: This template manages TelePilot (Telegram CoPilot) UserBots, which are client accounts, not Telegram Bots.

## UserBot vs. Bot: Key differences

🔹 **TelePilot UserBots**

- Authenticate as real user accounts (phone number required)
- Can join groups/channels without the "Bot" label
- Subject to Telegram's client API limits
- Require manual login (MFA supported)

🔹 **Telegram Bots**

- Use @BotFather-created tokens
- Limited to bot API functionality
- Can't initiate chats with users who haven't messaged them first
- No phone number required

This template solves the unique challenges of UserBot management through:

## Core functionality

🛡️ **Session reliability**

- Automatic crash recovery (5-step restart sequence)
- Persistent session monitoring (checks every 6 hours)
- Database cleanup via the /clear command

📱 **Multi-device support**

- Manages sessions independently from mobile clients
- Tracks active devices via the /stat command
- Isolates session data per credential

🔔 **Smart notifications**

- Real-time alerts to an admin chat
- Detailed error context with authState snapshots
- Success confirmations with session metadata

## Setup guide

### Prerequisites

- Self-hosted n8n instance (community node required)
- Valid Telegram account for the UserBot
- Telegram bot token for notifications
- TelePilot credentials with api_id/api_hash

### Configuration steps

1. **Credential setup**
   - Add TelePilot credentials in n8n
   - Configure the Telegram bot token in the notification nodes
   - Set the admin chat ID for alerts
2. **Monitoring customization**
   - Adjust the check frequency in the Schedule Trigger
   - Modify alert thresholds in the Filter nodes (a rough sketch follows below)
   - Configure retry logic in the recovery sequence
3. **Session management**
   - Test the /start command flow
   - Verify the /stat output format
   - Confirm notification delivery

## Workflow customization

Advanced options:

- Add secondary notification channels (Email, Slack)
- Implement an escalating alert system
- Integrate with monitoring dashboards
- Customize recovery attempt limits

## Compliance notes

- UserBots must comply with Telegram's Terms of Service
- Not intended for bulk messaging or spam
- Recommended for legitimate automation use cases

Why this matters: UserBots enable automation scenarios impossible with regular bots (e.g., managing groups as a normal user, reacting as a human account). This workflow keeps them reliably online 24/7.
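For illustration, the alert-filtering step might look roughly like this as a Code node. The authState value shown follows TDLib-style session states and the field names are assumptions; check the actual TelePilot node output in your instance before relying on them.

```javascript
// Hypothetical sketch of the alert filter: keep only sessions that are no
// longer ready, so the recovery sequence and admin alert fire for them.
const needsRecovery = [];
for (const item of $input.all()) {
  const state = item.json.authState ?? 'unknown'; // assumed field name
  if (state !== 'authorizationStateReady') {
    needsRecovery.push({
      json: {
        sessionId: item.json.sessionId, // assumed field name
        authState: state,
        checkedAt: new Date().toISOString(),
      },
    });
  }
}
return needsRecovery;
```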
by darrell_tw
## How it works

1. Receive a chat input as an image prompt.
2. Call OpenAI's gpt-image-1 API to generate an image.
3. Split the returned images and process them one by one.
4. Upload each generated image to Google Drive.
5. Save image links and thumbnails to a Google Sheets document.
6. Record token usage and estimated cost in a separate sheet (see the sketch below).

## Set up steps

1. Connect your OpenAI API credentials for image generation.
2. Connect your Google Drive and Google Sheets accounts.
3. Set the destination folder in Google Drive.
4. Set the target Google Sheet and specify the correct sheet tabs.

Setup usually takes around 5-10 minutes. Detailed field mappings are already pre-configured inside the workflow, and additional tips and instructions are included as sticky notes.

Google Sheet copy URL: Copy Sheet Link
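As an illustration of step 6, a Code node could turn the usage block returned by the image API into a cost row like this. The per-token rates below are placeholders, not OpenAI's actual pricing; substitute the current gpt-image-1 rates.

```javascript
// Sketch of the cost-tracking step: build one row per generation for the
// cost sheet from the API's usage object (input_tokens / output_tokens).
const INPUT_RATE_PER_1M = 5.0;   // USD per 1M input tokens (placeholder)
const OUTPUT_RATE_PER_1M = 40.0; // USD per 1M output tokens (placeholder)

const rows = [];
for (const item of $input.all()) {
  const usage = item.json.usage ?? {};
  const inputTokens = usage.input_tokens ?? 0;
  const outputTokens = usage.output_tokens ?? 0;
  const estimatedCost =
    (inputTokens / 1e6) * INPUT_RATE_PER_1M +
    (outputTokens / 1e6) * OUTPUT_RATE_PER_1M;
  rows.push({
    json: {
      timestamp: new Date().toISOString(),
      inputTokens,
      outputTokens,
      estimatedCost: Number(estimatedCost.toFixed(4)),
    },
  });
}
return rows;
```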
by Leandro Melo
Keep your Hostinger VPS servers secure with automated backups! This workflow for self-hosted n8n creates daily snapshots and sends server metrics automatically, ensuring you always have an up-to-date recovery copy.

## Key features

- ✅ **Automated snapshots**: Daily execution with zero manual intervention.
- ✅ **Smart replacement**: Hostinger allows only one snapshot per VPS, so the workflow automatically replaces the previous one.
- ✅ **Notifications**: Alerts via WhatsApp (Evolution API) or other configurable channels confirm each execution.

## Quick setup

### Prerequisites

1. Install the community nodes n8n-nodes-hostinger-api and n8n-nodes-evolution-api in your n8n instance.
2. Generate a Hostinger API key in their dashboard: hpanel.hostinger.com/profile/api.

### Workflow configuration

1. Add the Hostinger API credential in the first node and reuse it across the workflow.
2. Customize the schedule (e.g., daily at 2 AM) and the notification method (Evolution API for WhatsApp, email, etc.).

Important note: Hostinger overwrites the previous snapshot with each new execution, keeping only the latest version.

## VPS metrics available (sent in messages)

- 🔹 Status: snapshot status
- 🔹 Date: snapshot date and time
- 🔹 Server: server name
- 🔹 IP: external server IP

⚙️ Metrics (a formatting sketch follows the list):

- 🔹 Number of vCPUs
- 🔹 RAM usage / available
- 🔹 Hard disk usage / available
- 🔹 Operating system and version
- 🔹 Uptime (days, hours)
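As a rough idea of how the notification text can be assembled before the Evolution API node, here is a Code-node sketch. Every field name used below (state, hostname, ipv4, cpus, memory, disk, template) is a placeholder; map them to the fields the Hostinger API node actually returns.

```javascript
// Sketch of the message-formatting step: build the WhatsApp text from
// snapshot and VPS details returned by the Hostinger API node.
const vps = $input.first().json;

const message = [
  `🔹 Status: ${vps.state ?? 'unknown'}`,
  `🔹 Date: ${new Date().toLocaleString()}`,
  `🔹 Server: ${vps.hostname ?? '-'}`,
  `🔹 IP: ${vps.ipv4 ?? '-'}`,
  '⚙️ Metrics:',
  `🔹 vCPUs: ${vps.cpus ?? '-'}`,
  `🔹 RAM: ${vps.memory ?? '-'} MB`,
  `🔹 Disk: ${vps.disk ?? '-'} MB`,
  `🔹 OS: ${vps.template ?? '-'}`,
].join('\n');

return [{ json: { message } }];
```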
by Custom Workflows AI
## Introduction

The Content SEO Audit Workflow is a powerful automated solution that generates comprehensive SEO audit reports for websites. By combining the crawling capabilities of DataForSEO with search performance metrics from Google Search Console, this workflow delivers actionable insights into content quality, technical SEO issues, and performance optimization opportunities.

The workflow crawls up to 1,000 pages of a website, analyzes SEO factors including metadata, content quality, internal linking, and search performance, and then generates a professional, branded HTML report that can be shared directly with clients. The entire process is automated, transforming what would typically be hours of manual analysis into a streamlined workflow that produces consistent, thorough results.

This workflow bridges the gap between technical SEO auditing and practical, client-ready deliverables, making it an invaluable tool for SEO professionals and digital marketing agencies.

## Who is this for?

This workflow is designed for SEO consultants, digital marketing agencies, and content strategists who need to perform comprehensive content audits for clients or their own websites. It's particularly valuable for professionals who:

- Regularly conduct SEO audits as part of their service offerings
- Need to provide branded, professional reports to clients
- Want to automate the time-consuming process of content analysis
- Require data-driven insights to inform content strategy decisions

Users should have basic familiarity with SEO concepts and metrics, as well as a basic understanding of how to set up API credentials in n8n. While no coding knowledge is required to run the workflow, users should be comfortable with configuring workflow parameters and following setup instructions.

## What problem is this workflow solving?

Content audits are essential for SEO strategy but are traditionally labor-intensive and time-consuming. This workflow addresses several key challenges:

1. **Manual data collection**: Gathering data from multiple sources (crawlers, Google Search Console, etc.) typically requires hours of work. This workflow automates the entire data collection process.
2. **Inconsistent analysis**: Manual audits can suffer from inconsistent methodology. This workflow applies the same comprehensive analysis criteria to every page, ensuring thorough and consistent results.
3. **Report generation**: Creating professional, client-ready reports often requires additional design work after the analysis is complete. This workflow generates a fully branded HTML report automatically.
4. **Data integration**: Correlating technical SEO issues with actual search performance metrics is difficult when working with separate tools. This workflow seamlessly integrates crawl data with Google Search Console metrics.
5. **Scale limitations**: Manual audits become increasingly difficult with larger websites. This workflow can efficiently process up to 1,000 pages without additional effort.

## What this workflow does

### Overview

The Content SEO Audit Workflow crawls a specified website, analyzes its content for various SEO issues, retrieves performance data from Google Search Console, and generates a comprehensive HTML report. The workflow identifies issues in five key categories: status issues (404 errors, redirects), content quality (thin content, readability), metadata SEO (title/description issues), internal linking (orphan pages, excessive click depth), and performance (underperforming content).
The final report includes executive summaries, detailed issue breakdowns, and actionable recommendations, all branded with your company's colors and logo.

### Process

1. **Initial configuration**: The workflow begins by setting parameters including the target domain, crawl limits, company information, and branding colors.
2. **Website crawling**: The workflow creates a crawl task in DataForSEO and periodically checks its status until completion.
3. **Data collection**: Once crawling is complete, the workflow:
   - Retrieves the raw audit data from DataForSEO
   - Extracts all URLs with status code 200 (successful pages)
   - Queries the Google Search Console API for each URL to get clicks and impressions data
   - Identifies 404 and 301 pages and retrieves their source links
4. **Data analysis**: The workflow analyzes the collected data to identify issues including:
   - Technical issues: 404 errors, redirects, canonicalization problems
   - Content issues: thin content, outdated content, readability problems
   - SEO metadata issues: missing/duplicate titles and descriptions, H1 problems
   - Internal linking issues: orphan pages, excessive click depth, low internal link counts
   - Performance issues: underperforming pages based on GSC data
5. **Report generation**: Finally, the workflow:
   - Calculates a health score based on the severity and quantity of issues
   - Generates prioritized recommendations
   - Creates a comprehensive HTML report with interactive tables and visualizations
   - Customizes the report with your company's branding
   - Provides the report as a downloadable HTML file

## Setup

To set up this workflow, follow these steps:

1. **Import the workflow**: Download the JSON file and import it into your n8n instance.
2. **Configure DataForSEO credentials**:
   - Create a DataForSEO account at https://app.dataforseo.com/api-access (they offer a free $1 credit for testing)
   - Add a new "Basic Auth" credential in n8n following the HTTP Request Authentication guide
   - Assign this credential to the "Create Task", "Check Task Status", "Get Raw Audit Data", and "Get Source URLs Data" nodes
3. **Configure Google Search Console credentials**:
   - Add a new "Google OAuth2 API" credential following the Google OAuth guide
   - Ensure your Google account has access to the Google Search Console property you want to analyze
   - Assign this credential to the "Query GSC API" node
4. **Update the "Set Fields" node** with:
   - dfs_domain: The website domain you want to audit
   - dfs_max_crawl_pages: Maximum number of pages to crawl (default: 1000)
   - dfs_enable_javascript: Whether to enable JavaScript rendering (default: false)
   - company_name: Your company name for the report branding
   - company_website: Your company website URL
   - company_logo_url: URL to your company logo
   - brand_primary_color: Your primary brand color (hex code)
   - brand_secondary_color: Your secondary brand color (hex code)
   - gsc_property_type: Set to "domain" or "url" depending on your Google Search Console property type
5. **Run the workflow**: Click "Start" and wait for it to complete (approximately 20 minutes for 500 pages).
6. **Download the report**: Once complete, download the HTML file from the "Download Report" node.

## How to customize this workflow to your needs

This workflow can be adapted in several ways to better suit your specific requirements:

1. **Adjust crawl parameters**: Modify the "Set Fields" node to change:
   - The maximum number of pages to crawl (dfs_max_crawl_pages). This workflow supports up to 1,000 pages.
   - Whether to enable JavaScript rendering for JavaScript-heavy sites (dfs_enable_javascript)
2. **Customize issue detection thresholds**: In the "Build Report Structure" code node (sketched after this list), you can modify:
   - Word count thresholds for thin content detection (currently 1500 words)
   - Click depth thresholds (currently flags pages deeper than 4 clicks)
   - Title and description length parameters (currently 40-60 characters for titles, 70-155 for descriptions)
   - Readability score thresholds (currently flags Flesch-Kincaid scores below 55)
3. **Modify the report design**: In the "Generate HTML Report" code node, you can:
   - Adjust the HTML/CSS to change the report layout and styling
   - Add or remove sections from the report
   - Change the recommendations logic
   - Modify the health score calculation algorithm
4. **Add additional data sources**: You could extend the workflow by:
   - Adding PageSpeed Insights data for performance metrics
   - Incorporating backlink data from other APIs
   - Adding keyword ranking data from rank tracking APIs
5. **Implement automated delivery**: Add nodes after "Download Report" to:
   - Send the report directly to clients via email
   - Upload it to cloud storage
   - Create a PDF version of the report
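To make the threshold customization concrete, here is a minimal sketch of how those checks might look inside the "Build Report Structure" code node. The page field names (wordCount, clickDepth, title, readabilityScore) are illustrative assumptions, not the node's actual variable names.

```javascript
// Sketch of the issue-detection thresholds described above, applied to
// each crawled page. Tune the constants to change audit sensitivity.
const THRESHOLDS = {
  thinContentWords: 1500,
  maxClickDepth: 4,
  title: { min: 40, max: 60 },
  description: { min: 70, max: 155 },
  minReadability: 55, // Flesch-Kincaid score floor
};

function detectIssues(page) {
  const issues = [];
  if (page.wordCount < THRESHOLDS.thinContentWords) issues.push('thin_content');
  if (page.clickDepth > THRESHOLDS.maxClickDepth) issues.push('excessive_click_depth');
  const titleLen = (page.title || '').length;
  if (titleLen < THRESHOLDS.title.min || titleLen > THRESHOLDS.title.max) {
    issues.push('title_length');
  }
  if (page.readabilityScore < THRESHOLDS.minReadability) issues.push('low_readability');
  return issues;
}

return $input.all().map((item) => ({
  json: { ...item.json, issues: detectIssues(item.json) },
}));
```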
by Giannis Kotsakiachidis
🏦 GoCardless ⇄ Maybe Finance — Automatic Multi-Bank Sync & Weekly Overview 💸

## Who's it for 🤔

Freelancers, founders, households, and side-hustlers who work with several bank accounts but want one always-up-to-date budget inside Maybe Finance, with no more CSV exports or copy-paste.

## How it works / What it does ⚙️

1. Schedule Trigger (cron) fires every Monday 📅 (switch to a Manual Trigger while testing)
2. Get access token: fetches a fresh 24-hour GoCardless token 🔑
3. Fetch transactions for each account:
   - Revolut Pro
   - Revolut Personal
   - ABN AMRO
   (add extra HTTP Request nodes for any other GoCardless-supported banks)
4. Extract booked: keeps only settled items 🗂️
5. Set transactions …: maps every record to Maybe Finance's schema 📝 (see the sketch after this section)
6. Merge: combines all arrays into one payload 🔄
7. Create transactions to Maybe: POSTs each item via the API 🚀
8. Resend Email: sends you a "Weekly transactions overview" 📧

All done in a single run: your Maybe dashboard is refreshed and you get an inbox alert.

## How to set up 🛠️

1. Import the template into n8n (cloud or self-hosted).
2. Create credentials:
   - GoCardless secret_id & secret_key
   - Maybe Finance API key
   - (Optional) Resend API key for email notifications
3. One-time GoCardless config (run the blocks on the left):
   - /token/new/ → obtain a token
   - /institutions → find institution IDs
   - /agreements/enduser/ → create agreements
   - /requisitions/ → get the consent URL & finish the bank login
   - /requisitions/{id} → copy the GoCardless account_ids
4. Create the same accounts in Maybe Finance, run the HTTP GET request in the purple frame, and copy their account_ids.
5. Open each "Set transactions …" node and paste the correct Maybe account_id.
6. Adjust the Schedule Trigger (e.g. daily, monthly).
7. Save & activate 🎉

## Requirements 📋

- n8n 1.33+
- GoCardless app (secret ID & key, live or sandbox)
- Maybe Finance account & API key
- (Optional) Resend account for email

## How to customize ✨

- **Include pending transactions**: change the Item Lists filter.
- **Add more banks**: duplicate the "Get … transactions" → "Extract booked" → "Set transactions" path and plug its output into the Merge node.
- **Different interval**: edit the cron rule in the Schedule Trigger.
- **Disable emails**: just remove or deactivate the Resend node.
- **Send alerts to Slack / Teams**: branch after the Merge node and add a chat node.

Happy budgeting! 💰
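As a rough illustration of step 5, a "Set transactions" mapping could look like the Code-node sketch below. The GoCardless transaction field names follow its bank account data API, while the Maybe payload keys (account_id, date, amount, name) are assumptions; confirm both against the template's pre-configured Set nodes and the Maybe API docs.

```javascript
// Sketch of one "Set transactions" step: map a booked GoCardless
// transaction to a Maybe Finance transaction payload.
const MAYBE_ACCOUNT_ID = 'paste-your-maybe-account-id-here';

return $input.all().map((item) => {
  const tx = item.json; // one booked transaction from GoCardless
  return {
    json: {
      account_id: MAYBE_ACCOUNT_ID,
      date: tx.bookingDate,
      amount: tx.transactionAmount?.amount,
      currency: tx.transactionAmount?.currency,
      name: tx.remittanceInformationUnstructured ?? 'Bank transaction',
    },
  };
});
```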
by Nadia Privalikhina
This n8n template offers a free and automated way to convert images from a Google Drive folder into a single PDF document. It uses Google Slides as an intermediary, allowing you to control the final PDF's page size and orientation.

If you're looking for a no-cost solution to batch convert images to PDF and need flexibility over the output dimensions (like A4, landscape, or portrait), this template is for you! It's especially handy for creating photo albums, visual reports, or simple portfolios directly from your Google Drive.

## How it works

1. The workflow first copies a Google Slides template you specify. The page setup of this template (e.g., A4 Portrait) dictates your final PDF's dimensions.
2. It then retrieves all images from a designated Google Drive folder and sorts them by creation date.
3. Each image is added to a new slide in the copied presentation.
4. Finally, the entire Google Slides presentation is converted into a PDF and saved back to your Google Drive.

## How to use

1. Connect your Google Drive and Google Slides accounts in the relevant nodes.
2. In the "Set Pdf File Name" node, define the name for your output PDF.
3. In the "CopyPdfTemplate" node:
   - Select your Google Slides template file (this sets the PDF page size/orientation).
   - Choose the Google Drive folder containing your source images.
4. Ensure your images are in the specified folder. For best results, images should have an aspect ratio similar to your chosen Slides template.
5. Run the workflow to generate your PDF by clicking "Test Workflow".

## Requirements

- Google Drive account
- Google Slides account
- Google Slides template stored on your Google Drive

## Customising this workflow

- Adjust the "Filter: Only Images" node if you use image formats other than PNG (e.g., image/jpeg for JPGs); a sketch of an equivalent filter is below.
- Modify the image sorting logic in the "Sort by Created Date" node if needed.
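If you prefer a Code node over the built-in Filter node, the same MIME-type check could be written like this. This is a sketch; mimeType is the field the Google Drive node returns for listed files, and the allowed set is yours to extend.

```javascript
// Code-node equivalent of "Filter: Only Images": keep Drive files whose
// MIME type is an accepted image format (e.g. add 'image/webp' if needed).
const ALLOWED = new Set(['image/png', 'image/jpeg']);

return $input.all().filter((item) => ALLOWED.has(item.json.mimeType));
```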
by Adrian Bent
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow scrapes job listings from Indeed via Apify, automatically retrieves the resulting dataset, extracts information about each listing, filters jobs by relevance, finds a decision maker at the company, and updates a database (Google Sheets) with that info for outreach. All you need to do is run the Apify actor; the database then updates with the processed data.

## Benefits

- **Complete job search automation**: A webhook monitors the Apify actor, which triggers the integration and starts the process
- **AI-powered filter**: Uses ChatGPT to analyze content/context, identify company goals, and filter based on the job description
- **Smart duplicate prevention**: Automatically tracks processed job listings in a database to avoid redundancy
- **Multi-platform intelligence**: Combines Indeed scraping with web research via Tavily to enrich each listing
- **Niche focus**: Processes content from multiple niches (currently 6, hardcoded) but can be adapted to other niches (just re-prompt the "job filter" node)

## How it works

1. **Indeed job discovery**:
   - Search Indeed with your filters applied for relevant job listings, then copy the results URL for use in Apify
   - Uses Apify's Indeed job scraper to scrape job listings from the URL of interest
   - Automatically scrapes the information, stores it in a dataset, and triggers the integration
2. **Incoming data processing**:
   - Loops over 500 items (configurable) with a batch size of 55 items (configurable) to avoid API timeouts
   - Multiple filters ensure all fields meet the required metrics (a website must exist and the number of employees must be under 250)
   - Duplicate job listings are removed from the incoming batch before processing
3. **Job analysis & filtering** (see the duplicate-filter sketch after this section):
   - An additional filter removes any job listing from the incoming batch if it already exists in the Google Sheets database
   - All new job listings are passed to ChatGPT, which uses the job post/description to determine whether it is relevant
   - Each relevant job gets a new field, "verdict", which is either true or false; only listings where verdict is true are kept
4. **Enrich & update database**:
   - Uses Tavily to search for a decision maker (it doesn't always find one) and populates a row in the Google Sheet with information about the job listing, the company, and a decision maker at that company
   - Waits 1 minute and 30 seconds to avoid Google Sheets and ChatGPT API timeouts, then loops back to the next batch until all job listings are processed

## Required Google Sheets database setup

Before running this workflow, create a Google Sheets database with these exact column headers:

- jobUrl - Unique identifier for job listings
- title - Position title
- descriptionText - Description of the job listing
- hiringDemand/isHighVolumeHiring - Are they hiring at high volume?
- hiringDemand/isUrgentHire - Are they hiring with high urgency?
- isRemote - Is this job remote?
- jobType/0 - Job type: In person, Remote, Part-time, etc.
- companyCeo/name - CEO name collected from Tavily's search
- icebreaker - Column for holding custom icebreakers for each job listing (not populated in this workflow;
  I will upload another that does this, called "Personalized IJSFE")
- scrapedCeo - CEO name collected from the Apify scraper
- email - Email listed on the job listing
- companyName - Name of the company that posted the job
- companyDescription - Description of the company that posted the job
- companyLinks/corporateWebsite - Website of the company that posted the job
- companyNumEmployees - Number of employees the company lists
- location/country - Location where the job is based
- salary/salaryText - Salary on the job listing

### Setup instructions

1. Create a new Google Sheet with these column headers in the first row
2. Name the sheet whatever you please
3. Connect your Google Sheets OAuth credentials in n8n
4. Update the document ID in the workflow nodes

The merge logic relies on the jobUrl column (the unique identifier) to prevent duplicate processing, so this structure is essential for the workflow to function correctly.

Feel free to reach out for additional help or clarification at my gmail: terflix45@gmail.com and I'll get back to you as soon as I can.

## Set up steps

1. **Configure the Apify integration**:
   - Sign up for an Apify account and obtain an API key
   - Get the Indeed job scraper actor and use Apify's integration to send an HTTP request to your n8n webhook (if the test URL doesn't work, use the production URL)
   - Use the Apify node with Resource: Dataset, Operation: Get items, and your API key as the credentials
2. **Set up AI services**:
   - Add OpenAI API credentials for job filtering
   - Add Tavily API credentials for company research
   - Set up appropriate rate limiting for cost control
3. **Database configuration**:
   - Create the Google Sheets database with the column structure above
   - Connect Google Sheets OAuth credentials
   - Configure the merge logic for duplicate detection
4. **Content filtering setup**:
   - Customize the AI prompts for your specific niche, requirements, or interests
   - Adjust the filtering criteria to fit your needs
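To make the duplicate-prevention step concrete, here is a minimal Code-node sketch. The node name 'Get existing rows' is hypothetical; point the reference at whichever node reads your sheet.

```javascript
// Sketch of the duplicate filter: drop incoming listings whose jobUrl
// already exists in the Google Sheet, and dedupe within the batch itself.
const existingUrls = new Set(
  $('Get existing rows').all().map((row) => row.json.jobUrl)
);

const seenInBatch = new Set();
const fresh = [];
for (const item of $input.all()) {
  const url = item.json.jobUrl;
  if (!url || existingUrls.has(url) || seenInBatch.has(url)) continue;
  seenInBatch.add(url);
  fresh.push(item);
}
return fresh;
```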
by Lucas Walter
Reverse engineer short-form videos from Instagram and TikTok using Gemini AI

## Who's it for

Content creators, AI video enthusiasts, and digital marketers who want to analyze successful short-form videos and understand their production techniques. Perfect for anyone looking to reverse-engineer viral content or create detailed prompts for AI video generation tools like Google Veo or Sora.

## How it works

This automation takes any Instagram Reel or TikTok URL and performs a forensic analysis of the video content. The workflow downloads the video, converts it to base64 (sketched below), and uses Google's Gemini 2.5 Pro vision API to generate an extremely detailed "Generative Manifest": a comprehensive prompt that could be used to recreate the video with AI tools.

The analysis includes:

- Visual medium identification (film stock, camera sensor, lens characteristics)
- Color grading and lighting breakdown
- Shot-by-shot deconstruction with precise timing
- Camera movement and framing details
- Subject description and action choreography
- Environmental and atmospheric details

## How to set up

1. **Configure API credentials**:
   - Add your Apify API key for video scraping
   - Set up Google Gemini API authentication
2. **Set up Slack integration (optional)**:
   - Configure Slack OAuth for result sharing
   - Update the channel ID where results should be posted
3. **Access the form**:
   - The workflow creates a web form where you can input video URLs
   - The form accepts both Instagram Reel and TikTok URLs

## Requirements

- **Apify account** with API access for video scraping
- **Google Cloud account** with the Gemini API enabled
- **Slack workspace** (optional, for sharing results)
- Videos must be publicly accessible (no private accounts)

## How to customize the workflow

- **Modify the analysis prompt**: Edit the "set_base_prompt" node to adjust the depth and focus of the video analysis
- **Add different platforms**: Extend the Switch node to handle other video platforms
- **Integrate with other tools**: Replace Slack with email, Discord, or other notification systems
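As an illustration of the base64 step, a Code node could expose the downloaded video like this. The binary property name data is n8n's default for downloaded files; how the base64 string is embedded in the Gemini request is not shown here and should follow Google's API docs.

```javascript
// Sketch of the base64 step: read the downloaded video's binary data and
// return it as a base64 string plus MIME type for the Gemini request.
const item = $input.first();
const buffer = await this.helpers.getBinaryDataBuffer(0, 'data');

return [
  {
    json: {
      mimeType: item.binary.data.mimeType ?? 'video/mp4',
      videoBase64: buffer.toString('base64'),
    },
  },
];
```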
by Adam Janes
## How it works

Whenever a new event is scheduled on your Google Calendar, this workflow generates a Meeting Briefing email, giving an overview of each person on the call and the company they work for. It uses the web search tool on the OpenAI Responses API to perform the lookups.

The workflow triggers when a new event is added to the calendar, loops over each attendee, generates reports on each person and their company, collates the results, and sends the briefing as an email.

## Set up steps

1. Add your credentials for Google Calendar (for viewing events) and Gmail (to send the email).
2. Add your OpenAI credentials as a Header Auth credential on the Company Search and Person Search nodes (a sketch of the request body follows):
   - Name: Authorization
   - Value: Bearer {{ YOUR_API_KEY }}
3. Edit the "Edit Fields" node with the email that you want to send the briefing to, and a short bit of context about yourself.
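For reference, the request body sent by the Company Search node might be built like this in a Code node. The model name and the web-search tool type ('web_search_preview') are assumptions; check OpenAI's current Responses API documentation for the exact values.

```javascript
// Sketch of the Responses API request body (POST /v1/responses) with the
// web search tool enabled, one request per attendee's company.
return $input.all().map((item) => ({
  json: {
    model: 'gpt-4o',
    tools: [{ type: 'web_search_preview' }],
    input: `Research the company "${item.json.company}" and summarize what it does, its size, and recent news.`,
  },
}));
```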
by Lucas Peyrin
## How it works

This template is a powerful, reusable utility for managing stateful, long-running processes. It allows a main workflow to be paused indefinitely at "checkpoints" and then be resumed by external, asynchronous events. This pattern is essential for complex automations; I often call it the "Async Portal" or "Teleport" pattern.

The template consists of two distinct parts:

1. **The Main Process (top flow)**: This represents your primary business logic. It starts, performs some actions, and then calls the Portal to register itself before pausing at a Wait node (a "Checkpoint").
2. **The Async Portal (bottom flow)**: This is the state-management engine. It uses Workflow Static Data as persistent memory to keep track of all paused processes. When an external event (like a new chat message or an approval webhook) comes in with a specific session_id, the Portal looks up the corresponding paused workflow and "teleports" the new data to it by calling its unique resume_url.

This architecture allows you to build sophisticated systems where state is managed centrally and your main business logic remains clean and easy to follow. A sketch of the Portal's registry logic follows the setup steps below.

## When to use this pattern

This is an advanced utility ideal for:

- **Chatbots**: Maintaining conversation history and context across multiple user messages.
- **Human-in-the-loop processes**: Pausing a workflow to wait for a manager's approval from an email link or a form submission.
- **Multi-day sequences**: Building user onboarding flows or drip campaigns that need to pause for hours or days between steps.
- Any process that needs to wait for an **unpredictable external event** without timing out.

## Set up steps

This template is a utility designed to be copied into your own projects. The workflow itself is a live demonstration of how to use it.

1. **Copy the Async Portal**: In your own project, copy the entire Async Portal (the bottom flow, starting with the "A. Entry: Receive Session Info" trigger) into your workflow. This will be your state management engine.
2. **Register your main process**: At the beginning of your main workflow, use an Execute Workflow node to call the Portal's trigger. You must pass it a unique session_id for the process and the resume_url from a Wait node.
3. **Add checkpoints**: Place Wait nodes in your main workflow wherever you need the process to pause and wait for an external event.
4. **Trigger the Portal**: Configure your external triggers (e.g., your chatbot's webhook) to call the Portal's entry trigger, not your main workflow's trigger. You must pass the same session_id so the Portal knows which paused process to resume.

To see it in action, follow the detailed instructions in the "How to Test This Workflow" sticky note on the canvas.
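To make the Portal's registry concrete, here is a minimal Code-node sketch of the lookup logic, using Workflow Static Data as described above. The field names (session_id, resume_url) follow this description; the exact node layout in the template may differ.

```javascript
// Sketch of the Portal's session registry: store resume URLs keyed by
// session_id, then hand them back when an external event arrives.
const staticData = $getWorkflowStaticData('global');
staticData.sessions = staticData.sessions || {};

const { session_id, resume_url, payload } = $input.first().json;

if (resume_url) {
  // Registration call from the main process: remember where to resume.
  staticData.sessions[session_id] = resume_url;
  return [{ json: { registered: true, session_id } }];
}

// External event: return the stored resume_url so a downstream HTTP
// Request node can POST the payload to it and wake the paused workflow.
const target = staticData.sessions[session_id];
return [{ json: { session_id, resume_url: target, payload } }];
```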
by PollupAI
Never forget to send a satisfaction survey again! This workflow helps you automatically send CSAT surveys when a Freshdesk ticket is marked "Resolved" and logs every response in Google Sheets for easy analysis, reporting, and escalation workflows.

## 💡 Built for CS and ops teams who care about real feedback

This template is perfect for:

- Customer support teams who want timely, consistent survey delivery after every resolved ticket.
- Ops leads & admins tired of managing spreadsheets and survey tools manually.
- Businesses using Freshdesk looking for a no-code feedback loop.
- Automation fans who want to track, trigger, and take action automatically.

## 🧩 What problem does it solve?

Manual survey processes are slow, inconsistent, and hard to scale. This automation ensures:

- **Fast survey delivery** while experiences are still fresh.
- **No duplicate emails** thanks to a built-in tracking system.
- **Centralized feedback** in a Google Sheet, with no more digging through platforms.
- **Data you can act on**, like triggering Slack alerts for poor scores.

## ⚙️ How it works

### 📨 Part 1: Auto-send the survey when a ticket is resolved

1. **Trigger**: The workflow runs on a schedule (or manually via "Test").
2. **Pull ticket status** from Freshdesk.
3. **Compare ticket status** to the last known status in Google Sheets.
4. **Detect resolution**: If status = "Resolved" (ID 4), move forward (see the sketch after this section).
5. **Update the Google Sheet** to track that the survey was sent.
6. **Fetch the customer's email** from Freshdesk.
7. **Create & send the survey email**, personalized with ticket info and your brand.
8. **Convert Markdown → HTML** for a well-formatted email.

### 📥 Part 2: Collect responses and store them in Sheets

1. **Form Trigger**: The customer clicks the survey link and fills in the form.
2. **Capture responses** (e.g., rating + comments).
3. **Log feedback** in a second Google Sheet for analysis.

You can extend this by adding escalation steps (e.g., flagging 1-2 star ratings to managers).

## 🚀 Setup instructions

### 🔐 Connect your tools

- **Freshdesk**: Add your API credentials to the "get tickets" and "get client" nodes.
- **Google Sheets**: Authenticate in the "get existing tickets", "update status", and "save survey" nodes.
- **Email (SMTP)**: Add your SMTP details in the "Send Email" node, or swap in Gmail, SendGrid, etc.

### 🛠 Set your data

In the "Set your data" node, enter:

- Your name, email, company, and position
- Your survey form link (see below)

### 🔗 Get the form link

1. Activate the workflow (toggle it ON)
2. Go to the "Survey" (Form Trigger) node
3. Copy the Production URL
4. Paste it into the survey link field in the "Set your data" node

### 🧾 Prepare your Google Sheets

**Sheet 1: Freshdesk Tickets (status tracking)**

Used by: "get existing tickets", "update status"

1. Create a new empty Google Sheet.
2. Add the Spreadsheet ID + Sheet Name to the nodes.

**Sheet 2: Feedback freshdesk (survey responses)**

Used by: "save survey to google sheet"

1. Create a new sheet or tab. It will auto-create columns based on your survey form field labels.
2. Add the Spreadsheet ID + Sheet Name/GID to the save node.

## 🔧 Customize the workflow

- **📝 Survey questions**: Modify them in the "Survey" (Form Trigger) node. Adjust the "save survey to google sheet" node as needed (or use auto-map).
- **💬 Email content**: Edit the subject and message in the "Create the email text" (Set) node.
- **🏷 Freshdesk status ID**: If your "Resolved" status ID isn't 4, update the second condition in the "If ticket resolved" node.
- **📉 Escalate poor feedback**: Add logic after the "save survey to google sheet" node. If the rating is low: notify Slack, create a new internal ticket, or email a team lead.
- **🔁 Schedule Trigger**: Adjust the Schedule Trigger node to your desired interval (e.g., hourly).
- **🔄 Use a webhook instead (optional)**: If Freshdesk supports ticket webhook events on your plan, swap the Schedule Trigger for a Webhook Trigger node to send surveys instantly on ticket resolution.

## 🤖 Why Pollup AI is building this

At Pollup AI, we help CS and support teams stop drowning in tools and manual tasks. This template is part of our growing AI agent library: plug-and-play automations that connect your tools, clean your data, and free up your time, without writing a line of code.

Try this workflow and let Pollup AI handle the boring parts, so your team can focus on what customers are really saying.

Learn more at Pollup AI
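As a concrete reading of the resolution check in Part 1, the logic could look like this in a Code node. The field names (id, status, lastStatus) are illustrative, and the template's "If ticket resolved" node implements this as node conditions rather than code; the node reference 'get existing tickets' matches the node named above.

```javascript
// Sketch of the resolution check: keep only tickets whose Freshdesk
// status just changed to Resolved (status ID 4 by default), comparing
// against the last status recorded in the tracking sheet.
const RESOLVED = 4;

const lastKnown = new Map(
  $('get existing tickets').all().map((row) => [row.json.id, row.json.lastStatus])
);

const justResolved = [];
for (const item of $input.all()) {
  const { id, status } = item.json;
  if (status === RESOLVED && lastKnown.get(id) !== RESOLVED) {
    justResolved.push(item);
  }
}
return justResolved;
```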