by Abdul Mir
### Overview

Impress your leads with ultra-personalized “thank you” emails that look hand-written — sent automatically seconds after they submit your intake form. This workflow instantly scrapes the prospect's website, extracts meaningful copy, and uses AI to write a custom thank-you message referencing something specific from their site. It gives the impression you immediately reviewed their business and crafted a thoughtful reply — without lifting a finger.

### Who’s it for

- Agencies and consultants using intake forms
- Freelancers booking discovery calls
- B2B businesses that want high-touch first impressions
- Sales teams automating initial follow-ups

### How it works

1. Triggered when a form (e.g. Tally, Typeform) is submitted
2. Scrapes the website URL provided in the form
3. Converts HTML to Markdown and extracts plain copy
4. Uses AI to write a personalized thank-you message referencing the site
5. Waits briefly to simulate a real typing delay
6. Sends the message via Gmail (or any email provider)

### Example use case

> Prospect submits a form with their website: coolstartup.ai
>
> 30 seconds later, they receive:
>
> “Thanks for reaching out! I just checked out Cool Startup’s homepage — love the clean UX and mission around AI for teams. Looking forward to diving into how we might collaborate!”

### How to set up

1. Connect your form tool (e.g. Tally or Typeform)
2. Connect Gmail or another email provider
3. Customize the AI prompt to match your tone
4. Set the wait time (e.g. 30 seconds) for a realistic delay
5. Update the website scraping logic if needed (a sketch of a minimal cleanup Code node appears below)

### Requirements

- Form tool with webhook support
- OpenAI (or other LLM) credentials
- Email sending integration (Gmail, Mailgun, Postmark, etc.)

### How to customize

- Edit the email tone (casual, formal, funny, etc.)
- Add a CRM integration to log the form submission and response
- Trigger additional workflows such as lead scoring or Slack alerts
- Add fallback logic if the website doesn’t scrape cleanly
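If you need to tweak the scraping step, a minimal n8n Code node for turning raw page HTML into plain copy could look roughly like the sketch below. It assumes the preceding HTTP Request node put the page HTML in a `data` field; the `websiteCopy` output name is a placeholder used to feed the AI prompt, not the template's exact configuration.

```javascript
// n8n Code node, illustrative only.
// Assumes the previous HTTP Request node returned the raw page HTML in `data`.
const html = $input.first().json.data || '';

// Strip scripts, styles, and tags, then collapse whitespace to plain copy.
const text = html
  .replace(/<script[\s\S]*?<\/script>/gi, ' ')
  .replace(/<style[\s\S]*?<\/style>/gi, ' ')
  .replace(/<[^>]+>/g, ' ')
  .replace(/\s+/g, ' ')
  .trim();

// Keep the prompt small: the first ~2,000 characters are usually enough
// for the AI node to find something specific to reference.
return [{ json: { websiteCopy: text.slice(0, 2000) } }];
```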
by Luis Acosta
## 🎧 Convert Unread Newsletters into Conversational AI Podcasts

Turn email overload into audio insights — automatically. This workflow transforms unread newsletters sitting in your inbox into engaging, human-like audio conversations between two AI voices. It’s perfect for listening during your commute, workout, or while multitasking.

Inspired by Google's NotebookLM, this automation brings long-form content to life by summarizing dense text into a natural dialogue using OpenAI and generating high-quality voice narration with ElevenLabs. The result? A dynamic audio file sent right back to your inbox — hands-free, screen-free, and stress-free.

### 💡 What this workflow does

- ✅ Connects to your Gmail inbox to fetch unread newsletters
- 🤖 Uses GPT-4o Mini to summarize and rephrase content as a conversation
- 🗣️ Sends the dialogue to ElevenLabs to generate voice clips (voice1 + voice2)
- 🔁 Merges all audio segments into a single podcast-like MP3 using FFmpeg
- 📬 Emails the final audio back to you for easy listening

### 🛠️ What you'll need

- A Gmail account with IMAP enabled
- An OpenAI API key (GPT-4o Mini recommended for cost/performance)
- An ElevenLabs API key + selected voice IDs
- A self-hosted or local n8n instance with FFmpeg installed
- Basic knowledge of binary data and audio handling in n8n

### ✨ Use cases

- Convert long newsletters into hands-free listening experiences
- Repurpose Substack or Beehiiv content for podcast-like distribution
- Build an internal voice dashboard for teams who prefer audio updates

### 🙌 Want to go further?

This workflow is modular and extensible. You can add steps to:

- Upload the final audio to Spotify, SoundCloud, or Telegram
- Publish to a private podcast RSS feed
- Create a daily audio digest from multiple newsletters

### 📬 Contact & Feedback

Need help customizing it? Have ideas or feedback? Feel free to reach out: 📩 Luis.acosta@news2podcast.com

If you're building something more advanced with audio + AI, like automated podcast publishing to Spotify — let me know and I’ll figure out how I can help you!
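As an illustration of the hand-off between the summarization and voice-generation steps, a Code node that splits the AI-written dialogue into per-speaker segments might look like the sketch below. It assumes the LLM was prompted to prefix each line with "HOST:" or "GUEST:"; the `text` field and the voice ID placeholders are assumptions, not the template's exact node configuration.

```javascript
// n8n Code node: split the dialogue into one item per voice clip.
const VOICE_1 = 'your-elevenlabs-voice-id-1'; // placeholder
const VOICE_2 = 'your-elevenlabs-voice-id-2'; // placeholder

const dialogue = $input.first().json.text || '';

const segments = dialogue
  .split('\n')
  .map(line => line.trim())
  .filter(Boolean)
  .map((line, index) => {
    const isHost = line.startsWith('HOST:');
    return {
      order: index,
      voiceId: isHost ? VOICE_1 : VOICE_2,
      text: line.replace(/^(HOST|GUEST):\s*/, ''),
    };
  });

// One item per segment, so the next HTTP Request node can call ElevenLabs
// once per clip and the FFmpeg merge step can reassemble them in `order`.
return segments.map(segment => ({ json: segment }));
```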
by Davide
🤝🖊️🤖 This workflow automates the process of retrieving meeting transcripts from Fireflies.ai, extracting and summarizing relevant content using Google Gemini, and sending or drafting well-formatted summaries and emails via Gmail.

Fireflies is an AI-powered meeting assistant that automatically records, transcribes, and summarizes meetings. It integrates with popular video conferencing tools like Zoom, Google Meet, and Microsoft Teams, helping teams capture key insights and action items without manual note-taking. This workflow automates meeting recap generation, from email detection to AI-powered summarization and delivery.

### Key Benefits

- 💡 **Automated Insight Extraction**: Uses AI (OpenAI & Gemini) to extract and summarize key insights from meetings automatically.
- 📩 **Instant Client Communication**: Generates ready-to-send meeting summaries and drafts without human intervention.
- 📥 **Email Monitoring**: Listens to Gmail for specific meeting recap messages and reacts accordingly.
- 🔗 **Seamless Fireflies Integration**: Dynamically pulls transcript data.
- 🧠 **Dual AI Models**: Combines the strengths of OpenAI and Gemini for rich, contextual summaries in multiple formats.
- 🛠 **Modular Design**: Easily customizable and extensible for adding more destinations (e.g., Slack, Notion, CRM).
- 🧑💼 **Ideal for Teams & Consultants**: Great for sales teams, project managers, or consultants who handle multiple client meetings daily.

### How It Works

1. **Trigger**: The workflow starts with a Gmail Trigger node that monitors incoming emails with the subject "Your meeting recap", checking for new emails every hour. It can also be started manually with the "When clicking ‘Execute workflow’" node for testing, or via a Webhook.
2. **Email Processing**: The "Get a message" node fetches the full email content, the "Set Meeting link" node extracts the meeting link from the email, and the "Information Extractor" (powered by OpenAI) processes the email text to identify the meeting URL.
3. **Transcript Retrieval**: A Code node parses the meeting ID from the URL (see the sketch below), and the "Get a transcript" node (Fireflies.ai integration) fetches the full meeting transcript using the extracted meeting ID.
4. **Transcript Processing**: The "Set sentences" and "Set summary" nodes extract structured data (sentences, short summary, overview) from the transcript, and the "Full transcript" node combines all transcript segments into a readable format.
5. **AI Summarization & Email Generation**: Google Gemini models analyze and summarize the transcript in Italian ("Expert Meeting transcripts") and generate a client-friendly recap ("Meeting summary expert"). The "Email writer" node combines the summaries into a cohesive email draft, and the Markdown to HTML nodes format the content for email readability.
6. **Output**: A "Draft email to client" node prepares the final recap, and two Gmail nodes ("Send Full meeting summary" and "Send a message1") dispatch the summaries to the specified recipient.

### Set Up Steps

1. **Configure credentials**: Ensure the following credentials are set up in n8n:
   - Fireflies.ai API (for transcript retrieval)
   - Gmail OAuth2 (for email triggering/sending)
   - OpenAI API (for initial text extraction)
   - Google Gemini (PaLM) (for summarization)
2. **Adjust nodes**:
   - Update the "Gmail Trigger" node with the correct email filter (subject:Your meeting recap).
   - Replace YOUR_EMAIL in the Gmail Send nodes with the recipient’s address.
   - Verify the Code nodes (e.g., meeting ID extraction) match your URL structure.
3. **Deploy**: Activate the workflow.
4. **Test**: Use the Manual Trigger or wait for the Gmail trigger to execute automatically.
5. **Optional customization**: Modify the Google Gemini prompts for different summary styles and adjust the email templates in the final Gmail nodes.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
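For reference, the meeting-ID extraction in step 3 of "How It Works" might look roughly like the sketch below in an n8n Code node. The Fireflies URL format and the `meetingLink` field name are assumptions; compare against a real recap link before relying on it.

```javascript
// n8n Code node: pull the meeting ID out of the Fireflies link from the recap email.
const url = $input.first().json.meetingLink || ''; // field name is a placeholder

// Example assumed format: https://app.fireflies.ai/view/Weekly-Sync::AbC123xyz
// Take the final path segment, then whatever follows the last "::" if present.
const lastSegment = url.split('/').filter(Boolean).pop() || '';
const meetingId = lastSegment.includes('::')
  ? lastSegment.split('::').pop()
  : lastSegment;

return [{ json: { meetingId } }];
```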
by WeblineIndia
## YouTube Transcription, Summarization & Translation to Google Docs

This workflow automates the end-to-end process of converting YouTube videos into structured, multilingual written content. It transcribes the video's speech, optionally summarizes it, translates it into the chosen language, and stores the result in a well-formatted Google Doc — ready for review, sharing, or publication.

### Who’s It For

- Content creators and bloggers repurposing video content.
- Educators and researchers converting lectures into readable notes.
- Marketing teams localizing video material for international audiences.
- Students summarizing and translating study material.
- YouTube viewers who want written notes or blog-ready formats.

### How It Works

1. A Webhook triggers the flow with inputs: youtube_url, language, and enable_summary.
2. A Code node formats these inputs into videoId, originalUrl, language, and enable_summary.
3. An HTTP Request node sends the video to the Supadata API for full transcription.
4. Another Code node combines all transcript segments into one body of text.
5. The Basic LLM Chain node uses the Google Gemini Chat Model to summarize and translate the transcript if requested.
6. A Google Docs node creates a new document with a title based on videoId and language.
7. A final Google Docs node appends the processed summary and translation into the created document.

### How to Set Up

1. **Webhook input:** Send a POST request with three fields: youtube_url, language, enable_summary (see the example request below).
2. **Configure Supadata API:** Add the HTTP URL and Authorization header for transcription.
3. **Set up the Gemini Chat Model:** Use the Google Vertex AI/Gemini integration in the Basic LLM Chain node.
4. **Create Google Docs credentials:** Connect your Google account using OAuth2.
5. **Document naming logic:** You may adjust document titles using expressions (e.g., {{ videoId }} - {{ language }}).

### Requirements

- Supadata API key (or any video-to-text API).
- Google account with Google Docs access.
- Google Gemini access via n8n’s LLM integration.
- n8n Cloud or self-hosted instance.
- Basic understanding of webhook setup (or a form frontend).

### How to Customize

- **Change LLM model:** Swap Gemini with GPT-4 or Claude in the LLM Chain node.
- **Summarization toggle:** Use the enable_summary flag to control verbosity.
- **Document layout:** Customize headings, font styles, and content sections in Google Docs.
- **Multiple languages:** Extend the workflow to translate into multiple languages and generate one document per language.
- **Sharing options:** Add Gmail or Slack nodes to notify users once the document is generated.

### Add‑ons

- **Notion Export:** Send the document summary directly into Notion using the Notion node.
- **Slack Notification:** Notify your team with a link to the Google Doc using the Slack node.
- **Google Sheets Logging:** Log video URLs, timestamps, and languages used for auditing.
- **n8n Forms Integration:** Allow users to submit video URLs and language via a hosted n8n form.

### Use Case Examples

- **Repurposing videos into blogs:** Automatically convert YouTube podcasts into multilingual blog posts.
- **Educational notes:** Extract and translate lecture content into shareable study documents.
- **International marketing teams:** Summarize and localize product explainer videos for different countries.
- **Transcription library:** Create a searchable database of translated transcripts from niche educational YouTube channels.
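For a quick test of the webhook input described above, a request from any JavaScript runtime with `fetch` (Node.js 18+ or a browser console) could look like this sketch. The webhook path is a placeholder; use the test or production URL shown on your Webhook node.

```javascript
// Illustrative test call to the workflow's webhook.
const response = await fetch('https://your-n8n-instance/webhook/youtube-to-docs', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    youtube_url: 'https://www.youtube.com/watch?v=dQw4w9WgXcQ',
    language: 'es',          // target language for the translated summary
    enable_summary: true,    // set false to keep the full transcript only
  }),
});

console.log(await response.json());
```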
### Common Troubleshooting

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| Webhook not triggering | Incorrect webhook URL or POST format | Double-check the payload and content type (application/json) |
| Transcription API fails | Invalid video ID or API key | Validate the YouTube URL and Supadata API access |
| Empty translation/summarization | Transcript was empty or prompt was weak | Ensure the video contains spoken content and refine the prompt |
| Google Doc not created | OAuth2 credentials not authorized properly | Reconnect the Google Docs credentials in n8n |
| Gemini LLM Chain fails | Model misconfigured or request malformed | Verify your model selection and payload structure |

### Need Help?

Need help getting this set up or customizing it for your workflow? ✅ We can help you:

- Set up transcription and translation APIs
- Modify the summarization prompt
- Customize document layouts or automate sharing

👉 Contact WeblineIndia's automation experts!
by Oneclick AI Squad
Automate your payroll process with this efficient workflow. Triggered monthly on the 28th, it fetches employee data from a Google Sheet, uses AI to calculate net salaries with tax and deductions, structures payslip data, generates PDF payslips, and notifies employees via email while alerting HR on Slack. Ensure accurate payroll distribution with minimal manual effort. 💰📧

### Good to Know

- The workflow runs on the 28th of each month to align with typical payroll cycles.
- Ensure AI credentials and Google Sheet access are configured for smooth operation.

### How It Works

1. The **Monthly Payroll Trigger** initiates the process on the 28th.
2. **Gets Employee Data** by reading salary and deduction details from a Google Sheet.
3. **AI Calculates Salary** applies tax and deduction rules to compute net pay (a deterministic sketch of this step appears below).
4. **Formats Payslip Data** prepares structured data for distribution.
5. **Generates PDF Payslip** creates individual payslip documents.
6. Logs payroll data to a Google Sheet for records.
7. Branches to:
   - **Sends Email Payslip to Employee** with the PDF attachment.
   - **Notifies HR on Slack** with payroll completion details.

### How to Use

- Use the manual trigger for testing, then set a monthly cron (e.g., 0 0 28 * *) for live runs on the 28th.
- Adjust tax and deduction rules in the AI node to match local regulations.

### Requirements

- **GOOGLE_SHEET_ID**: Your Google Sheet ID (structured as below)
- **Credentials needed:**
  - Google Sheets OAuth2
  - Gmail API key
  - OpenAI API key (or similar)
  - Slack bot token (with chat:write permissions)
- **Customize:**
  - Employee data columns (e.g., ID, Name, Base Salary, Deductions)
  - Tax and deduction formulas
  - Email subject and Slack message format

### Google Sheet Structure

Create a sheet with these columns:

- Employee ID
- Name
- Base Salary
- Deductions
- Net Salary
- Payslip Status
- Updated At

### Customizing This Workflow

- Adapt for bi-weekly payroll by adjusting the trigger to the 14th and 28th.
- Integrate with HR systems like BambooHR for real-time employee updates.
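If you prefer a deterministic calculation over (or alongside) the AI salary step, a minimal n8n Code node sketch might look like this. The flat 10% tax rate is purely a placeholder assumption; replace it with your local rules, and keep the column names aligned with the Google Sheet structure above.

```javascript
// n8n Code node: placeholder net-salary calculation, not real tax logic.
const TAX_RATE = 0.10; // assumption only; adjust to your jurisdiction

return items.map(item => {
  const baseSalary = Number(item.json['Base Salary']) || 0;
  const deductions = Number(item.json['Deductions']) || 0;

  const tax = baseSalary * TAX_RATE;
  const netSalary = baseSalary - tax - deductions;

  return {
    json: {
      ...item.json,
      Tax: Math.round(tax * 100) / 100,
      'Net Salary': Math.round(netSalary * 100) / 100,
    },
  };
});
```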
by Evoort Solutions
## 🎁 Automate YouTube Giveaway Winner Selection with YouTube Comments Scraper API

Easily automate your YouTube video giveaways using n8n and the YouTube Comments Scraper API. This workflow fetches comments, selects a random winner, logs results to Google Sheets, and notifies the admin — all hands-free!

### 🧩 Node-by-Node Breakdown

| Node | Name | Purpose |
| --- | --- | --- |
| 1️⃣ | Form Trigger | Captures a YouTube video URL from a user via form submission. |
| 2️⃣ | Fetch YouTube Comments | Makes a POST request to the YouTube Comments Scraper API to retrieve comments. |
| 3️⃣ | Check API Response Status | Ensures that the response status is 200 before proceeding. |
| 4️⃣ | Select Random Commenter | Parses the comments and selects a random commenter as the giveaway winner. |
| 5️⃣ | Log Winner to Google Sheet | Appends the winner name, video URL, and date to a Google Sheet for record-keeping. |
| 6️⃣ | Notify Winner Email | Sends a congratulatory email to the admin with the selected winner's name. |
| 7️⃣ | Notify: Invalid API Response | If the API fails, sends an alert to the admin about the issue. |

### 🔑 How to Get Your RapidAPI Key

To use the YouTube Comments Scraper API, follow these steps:

1. Go to the YouTube Comments Scraper API.
2. Sign in or create a free RapidAPI account.
3. Click the "Subscribe to Test" button.
4. Copy your x-rapidapi-key from the "Code Snippets" or "Header Parameters" section.
5. Paste it into your HTTP Request node in n8n.

### 🎯 Use Case & Benefits

✅ **Use case:** Automatically pick a random commenter from a YouTube video as a giveaway winner.

🚀 **Benefits:**

- **Fully automated** – no manual comment scanning or random selection.
- **Accurate & fair** – random selection from valid commenters only.
- **Time-saving** – especially for creators running multiple giveaways.
- **Integrated logging** – keep a historical record of all winners in Google Sheets.
- **Email alerts** – get notified whether the flow succeeds or fails.

### 👥 Who Is This For?

- **YouTube content creators** running giveaways.
- **Marketing teams** promoting products via YouTube contests.
- **Agencies** managing influencer campaigns.
- **Developers & automation enthusiasts** looking to simplify giveaway processes.

### 💡 Why Use the YouTube Comments Scraper API?

The YouTube Comments Scraper API offers a simple and effective way to extract public YouTube comments programmatically. It’s fast, reliable, and integrates smoothly with platforms like n8n. You’ll use this API:

- To retrieve all comments from a YouTube video.
- To power fair and transparent giveaways.
- To trigger downstream automations like winner logging and notification.

Create your free n8n account and set up the workflow in just a few minutes using the link below:

👉 Start Automating with n8n

Save time, stay consistent, and grow your YouTube presence effortlessly!
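For reference, the "Select Random Commenter" step could be implemented in an n8n Code node roughly like this. The `comments` array and `author` field are assumptions about the scraper API's response shape; adjust them to the payload you actually receive.

```javascript
// n8n Code node: pick one winner from the scraped comments.
const comments = $input.first().json.comments || [];

// Deduplicate by author so each person gets one entry regardless of
// how many comments they left.
const uniqueAuthors = [...new Set(comments.map(c => c.author).filter(Boolean))];

if (uniqueAuthors.length === 0) {
  throw new Error('No valid commenters found, cannot pick a winner.');
}

const winner = uniqueAuthors[Math.floor(Math.random() * uniqueAuthors.length)];

return [{
  json: {
    winner,
    totalEntrants: uniqueAuthors.length,
    pickedAt: new Date().toISOString(),
  },
}];
```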
by Rahul Joshi
This workflow automates team capacity monitoring using Jira data to identify over-allocated team members and alert managers instantly. It ensures proactive workload management by fetching active issues, calculating utilization rates, logging capacity metrics, and sending detailed email alerts when members exceed 100% capacity. It helps project managers prevent burnout, balance workloads, and maintain operational efficiency — all with zero manual tracking.

### What This Workflow Does (Step-by-Step)

1. 🟢 **Manual Trigger** – Start the capacity analysis manually on demand.
2. 📋 **Fetch Active Jira Issues** – Retrieves all “In Progress” tasks from Jira to analyze workloads.
3. ✅ **Data Validation** – Checks whether Jira returned valid data before continuing.
   - True path: moves to capacity calculation.
   - False path: logs the query failure to the error tracking sheet.
4. 📊 **Capacity Calculator** – Aggregates logged hours per user and calculates the utilization percentage based on an 8-hour daily capacity (a sketch of this calculation appears below).
5. 📈 **Log Capacity Data to Tracking Sheet** – Appends capacity metrics (Assignee, Total Hours, Utilization %, Status) to a Google Sheet for historical tracking and trend analysis.
6. ⚠️ **Over-Allocation Check** – Identifies team members exceeding 100% utilization (status = “Overallocated”).
7. 📢 **Alert Report Generator** – Builds a dynamic report summarizing all over-allocated members, their logged hours, utilization %, and corrective suggestions. Generates both alert and “All Clear” reports based on findings.
8. 📧 **Send Over-Allocation Alert to Manager** – Sends an automated Gmail alert to the project manager, including severity-based subject lines and a detailed breakdown of each over-allocated member.
9. 🚨 **Log Query Failures to Error Sheet** – Records any Jira API or data retrieval issues in the error log sheet for monitoring and debugging.

### Prerequisites

- Jira account with API access
- Google Sheets for “Team Capacity Tracking” and “Error Log”
- Gmail credentials for automated email delivery

### Key Benefits

- ✅ Early detection of team over-allocation
- ✅ Automated data logging and historical tracking
- ✅ Real-time email alerts to prevent burnout
- ✅ Data-driven sprint planning and workload balancing
- ✅ Zero manual monitoring required

### Perfect For

- Project managers and Scrum Masters tracking team load
- Engineering teams managing multiple active sprints
- Organizations looking to automate workload visibility
- HR and PMOs monitoring resource utilization trends
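Below is a minimal sketch of the capacity calculation, assuming each incoming item is a Jira issue whose fields include an assignee display name and `timespent` in seconds; adapt the field paths to whatever your Jira query actually returns.

```javascript
// n8n Code node: aggregate logged hours per assignee and flag over-allocation.
const DAILY_CAPACITY_HOURS = 8;

const totals = {};
for (const item of items) {
  const assignee = item.json.fields?.assignee?.displayName || 'Unassigned';
  const hours = (item.json.fields?.timespent || 0) / 3600; // seconds -> hours
  totals[assignee] = (totals[assignee] || 0) + hours;
}

return Object.entries(totals).map(([assignee, totalHours]) => {
  const utilization = Math.round((totalHours / DAILY_CAPACITY_HOURS) * 100);
  return {
    json: {
      assignee,
      totalHours: Math.round(totalHours * 10) / 10,
      utilizationPercent: utilization,
      status: utilization > 100 ? 'Overallocated' : 'OK',
    },
  };
});
```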
by Robert Breen
This workflow pulls all tasks from your Monday.com board each day and logs them into a Google Sheet. It creates a daily snapshot of your project’s progress and statuses for reporting, tracking, or analysis.

### ⚙️ Setup Instructions

1️⃣ **Connect the Monday.com API**

- In Monday.com, go to Admin → API and copy your Personal API Token (docs: Generate Monday API Token).
- In n8n, go to Credentials → New → Monday.com API, paste your token, and save.

2️⃣ **Prepare your Google Sheet**

- Copy this template to your own Google Drive: Google Sheet Template.
- Add your data in rows 2–100 and make sure each new task row starts with Added = No.
- Connect Google Sheets in n8n: go to Credentials → New → Google Sheets (OAuth2), log in with your Google account, and grant access.
- In the workflow, select your Spreadsheet ID and the correct Sheet Name.

### 🧠 How it works

- **Trigger**: Runs on click or via schedule (e.g., daily at 9 AM).
- **Get many items (Monday.com)**: Fetches all tasks and their current status.
- **Today's Date node**: Adds the current date for snapshot logging.
- **Map Fields**: Normalizes the task name and status (see the sketch below).
- **Google Sheets (Append)**: Saves all tasks with status + date into your sheet for historical tracking.

### 📬 Contact

Need help customizing this (e.g., filtering by status, emailing daily reports, or adding charts)?

📧 robert@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
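The "Map Fields" step could be written as an n8n Code node roughly like the sketch below. It assumes the Monday.com node returns one item per board item with a `name` and a `column_values` array; the `status` column ID is a placeholder, since column IDs differ per board.

```javascript
// n8n Code node: normalize Monday.com items into rows for the Google Sheet.
const today = new Date().toISOString().slice(0, 10); // YYYY-MM-DD snapshot date

return items.map(item => {
  const columns = item.json.column_values || [];
  const statusColumn = columns.find(c => c.id === 'status'); // placeholder column ID

  return {
    json: {
      Task: item.json.name,
      Status: statusColumn ? statusColumn.text : 'Unknown',
      Date: today,
    },
  };
});
```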
by automedia
## Transcribe New YouTube Videos and Save to Supabase

### Who's It For?

This workflow is for content creators, marketers, researchers, and anyone who needs to quickly get text transcripts from YouTube videos. If you analyze video content, repurpose it for blogs or social media, or want to make videos searchable, this template will save you hours of manual work.

### What It Does

This template automatically monitors multiple YouTube channels for new videos. When a new video is published, it extracts the video ID, retrieves the full transcript using the youtube-transcript.io API, and saves the structured data — including the title, author, URL, and full transcript — into a Supabase table. It intelligently filters out YouTube Shorts by default and includes error handling to ensure that only successful transcriptions are processed.

### Requirements

- A Supabase account with a table ready to receive the video data.
- An API key from youtube-transcript.io (offers a free tier).
- The Channel ID for each YouTube channel you want to track. You can find this using a free online tool like TunePocket's Channel ID Finder.

### How to Set Up

1. **Add channel IDs:** In the "Channels To Track" node, replace the example YouTube Channel IDs with your own. The workflow uses these IDs to create RSS links and find new videos (see the sketch below).
2. **Configure API credentials:** Find the "youtube-captions" HTTP Request node. In the credentials tab, create a new "Header Auth" credential. Name it youtube-transcript-io and paste your API key into the "Value" field. The "Name" field should be x-api-key.
3. **Connect your Supabase account:** Navigate to the "Add to Content Queue Table" node. Create new credentials for your Supabase account using your Project URL and API key. Once connected, select your target table and map the incoming fields (title, source_url, content_snippet, etc.) to the correct columns in your table.
4. **Set your schedule (optional):** The workflow starts with a manual trigger. To run it automatically, replace the "When clicking ‘Execute workflow’" node with a Schedule node and set your desired interval (e.g., once a day).
5. **Activate the workflow:** Save your changes and toggle the workflow to Active in the top right corner.

### How to Customize

- **Transcribe YouTube Shorts:** To include Shorts in your workflow, select the "Does url exist?" If node and delete the second condition that checks for youtube.com/shorts.
- **Change your database:** Don't use Supabase? Simply replace the "Add to Content Queue Table" node with another database or spreadsheet node, such as **Google Sheets**, **Airtable**, or n8n's own Table.
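Here is a minimal sketch of how the "Channels To Track" step can turn channel IDs into RSS feed URLs for the rest of the workflow; the channel IDs shown are placeholders, and the exact output field names may differ from the template's own node.

```javascript
// n8n Code node: one item per channel, each with its YouTube RSS feed URL.
const channelIds = [
  'UCxxxxxxxxxxxxxxxxxxxxxx', // replace with your own channel IDs
  'UCyyyyyyyyyyyyyyyyyyyyyy',
];

return channelIds.map(channelId => ({
  json: {
    channelId,
    feedUrl: `https://www.youtube.com/feeds/videos.xml?channel_id=${channelId}`,
  },
}));
```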
by BizThrive.ai
Turn your Telegram bot into a real-time research assistant with this intelligent n8n workflow. Designed for founders, analysts, and knowledge workers, this automation uses Perplexity Sonar and Sonar Pro to deliver concise, citation-rich answers to complex queries — directly inside Telegram.

### 🔍 What It Does

- ✅ **Smart Query Routing** – Automatically selects the right tool based on query complexity: Sonar for fast lookups, Sonar Pro for multi-source synthesis.
- 📚 **Cited Research Summaries** – Includes clickable URLs from Perplexity’s source data for transparency and auditability.
- 🧠 **Session Memory** – Maintains chat context using the Telegram chat ID for follow-up questions and threaded insight.
- 🔐 **Secure Access Filter** – Restricts bot usage to authorized Telegram users (see the sketch below).
- ⚙️ **Customizable Agent Behavior** – Easily adjust tone, tool preferences, and citation style via the system message.

### 🚀 Use Cases

- Market research & competitor analysis
- Academic and scientific deep-dives
- Legal and transcript summarization
- Podcast, video, and trend monitoring
- Personal AI assistant for founders and consultants

### 🛠 Setup Instructions

1. Create a Telegram bot via @BotFather and add your token.
2. Add your OpenAI and Perplexity API keys.
3. Update the filter node with your Telegram user ID.
4. Deploy and start chatting — responses appear in Telegram.
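If you'd rather express the access filter in code than in the built-in filter node, a Code node version might look like this sketch. The `message.from.id` path matches the Telegram Trigger's usual output, but verify it against your own trigger data and replace the placeholder IDs with your authorized user IDs.

```javascript
// n8n Code node: drop messages from anyone not on the allow-list.
const AUTHORIZED_USER_IDS = [123456789]; // placeholder Telegram user IDs

const fromId = $input.first().json.message?.from?.id;

if (!AUTHORIZED_USER_IDS.includes(fromId)) {
  return []; // unauthorized: no items continue down this branch
}

return [{ json: $input.first().json }];
```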
by Yehor EGMS
## 🔐 n8n Workflow: Access Control for Internal Chats or Chatbots

This n8n workflow helps you restrict access to your internal chats or chatbots so that only authorized team members can interact with them. It's perfect for setups using Telegram, Slack, or other corporate messengers, where you need to prevent unauthorized users from triggering internal automations.

### 📌 Section 1: Trigger & Input

⚡ **Receive Message (Telegram Trigger)**

- **Purpose:** Captures every incoming message from a user interacting with your Telegram bot (or another messenger).
- **How it works:** When a user sends a message, it instantly triggers the workflow and passes their username or ID as input data.
- **Benefit:** Acts as the entry point for verifying whether a user is allowed to proceed.

### 📌 Section 2: Access Table Lookup

📋 **User Access Table (Data Node / Spreadsheet / DB Query)**

- **Purpose:** Stores all your team members and their current access status.
- **Benefit:** Centralized access control — you can easily update user permissions without editing the workflow.

Structure example:

| Username | Access Status |
|----------|---------------|
| user1 | granted |
| user2 | denied |
| user3 | granted |

### 📌 Section 3: Permission Check

🧩 **Check Access (IF Node)**

- **Purpose:** Compares the incoming user's name or ID against the access table (a code-based sketch of this check appears below).
- **Logic:**
  - If status = granted → allow the message to continue.
  - If status = denied → stop workflow execution.
- **Benefit:** Ensures only approved users can interact with your automations or receive responses.

### 📌 Section 4: Response Handling

💬 **Send Reply (Telegram Node)**

- **Purpose:** Sends a message back to the user depending on their access level.
- **Paths:**
  - ✅ Granted: Sends the normal bot response or triggers the main process.
  - ❌ Denied: Sends no reply (or an optional "Access denied" message).
- **Benefit:** Prevents unauthorized access while maintaining a seamless experience for approved users.

### 📊 Workflow Overview Table

| Section | Node Name | Purpose |
|---------|-----------|---------|
| 1. Trigger | Receive Message | Captures incoming messages |
| 2. Access Table | User Access Table | Stores usernames + permissions |
| 3. Check | Check Access | Verifies if the user has permission |
| 4. Response | Send Reply | Sends or blocks the response based on status |

### 🎯 Key Benefits

- 🔐 **Secure access control:** Only trusted users can trigger your internal automations.
- ⚙️ **Dynamic management:** Easily update user permissions from a table or database.
- 🧠 **Lightweight setup:** Just three nodes create a fully functional access gate.
- 🚀 **Scalable foundation:** Extend it with role-based access or activity logging later.
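Here is a code-based sketch of the permission check in Section 3, assuming the access table rows arrive with `Username` and `Access Status` columns and that the trigger node is named "Receive Message" as above; adjust the field names to match your own table and trigger output.

```javascript
// n8n Code node: look up the sender in the access table rows received as items.
const username = $('Receive Message').first().json.message?.from?.username;

const accessRows = items.map(item => item.json);
const row = accessRows.find(r => r.Username === username);

const hasAccess = Boolean(row && row['Access Status'] === 'granted');

// Downstream, an IF node can route on `hasAccess` (true -> reply, false -> stop).
return [{ json: { username, hasAccess } }];
```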
by Fahmi Fahreza
## Analyze Trustpilot & Sitejabber sentiment with Decodo + Gemini to Sheets

Sign up for Decodo HERE for a discount.

This template scrapes public reviews from Trustpilot and Sitejabber with a Decodo tool, converts the findings into flat, spreadsheet-ready JSON, generates a concise sentiment summary with Gemini, and appends everything to Google Sheets. It’s ideal for reputation snapshots, competitive analysis, or lightweight BI pipelines that need structured data and a quick narrative.

### Who’s it for?

Marketing teams, growth analysts, founders, and agencies who need repeatable review collection and sentiment summaries without writing custom scrapers or manual copy/paste.

### How it works

1. A Form Trigger collects the business name or URL.
2. A Set node (Config Variables) stores business_name, spreadsheet_id, and sheet_id.
3. The Agent orchestrates the Decodo tool and enforces a strict JSON schema with at most 10 reviews per source (an illustrative example of the expected shape appears below).
4. Gemini writes a succinct summary and recommendations, noting missing sources with: “There’s no data in this website.”
5. A Merge node combines the JSON fields with the narrative.
6. Google Sheets appends a row.

### How to set up

1. Add Google Sheets, Gemini, and Decodo credentials in the Credential Manager.
2. Replace (YOUR_SPREADSHEET_ID) and (YOUR_SHEET_ID) in Set: Config Variables.
3. In Google Sheets, select "Define below" and map each column explicitly.
4. Keep the parser and agent connections intact to guarantee flat JSON.
5. Activate, open the form URL, submit a business, and verify the appended row.
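To make the "flat, spreadsheet-ready JSON" concrete, the row the agent produces might look something like the sketch below. The field names here are illustrative assumptions; the authoritative list is the output parser schema inside the workflow and the column mapping you define in the Google Sheets node.

```javascript
// Illustrative shape of one appended row, not the workflow's exact schema.
const exampleRow = {
  business_name: 'Acme Inc',
  trustpilot_rating: 4.2,
  trustpilot_review_count: 10,   // capped at 10 reviews per source
  sitejabber_rating: null,       // noted as "There's no data in this website."
  sitejabber_review_count: 0,
  overall_sentiment: 'mostly positive',
  summary: 'Customers praise support response times; shipping delays are the main complaint.',
};

console.log(JSON.stringify(exampleRow, null, 2));
```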