by Daniel
Spark your creativity instantly in any chat—turn a simple prompt like "heartbreak ballad" into original, full-length lyrics and a professional AI-generated music track, all without leaving your conversation.

📋 What This Template Does

This chat-triggered workflow harnesses AI to generate detailed, genre-matched song lyrics (at least 600 characters) from user messages, then queues them for music synthesis via Fal.ai's minimax-music model. It polls asynchronously until the track is ready, delivering the lyrics and audio URL back in chat.

- Crafts original, structured lyrics with verses, choruses, and bridges using OpenAI
- Submits to Fal.ai for melody, instrumentation, and vocals aligned to the style
- Handles long-running generations with smart looping and status checks
- Returns the complete song package (lyrics + audio link) for seamless sharing

🔧 Prerequisites

- n8n account (self-hosted or cloud with chat integration enabled)
- OpenAI account with API access for GPT models
- Fal.ai account for AI music generation

🔑 Required Credentials

OpenAI API Setup
1. Go to platform.openai.com → API keys (sidebar)
2. Click "Create new secret key" → Name it (e.g., "n8n Songwriter")
3. Copy the key and add it to n8n as an "OpenAI API" credential
4. Test by sending a simple chat completion request

Fal.ai HTTP Header Auth Setup
1. Sign up at fal.ai → Dashboard → API Keys
2. Generate a new API key → Copy it
3. In n8n, create an "HTTP Header Auth" credential: Name="Fal.ai", Header Name="Authorization", Header Value="Key [Your API Key]"
4. Test with a simple GET to their queue endpoint (e.g., /status)

⚙️ Configuration Steps

1. Import the workflow JSON into your n8n instance
2. Assign OpenAI API credentials to the "OpenAI Chat Model" node
3. Assign Fal.ai HTTP Header Auth to the "Generate Music Track", "Check Generation Status", and "Fetch Final Result" nodes
4. Activate the workflow—the chat trigger will appear in your n8n chat interface
5. Test by messaging: "Create an upbeat pop song about road trips"

🎯 Use Cases

- **Content Creators**: YouTubers generating custom jingles for videos on the fly, streamlining production from idea to audio export
- **Educators**: Music teachers using chat prompts to create era-specific folk tunes for classroom discussions, fostering interactive learning
- **Gift Personalization**: Friends crafting anniversary R&B tracks from shared memories via quick chats, delivering emotional audio surprises
- **Artist Brainstorming**: Songwriters prototyping hip-hop beats in real time during sessions, accelerating collaboration and iteration

⚠️ Troubleshooting

- **Invalid JSON from AI Agent**: Ensure the system prompt stresses valid JSON; test the agent standalone with a sample query
- **Music Generation Fails (401/403)**: Verify the Fal.ai API key has minimax-music access; check usage quotas in the dashboard
- **Status Polling Loops Indefinitely**: Bump the wait time to 45-60s for complex tracks; inspect fal.ai queue logs for bottlenecks
- **Lyrics Under 600 Characters**: Tweak the agent prompt to enforce fuller structures like V1V2[C]; verify output length in executions
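The queue-then-poll pattern described above (submit, check status, wait, fetch) can be sketched as a generic loop. This Python sketch is illustrative only; the status strings and the exact Fal.ai queue endpoints are assumptions, abstracted behind callables so the pattern stays independent of the real API:

```python
import time

def poll_until_ready(check_status, fetch_result, max_attempts=20,
                     wait_seconds=30, sleep=time.sleep):
    """Generic submit-then-poll loop used by queue-based APIs.

    check_status() returns a status string and fetch_result() returns the
    finished payload; both wrap the actual HTTP calls, so this sketch
    makes no claims about Fal.ai's URL scheme. Status names are assumed.
    """
    for _ in range(max_attempts):
        status = check_status()
        if status == "COMPLETED":
            return fetch_result()
        if status in ("FAILED", "CANCELLED"):
            raise RuntimeError(f"generation ended with status {status}")
        sleep(wait_seconds)  # the workflow's Wait node plays this role
    raise TimeoutError("track not ready after polling window")
```

Raising the `wait_seconds` here corresponds to the 45-60s bump suggested in Troubleshooting for complex tracks.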
by Cole Hallgarth
This workflow automatically filters and organizes incoming Gmail emails using keyword-based classification. The goal is to reduce inbox noise and automatically organize repetitive types of emails so that important messages remain visible while unsolicited or promotional emails are handled automatically.

How it works

When a new email arrives in Gmail, the workflow is triggered and retrieves the full email content. It then analyzes the subject line and email snippet for keywords commonly associated with different types of emails. Based on this analysis, the workflow classifies the message into one of several categories and performs actions such as applying Gmail labels, marking the email as read, or removing it from the inbox. This allows you to automatically separate unsolicited outreach, marketing emails, and legitimate communication without manually reviewing every message.

How to use

The workflow is triggered each time a new email comes in, but you can change the trigger based on your own needs. The email categories can be adjusted to your needs as well. Make sure to follow the sticky notes in the workflow to set up your Gmail labels before building the workflow.

Requirements

- Gmail connection
- Optional: Google Sheets connection

Customizing this workflow

Change the email categories based on your real-life needs, and if you don't need the Google Sheet for logging and debugging purposes, feel free to omit it.
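The keyword-matching step at the heart of the classification can be approximated in a few lines. A minimal sketch, assuming hypothetical category names and keyword lists (the workflow's real categories and actions live in its nodes):

```python
# Hypothetical keyword map; the workflow's actual categories are configurable.
CATEGORY_KEYWORDS = {
    "marketing": ["unsubscribe", "% off", "limited time", "sale"],
    "cold_outreach": ["quick call", "partnership", "book a demo"],
}

def classify_email(subject, snippet, keyword_map=CATEGORY_KEYWORDS,
                   default="keep"):
    """Match subject + snippet against keyword lists; first hit wins.

    Anything that matches nothing falls through to the default, mirroring
    the workflow's behavior of leaving legitimate mail in the inbox.
    """
    text = f"{subject} {snippet}".lower()
    for category, keywords in keyword_map.items():
        if any(kw in text for kw in keywords):
            return category
    return default
```

Each returned category would then map to a Gmail action (apply label, mark read, archive) in a downstream node.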
by Snehasish Konger
Target audience

Solo creators, PMs, and content teams who queue LinkedIn ideas in Google Sheets and want them posted on a fixed schedule with AI-generated copy.

How it works

The workflow runs on a schedule (Mon/Wed/Fri at 09:30). It pulls the first Google Sheet row with Status = Pending, generates a LinkedIn-ready post from Post title using an OpenAI prompt, publishes to your LinkedIn profile, then updates the same row to Done and writes the final post back to the sheet.

Prerequisites (use your own credentials)

- **Google Sheets (OAuth2)** with access to the target sheet
- **LinkedIn OAuth2** tied to the account that should post — set the **person** field to your profile’s URN in the LinkedIn node
- **OpenAI API key** for the Chat Model node

Store secrets in n8n Credentials. Never hard-code keys in nodes.

Google Sheet structure (exact columns)

Minimum required columns:
- id — unique integer/string used to update the same row later
- Status — allowed values: Pending or Done
- Post title — short prompt/topic for the AI model

Recommended columns:
- Output post — where the workflow writes the final text (use this header, or keep your existing Column 5)
- Hashtags (optional) — comma-separated list (the prompt can append these)
- Image URL (optional) — public URL; add an extra LinkedIn “Create Post” input if you post with media later
- Notes (optional) — extra hints for tone, audience, or CTA

Example header row:
id | Status | Post title | Hashtags | Image URL | Output post | Notes

Example rows (inputs → outputs):
1 | Pending | Why I moved from Zapier to n8n | #automation,#nocode | | | Focus on cost + flexibility
2 | Done | 5 lessons from building a rules engine | #product,#backend | | This is the final posted text... |

Resulting Output post (for row 1 after publish):

I switched from Zapier to n8n for three reasons: control, flexibility, and cost. Here’s what changed in my stack and what I’d repeat if I had to do it again.
#automation #nocode

> If your sheet already has a column named Column 5, either rename it to Output post and update the mapping in the final Google Sheets Update node, or keep Column 5 as is and leave the node mapping untouched.

Step-by-step

1. Schedule Trigger: Runs on Mon/Wed/Fri at 09:30.
2. Fetch pending rows (Google Sheets → Get Rows): Reads the sheet and filters rows where Status = Pending.
3. Limit: Keeps only the first pending row so one post goes out per run.
4. Writing the post (Agent + OpenAI Chat Model + Structured Output Parser): Uses Post title (and optional Notes/Hashtags) as input. The agent returns JSON with a post field. Model set to gpt-4o-mini by default.
5. Create a post (LinkedIn): Publishes {{$json.output.post}} to the configured person (your profile URN).
6. Update the sheet (Google Sheets → Update): Matches by id, sets Status = Done, and writes the generated text into Output post (or your existing output column).

Customization

- **Schedule** — change days/time in the Schedule node. Consider your n8n server timezone.
- **Posts per run** — remove or raise the **Limit** to post more than one item.
- **Style and tone** — edit the Agent’s system prompt. Add rules for line breaks, hashtags, or a closing CTA.
- **Hashtags handling** — parse the Hashtags column in the prompt so the model appends them cleanly.
- **Media posts** — add a branch that attaches Image URL (requires LinkedIn media upload endpoints).
- **Company Page** — switch the **person** field to an **organization** URN tied to your LinkedIn app scope.

Troubleshooting

- **No post created**: Check the If/Limit path: is there any row with Status = Pending? Confirm the sheet ID and tab name in the Google Sheets nodes.
- **Sheet not updating**: The Update node must receive the original id. If you changed field names, remap them. Make sure id values are unique.
- **LinkedIn errors (403/401/404)**: Refresh LinkedIn OAuth2 in Credentials. The person/organization URN is wrong or missing; copy the exact URN from the LinkedIn node helper.
  App lacks required permissions for posting.
- **Rate limit (429) or model errors**: Add a short Wait before retries. Switch to a lighter model or simplify the prompt.
- **Post too long or broken formatting**: LinkedIn’s hard limit is ~3,000 characters. Add a truncation step in Code or instruct the prompt to cap the length. Replace double line breaks in the LinkedIn node if you see odd spacing.
- **Timezone mismatch**: The Schedule node uses the n8n instance timezone. Adjust it, or move to a Cron with an explicit TZ if needed.

Need to post at a different cadence, or push two posts per day? Tweak the Schedule and Limit nodes and you’re set.
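For the truncation step mentioned under "Post too long", a Code-node-style helper might look like this. A sketch assuming the commonly cited ~3,000-character cap; the ellipsis suffix and word-boundary trimming are illustrative choices, not part of the template:

```python
LINKEDIN_MAX = 3000  # commonly cited LinkedIn post character cap

def cap_post_length(text, limit=LINKEDIN_MAX, suffix="..."):
    """Truncate at a word boundary so the post never exceeds the limit."""
    if len(text) <= limit:
        return text
    cut = text[: limit - len(suffix)]
    cut = cut.rsplit(" ", 1)[0]  # avoid chopping a word in half
    return cut + suffix
```

In the workflow this would sit in a Code node between the Agent and the LinkedIn "Create Post" node, or be replaced by a length instruction in the prompt.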
by Matt Chong
Who is this for?

This workflow is for professionals, entrepreneurs, or anyone overwhelmed by a cluttered Gmail inbox. If you want to automatically archive low-priority emails using AI, this is the perfect hands-free solution.

What does it solve?

Your inbox fills up with old, read emails that no longer need your attention, but manually archiving them takes time. This workflow uses AI to scan each email and intelligently decide whether it should be archived, needs a reply, or is spam. It helps you:

- Declutter your Gmail inbox automatically
- Identify important vs. unimportant emails
- Save time with smart email triage

How it works

1. A scheduled trigger runs the workflow (you set how often).
2. It fetches all read emails older than 45 days from Gmail.
3. Each email is passed to an AI model (GPT-4) that classifies it as Actionable or Archive.
4. If the AI recommends archiving, the workflow archives the email from your inbox.
5. All other emails are left untouched so you can review them as needed.

How to set up?

1. Connect your Gmail (OAuth2) and OpenAI API credentials.
2. Open the "Schedule Trigger" node and choose how often the workflow should run (e.g., daily, weekly).
3. Optionally adjust the Gmail filter in the “List Old Emails” node to change which emails are targeted.
4. Start the workflow and let AI clean up your inbox automatically.

How to customize this workflow to your needs

- **Change the Gmail filter**: Edit the query in the Gmail node to include other conditions (e.g., older_than:30d, specific labels, unread only).
- **Update the AI prompt**: Modify the prompt in the Function node to detect more nuanced categories like “Meeting Invite” or “Newsletter.”
- **Adjust schedule frequency**: Change how often the cleanup runs (e.g., hourly, daily).
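The "List Old Emails" filter is ultimately a Gmail search string. A small sketch of how such a query could be composed from the customization options above, using standard Gmail search operators (older_than:, is:read, label:); the function itself is hypothetical, not a node from the workflow:

```python
def build_gmail_query(older_than_days=45, read_only=True, label=None):
    """Compose a Gmail search filter from the workflow's tunable options.

    Defaults mirror this template: read emails older than 45 days.
    The operators used are standard Gmail search syntax.
    """
    parts = [f"older_than:{older_than_days}d"]
    if read_only:
        parts.append("is:read")
    if label:
        parts.append(f"label:{label}")
    return " ".join(parts)
```

Pasting the resulting string into the Gmail node's query field is all the "Change the Gmail filter" customization amounts to.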
by Yenire
How it works

• Receives user identity data from multiple channels (WhatsApp, Telegram, email).
• Generates and stores secure one-time passwords (OTP).
• Validates OTP codes against Postgres records.
• Centralizes user identity across channels.
• Maintains session continuity for onboarding and verification.

Set up steps

• Import the workflow into n8n.
• Configure Postgres or Supabase credentials.
• Download and create the required database tables in Supabase.
• Connect messaging channel credentials.
• Define OTP expiration and validation rules.
• Test using sample user verification scenarios.
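The OTP generation and validation rules can be sketched outside n8n as plain functions. The 5-minute TTL below is an assumed default rather than a value from this workflow, and the Postgres storage step is omitted:

```python
import secrets
from datetime import datetime, timedelta, timezone

OTP_TTL = timedelta(minutes=5)  # assumed expiration window; make it configurable

def generate_otp(digits=6):
    """Return a cryptographically random numeric code plus its expiry time."""
    code = "".join(secrets.choice("0123456789") for _ in range(digits))
    return code, datetime.now(timezone.utc) + OTP_TTL

def validate_otp(submitted, stored_code, expires_at, now=None):
    """Reject expired codes first, then compare in constant time."""
    now = now or datetime.now(timezone.utc)
    if now > expires_at:
        return False  # expired — user must request a new code
    return secrets.compare_digest(submitted, stored_code)
```

In the workflow, `stored_code` and `expires_at` would come from the Postgres/Supabase table created during setup, keyed by the user's channel identity.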
by M Ayoub
Who is this for?

DevOps engineers, sysadmins, and website owners who manage multiple domains and need proactive SSL certificate expiration monitoring without manual checks.

What it does

Automatically monitors SSL certificates across multiple domains, tracks expiration status in a Google Sheet dashboard, and sends beautifully formatted HTML email alerts before certificates expire.

✅ No API rate limits — uses direct OpenSSL commands, so you can scan unlimited domains with zero API costs or restrictions.

How it works

1. Triggers on a schedule (every 3 days at 10AM)
2. Reads the domain list from your Google Sheet
3. Checks each domain's SSL certificate using OpenSSL commands
4. Parses expiration dates and issuer info, and calculates days remaining
5. Updates the Google Sheet with the current status for all domains
6. Sends styled email alerts only when certificates are expiring soon

Set up steps

1. Connect your Google Sheets OAuth2 credentials
2. Create a Google Sheet with these columns: Domain, Expiry Date, Days Left, Status, Issuer, Last Checked (the workflow matches on the Domain column to update results)
3. Add your domains to scan in the Domain column
4. Update the Sheet ID in the Read Domain List from Google Sheets and Update Google Sheet with Results nodes
5. Connect SMTP credentials in the Send Alert Email via SMTP node
6. Optionally adjust ALERT_THRESHOLD_DAYS in two nodes: Prepare Domain List and Set Threshold, and Parse SSL Results and Identify Expiring Certs (default: 20 days)

Setup time: ~10 minutes
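The parsing step works on the output of `openssl x509 -noout -enddate`, which prints a line like `notAfter=Jun  1 12:00:00 2026 GMT`. A sketch of how that line could be turned into the sheet's Days Left and Status columns; the status labels here are illustrative, not necessarily the workflow's exact strings:

```python
from datetime import datetime, timezone

def parse_not_after(openssl_line):
    """Parse the 'notAfter=...' line from `openssl x509 -noout -enddate`."""
    raw = openssl_line.strip().split("=", 1)[1]  # e.g. 'Jun  1 12:00:00 2026 GMT'
    dt = datetime.strptime(raw, "%b %d %H:%M:%S %Y %Z")
    return dt.replace(tzinfo=timezone.utc)

def days_left(expiry, now=None):
    now = now or datetime.now(timezone.utc)
    return (expiry - now).days

def status_for(days, threshold=20):
    """Map days remaining to a status; threshold mirrors ALERT_THRESHOLD_DAYS."""
    if days < 0:
        return "Expired"
    return "Expiring Soon" if days <= threshold else "OK"
```

The email alert branch would then fire only for domains whose status is not "OK", matching the "alerts only when expiring soon" behavior.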
by Tomohiro Goto
🧠 How it works

This workflow enables automatic translation in Slack using n8n and OpenAI. When a user types /trans followed by text, n8n detects the language and replies with the translated version via Slack.

⚙️ Features

- Detects the input language automatically
- Translates between Japanese ↔ English using GPT-4o-mini (temperature 0.2 for stability)
- Sends a quick “Translating...” acknowledgement to avoid Slack’s 3s timeout
- Posts the translated text back to Slack (public or private selectable)
- Supports overrides like en: こんにちは or ja: hello

💡 Perfect for

- Global teams communicating in Japanese and English
- Developers learning how to connect Slack + OpenAI + n8n

🧩 Notes

- Use the sticky notes inside the workflow for setup details.
- Duplicate and modify it to support mentions, group messages, or other language pairs.
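The en: / ja: override parsing can be sketched as a small function. The return convention (None meaning auto-detect the direction) is an assumption for illustration, not the workflow's internal representation:

```python
def parse_trans_command(text):
    """Split an optional 'en:' / 'ja:' override from the /trans payload.

    Returns (target_lang, remaining_text); target_lang is None when no
    override is present, signalling automatic language detection.
    """
    for prefix in ("en:", "ja:"):
        if text.lower().startswith(prefix):
            return prefix[:-1], text[len(prefix):].strip()
    return None, text.strip()
```

Adding a new language pair (the "other language pairs" note above) would mean extending the prefix tuple and the translation prompt together.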
by Parhum Khoshbakht
Who's it for

Product managers, customer success teams, and UX researchers who collect feedback in Google Sheets and want to automatically categorize and analyze it with sentiment and emotion insights. Ideal for teams processing dozens or hundreds of customer comments daily.

Read on Medium
Watch on YouTube

What it does

This workflow automatically tags and analyzes customer feedback stored in Google Sheets using OpenAI's GPT-4. It reads unprocessed feedback entries, sends them in batches to OpenAI for intelligent tagging and sentiment analysis, then writes the results back to your sheet—complete with up to 3 relevant tags per feedback, sentiment scores (Very Negative to Very Positive), and emotional analysis.

The workflow uses batch processing to optimize API costs: instead of sending 10 separate requests, it sends one request with all 10 feedbacks, reducing API calls by ~90%.

How it works

1. Fetches allowed tags from a Tags sheet and new feedback entries (where Status is empty) from a Feedbacks sheet
2. Merges tags with feedbacks and processes them in batches of 10
3. Sends each batch to OpenAI with a structured prompt requesting tags, sentiment, and emotional analysis
4. Writes results back to Google Sheets with tags, sentiment, emotions, and timestamps

Requirements

- Google Sheets account with OAuth2 credentials
- OpenAI API account (uses the GPT-4o-mini model)
- Google Sheet template with two sheets: "Tags" (single column) and "Feedbacks" (with columns: Feedbacks, Status, Tag 1-3, Sentiment, Primary/Secondary Emotion, AI Tag 1-2, Updated Date)

Setup instructions

1. Duplicate the provided Google Sheet template
2. Connect your credentials in n8n: Google Sheets OAuth2 API and OpenAI API
3. Update the Sheet URLs in these nodes:
   - "Fetch Allowed Tags" - point to your Tags sheet
   - "Fetch New Feedbacks" - point to your Feedbacks sheet
   - "Update Google Sheet (Tagged)" - point to your Feedbacks sheet
4. Test manually first using the Manual Trigger, then enable the Schedule Trigger for automatic processing every 60 minutes

How to customize

- **Adjust batch size**: Change the batch size in "Process Feedbacks in Batches" (default: 10) to process more or fewer feedbacks per API call
- **Modify tagging logic**: Edit the system prompt in "Tag Feedbacks with AI" to change how tags are selected or add custom sentiment categories
- **Change schedule**: Update the Schedule Trigger interval (default: 60 minutes) based on your feedback volume
- **Extend the workflow**: Add nodes to send results to Slack, Notion, or Airtable for real-time alerts on negative feedback
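The batching behind the ~90% cost saving is simple to sketch: chunk the rows, then render one prompt per chunk. The prompt wording below is a hypothetical stand-in for the workflow's real system prompt:

```python
def batch_feedbacks(feedbacks, batch_size=10):
    """Group feedback rows so one API request covers a whole batch.

    With the default of 10, a hundred rows become 10 calls instead of 100.
    """
    return [feedbacks[i:i + batch_size]
            for i in range(0, len(feedbacks), batch_size)]

def build_batch_prompt(batch, allowed_tags):
    """Render one numbered prompt per batch (illustrative wording only)."""
    lines = [f"{i + 1}. {fb}" for i, fb in enumerate(batch)]
    return ("Tag each feedback with up to 3 tags from: "
            + ", ".join(allowed_tags) + "\n" + "\n".join(lines))
```

Raising `batch_size` corresponds to the "Adjust batch size" customization; larger batches cost fewer calls but make the model's per-item answers harder to parse reliably.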
by EmailListVerify
Who is this template for?

This template is designed for link building. When you reach out to small blogs, it is common for the owner to have an address like DomainName@gmail.com. This workflow finds such emails for you.

What problem does this workflow solve?

Get from a list of domain names to a list of email addresses. This is perfect for preparing a cold outreach campaign for link building. The workflow allows you to find email addresses with any extension. I recommend searching for Gmail as a starting point, but you can also use the workflow to check for other email providers.

Pro-tip: Check the email provider used in the geography you target:
- laposte.net for France
- Seznam.cz for Czechia

What this workflow does

This workflow will:
1. Generate email candidates based on the domain name and root you are providing
2. Check whether those email addresses are valid using EmailListVerify

Requirements

This template uses:
- Google Sheets to handle input and output data
- EmailListVerify to discover emails (from $0.05 per email)

Setup (10 minutes)

1. Make a copy of the Google Sheet template
2. In the "[Input] pattern" sheet, write the email extensions you want to check. Gmail is a no-brainer. Depending on the location you target, you might want to include local email providers like laposte.net for France.
3. In "[Input] domain", put the domains for which you want to find email addresses.
4. Add your EmailListVerify API key to the settings in the 3rd step
5. Update the Google Sheets node to point to your copy of the template
6. Trigger the workflow
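The candidate-generation step (domain root + provider extension) can be sketched as follows. This is a simplified guess at the pattern: the real workflow takes patterns and roots from the Google Sheet, whereas here the first domain label is assumed to be the mailbox name:

```python
def email_candidates(domain, providers=("gmail.com",)):
    """Build DomainName@provider guesses for each configured provider.

    Assumes the first label of the site's domain is the mailbox name,
    e.g. examplesite.com -> examplesite@gmail.com.
    """
    root = domain.lower().removeprefix("www.").split(".")[0]
    return [f"{root}@{p}" for p in providers]
```

Each candidate would then be sent to EmailListVerify, and only addresses reported as valid are kept for the outreach list.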
by Robert Breen
This workflow automates the process of finding YouTube creator contact emails for outreach and partnerships. It combines Apify scrapers with OpenAI to deliver a clean list of emails from channel descriptions:

- **Step 1:** Search YouTube with Apify based on a keyword or topic
- **Step 2:** Scrape each channel for descriptions and metadata
- **Step 3:** Use OpenAI to extract and format valid email addresses into a structured JSON output

This is useful for influencer outreach, creator collaborations, UGC sourcing, or lead generation — all automated inside n8n.

⚙️ Setup Instructions

1️⃣ Set Up OpenAI Connection
1. Go to OpenAI Platform
2. Navigate to OpenAI Billing
3. Add funds to your billing account
4. Copy your API key into the OpenAI credentials in n8n

2️⃣ Set Up Apify Connection
1. Go to Apify Console and sign up/log in
2. Get your API token here: Apify API Keys
3. Set up the two scrapers in your Apify account: YouTube Scraper by streamers, YouTube Scraper by apidojo
4. In n8n, create an HTTP Query Auth credential (Query Key: token, Value: YOUR_APIFY_API_KEY)
5. Attach this credential to both HTTP Request nodes (Search YouTube and Scrape Channels)

📬 Contact Information

Need help customizing this workflow or building similar automations?

📧 robert@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
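Step 3's email extraction, done by OpenAI in the workflow, amounts to pulling address-shaped strings out of free text. A plain-regex approximation for comparison; the AI step additionally validates edge cases and formats the result as structured JSON:

```python
import re

# Simple pattern covering common address shapes; not a full RFC 5322 parser.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_emails(description):
    """Pull unique email addresses from a channel description,
    preserving first-seen order and deduplicating case-insensitively."""
    seen = []
    for match in EMAIL_RE.findall(description or ""):
        if match.lower() not in (s.lower() for s in seen):
            seen.append(match)
    return seen
```

Running this over the scraped descriptions first and sending only the matches to OpenAI is one way to trim token usage.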
by Miha
This n8n template auto-enriches brand-new HubSpot contacts with company details. Each day it finds contacts created in the last 24 hours (skipping free email domains), researches the company from the contact’s email domain, and writes back clean fields—no manual lookup needed.

Perfect for GTM teams that want better segmentation and faster personalization from day one.

How it works

1. A daily schedule trigger starts the workflow.
2. **HubSpot: Get recently created/updated contacts** pulls the newest records.
3. A filter keeps only contacts created within the last 24 hours whose email domain doesn’t contain gmail.com (adjust as needed).
4. An AI research agent (Gemini + SerpAPI) extracts the company domain from the contact’s email, searches the web, and returns structured JSON: company_name, industry, headquarters_city, headquarters_country, employee_count, website, linkedin, description
5. **HubSpot: Add company info** updates the contact with the enriched fields.

How to use

1. Connect HubSpot on both HubSpot nodes (OAuth2).
2. Connect SerpAPI (paste your API key).
3. Connect Google Gemini (Google AI Studio API key).
4. (Optional) Edit the agent prompt to fetch more/different fields.
5. (Optional) Tweak the filter to include/exclude other domains.
6. Activate the workflow to run daily.

Requirements

- **HubSpot** (OAuth2) for reading/updating contacts
- **SerpAPI** for web search results
- **Google Gemini** for company profiling and structured output

Notes & customization

- **Free domains:** Add more exclusions (e.g., yahoo.com, outlook.com) to reduce false positives.
- **Confidence gating:** Require website + LinkedIn before writing to HubSpot, or route low-confidence results for manual review.
- **Field mapping:** Extend the update step with additional properties (e.g., industry tags, HQ timezone).
- **Frequency:** Switch the trigger to hourly for faster enrichment on high-volume inbound.
- **Data hygiene:** Normalize employee count ranges and country names to your CRM picklists.
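The free-domain filter can be sketched like this. The exclusion set below extends the workflow's gmail.com check with the yahoo.com/outlook.com additions suggested under "Free domains"; the exact set is yours to tune:

```python
# Illustrative exclusion set; the workflow ships with only gmail.com filtered.
FREE_DOMAINS = {"gmail.com", "yahoo.com", "outlook.com", "hotmail.com"}

def company_domain(email, free_domains=FREE_DOMAINS):
    """Return the domain worth enriching, or None for personal mailboxes.

    Mirrors the filter step: contacts on free providers are skipped so the
    research agent only runs on plausible company domains.
    """
    domain = email.rsplit("@", 1)[-1].lower()
    return None if domain in free_domains else domain
```

Only contacts with a non-None result would proceed to the Gemini + SerpAPI research agent.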
by Ayis Saliaris Fasseas
How it works

1. A form trigger accepts an Industry + Location query (e.g. Accountants London).
2. Text Search Page 1 calls Google Places Text Search to return results and a next_page_token.
3. Conditional checks plus 5s wait nodes fetch page 2 and page 3 when a next page exists.
4. All pages are merged, split into individual place results, and each place_id is passed to Place Details.
5. Place Details returns name, formatted_phone_number, website, formatted_address.
6. Results are formatted and appended to a Google Sheet.

Setup steps

1. Enable Google Cloud project billing and the Places API (Text Search + Details).
2. Create an API key and add it to the three Text Search and Place Details HTTP request nodes (<YOUR KEY>).
3. Add Google Sheets OAuth credentials to the Google Sheets node.
4. Create a Google Sheet with columns: Company Name, Phone Number, Website, Address
5. Update the Google Sheets node documentId and sheetName to your spreadsheet.
6. Import/paste this workflow into n8n and test with a small query.

Customization

- Edit Place Details fields to retrieve more/less info (address_components, opening_hours, etc.).
- Adjust the number of pages fetched (the workflow currently supports up to 3 pages).
- Alter the output mapping in the code node to add/remove columns or change column order.
- Add further nodes after the sheet (e.g., email scraper/sender, CRM integration, enrichment API)

Use cases

- B2B lead generation for targeted industries and locations.
- Building outreach lists for sales teams and agencies.
- Enriching CRM data with phone, website, and address data from Google Places.
- Rapid market mapping for local competitors, vendors, or partners.

Troubleshooting tips

- No results / invalid API key: verify the API key is correct and not restricted incorrectly, and that the Places API is enabled.
- next_page_token not working: the token can take a couple of seconds to activate. Keep the 5s wait nodes; if it still fails, increase to ~10s+
- Quota / billing errors: confirm billing is enabled on the Google Cloud project and check your Places API quota.
- Missing phone/website: not all places provide all fields; fallback handling is already included (nulls).
- Duplicates in Sheets: run tests on small queries and inspect the Merge/Split logic; add a dedupe step before appending if needed.
- Place Details rate limits: if you plan high volume, throttle requests or add longer waits to avoid quota/rate-limit errors.
- Permissions / OAuth: ensure the Google Sheets OAuth user has edit access to the target spreadsheet.
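The three-page merge logic can be sketched as a loop over next_page_token. `fetch_page` is a stand-in for the workflow's HTTP Request nodes; the ~5s token activation delay is noted in a comment but not simulated here:

```python
def collect_places(fetch_page, max_pages=3):
    """Walk Google Places Text Search pagination, merging all pages.

    fetch_page(token) must return a response dict with 'results' and,
    when more pages exist, a 'next_page_token'. In production, wait ~5s
    before requesting with a fresh token, since tokens activate late —
    this is exactly what the workflow's 5s wait nodes handle.
    """
    all_results, token = [], None
    for _ in range(max_pages):
        page = fetch_page(token)
        all_results.extend(page.get("results", []))
        token = page.get("next_page_token")
        if not token:
            break  # no further pages
    return all_results
```

Raising `max_pages` corresponds to the "adjust the number of pages fetched" customization above; each extra page also needs its own wait before the token is usable.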