by Robert Breen
# n8n Workflow: OpenAI DALL·E 2 Image Generation & Google Drive Upload

## Description

This n8n workflow automates the process of generating multiple AI-created images from a single prompt using OpenAI's DALL·E 2, then uploads the results directly to a Google Drive folder. It includes a loop to produce several image variations for the same prompt, making it ideal for creative projects, marketing materials, or content experimentation.

## Step-by-Step Setup Instructions

### 1. Prepare Your API Keys

**OpenAI API Key**
- Sign up or log in at https://platform.openai.com/
- Go to API Keys and create a new one.
- Copy and store this securely — you'll need it in n8n.

**Google Drive API**
- Go to https://console.cloud.google.com/
- Create a project and enable the Google Drive API.
- Create OAuth 2.0 credentials and set the redirect URI to your n8n OAuth redirect (found in your n8n Google Drive node setup).
- Connect your Google account when adding credentials in n8n.

### 2. Workflow Nodes Overview

1. **Manual Trigger** – Starts the workflow manually.
2. **Set Image Prompt** – Stores the prompt text and base file name (e.g., "Make an image of an attractive woman standing in New York City").
3. **Duplicate Rows (Code Node)** – Creates multiple "runs" of the same prompt for variation.
4. **Loop Over Items** – Processes each variation one at a time.
5. **Generate an image (OpenAI DALL·E 2)** – Sends the prompt to OpenAI and retrieves an image.
6. **Upload to Google Drive** – Saves each generated image to your chosen Google Drive folder.

### 3. Building the Workflow in n8n

**Step 1 — Manual Trigger**

Add a Manual Trigger node to start the workflow manually when testing.

**Step 2 — Set Image Prompt**

Add a Set node with two fields:
- **Prompt** → The image description text.
- **Name** → The base name for the saved file.

Example:

| Name   | Value |
|--------|-------|
| Prompt | Make an image of an attractive woman standing in New York City |
| Name   | woman-nyc |

**Step 3 — Duplicate Rows (Code Node)**

Use this JavaScript to create three copies of the prompt (run 1, run 2, run 3):

```javascript
const original = items[0].json;
return [
  { json: { ...original, run: 1 } },
  { json: { ...original, run: 2 } },
  { json: { ...original, run: 3 } },
];
```

**Step 4 — Loop Over Items**

Insert a Split in Batches node and set the batch size to 1. This ensures each prompt variation runs through the image generation process individually. Connect this node so it runs after the Duplicate Rows node.

**Step 5 — Generate Image**

Add the OpenAI Image Generation node and configure it as follows:
- **Model**: dall-e-2
- **Prompt**: `={{ $json.Prompt }}`
- Leave other options at their defaults unless you want to specify image size or style.
- Connect your OpenAI API credentials created in Step 1.

This node will send the current prompt in the batch to OpenAI's DALL·E 2 model and return an AI-generated image.

**Step 6 — Upload to Google Drive**

Add a Google Drive node and configure it to store the generated image:
- **File Name**: `={{ $('Set Image Prompt').item.json.Name }} - {{ $('Duplicate Rows').item.json.run }}`
- **Folder ID**: Select the target Google Drive folder where images should be saved.
- Connect your Google Drive OAuth2 API credentials.

The node will upload each generated image to your chosen Google Drive location, with a unique filename for each variation.

## Running the Workflow

Execute the workflow manually. The process will:
1. Loop through each prompt variation.
2. Generate an image using OpenAI DALL·E 2.
3. Upload the image to Google Drive with a unique name.

You will find all generated images in the selected Google Drive folder.
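The three-copy Code node above generalizes easily if you want a different number of variations. A minimal sketch with the run count pulled into one constant (`RUNS` is an illustrative name, not part of the original workflow):

```javascript
// n8n Code node (Run Once for All Items): emit RUNS copies of the
// incoming item, each tagged with a distinct run number for unique filenames.
const RUNS = 5; // adjust to change the number of image variations

const original = items[0].json;
return Array.from({ length: RUNS }, (_, i) => ({
  json: { ...original, run: i + 1 },
}));
```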
## Customization Tips

- Change the number of variations by editing the Duplicate Rows code (see the sketch above).
- Adjust the prompt dynamically from other data sources like Google Sheets, webhooks, or forms.
- Schedule the workflow to run at specific times or trigger it via an API call.

Created by Robert A. – Ynteractive
Website: https://ynteractive.com
Email: robert@ynteractive.com
by bangank36
## Overview

This workflow retrieves all blog and event collection items from a Squarespace site and saves them into a Google Sheets spreadsheet. It uses pagination to fetch 20 items per request, ensuring all content is collected efficiently.

## How It Works

1. The workflow queries your Squarespace blog and event collections.
2. It fetches data in paginated batches (20 items per page); see the pagination sketch at the end of this listing.
3. The retrieved data is formatted and inserted into Google Sheets.
4. The workflow runs on demand or on a schedule, ensuring your data stays up to date.

## Requirements

### Credentials

To use this template, you need:
- Your Squarespace collection URL
- Google Sheets API credentials

### Google Sheets Setup

Use this sample Google Sheets template to get started quickly.

## Who Is This For?

This template is designed for:
- Bloggers looking to manage and analyze content externally.
- Businesses and marketers tracking content performance.
- Anyone who needs an automated way to extract Squarespace blog and event data.

## Explore More Templates

Check out my other n8n templates: 👉 n8n.io/creators/bangank36
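For reference, here is a minimal sketch of the pagination loop as an n8n Code node. It assumes the public Squarespace JSON endpoint (appending `?format=json` to a collection URL) and its `pagination.nextPageOffset` field; verify both against your own site before relying on them:

```javascript
// n8n Code node: fetch all items from a Squarespace collection,
// following offset-based pagination (20 items per page by default).
const baseUrl = 'https://example.squarespace.com/blog'; // your collection URL
let offset = null;
const allItems = [];

do {
  const url = `${baseUrl}?format=json${offset ? `&offset=${offset}` : ''}`;
  const res = await this.helpers.httpRequest({ url, json: true });
  allItems.push(...(res.items ?? []));
  // nextPageOffset is only meaningful while more pages remain
  offset = res.pagination?.nextPage ? res.pagination.nextPageOffset : null;
} while (offset);

return allItems.map((item) => ({ json: item }));
```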
by Angel Menendez
## Who is this for?

This subworkflow is ideal for developers and automation builders working with UniPile and n8n to automate message enrichment and LinkedIn lead routing.

## What problem is this workflow solving?

UniPile separates personal and organization accounts into two different API endpoints. This flow handles both intelligently so you're not missing sender context due to API quirks or bad assumptions.

## What this workflow does

This subworkflow is used by:
- **LinkedIn Auto Message Router with Request Detection**
- **LinkedIn AI Response Generator with Slack Approval**

It receives a message sender ID and tries to enrich it using UniPile's /people and /organizations endpoints. It returns a clean, consistent profile object regardless of which source was used.

## Setup

1. Generate a UniPile API token and save it in your n8n credentials
2. Make sure this subworkflow is triggered correctly by your parent flows
3. Test both people and organization lookups to verify responses are normalized

## How to customize this workflow to your needs

- Add a secondary enrichment layer using tools like Clearbit or FullContact
- Customize the fallback logic or error handling
- Expand the returned data for more AI context or user routing (e.g., job title, region)
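To illustrate the fallback idea, here is a minimal Code node sketch of a try-people-then-organizations lookup with a normalized return shape. Everything beyond the two endpoint names mentioned above is an assumption: the base URL, auth header, and response fields are illustrative placeholders, not UniPile's documented API.

```javascript
// n8n Code node: try the people endpoint first, fall back to
// organizations, then normalize both into one profile shape.
const senderId = $json.sender_id; // assumed input field name
const base = 'https://api.unipile.example/v1'; // placeholder base URL
const headers = { 'X-API-KEY': 'YOUR_UNIPILE_TOKEN' }; // use n8n credentials in practice

let raw;
let source = 'people';
try {
  raw = await this.helpers.httpRequest({ url: `${base}/people/${senderId}`, headers, json: true });
} catch (err) {
  source = 'organizations';
  raw = await this.helpers.httpRequest({ url: `${base}/organizations/${senderId}`, headers, json: true });
}

// One consistent object regardless of which endpoint answered
return [{ json: { id: senderId, source, name: raw.name ?? null, headline: raw.headline ?? raw.description ?? null } }];
```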
by Anurag
## Description

This workflow automates the extraction of structured data from invoices or similar documents using Docsumo's API. Users can upload a PDF via an n8n form trigger, which is then sent to Docsumo for processing and structured parsing. The workflow fetches key document metadata and all line items, reconstructs each invoice row with combined header and item details, and finally exports all results as an Excel file. Ideal for automating invoice data entry, reporting, or integrating with accounting systems.

## How It Works

1. A user uploads a PDF document using the integrated n8n form trigger.
2. The workflow securely sends the document to Docsumo via REST API.
3. After uploading, it checks and retrieves the parsed document results.
4. Header information and table line items are extracted and mapped into structured records (see the sketch at the end of this listing).
5. The complete result is exported as an Excel (.xls) file.

## Setup Steps

1. **Docsumo Account**: Register and obtain your API key from Docsumo.
2. **n8n Credentials Manager**: Add your Docsumo API key as an HTTP header credential (never hardcode the key in the workflow).
3. **Workflow Configuration**:
   - In the HTTP Request nodes, set the authentication to your saved Docsumo credentials.
   - Update the file type or document type in the request (e.g., "type": "invoice") as needed for your use case.
4. **Testing**: Enable the workflow and use the built-in form to upload a sample invoice for extraction.

## Features

- Supports PDF uploads via n8n's built-in form or via API/webhook extension.
- Sends files directly to Docsumo for document data extraction using secure credentials.
- Extracts invoice-level metadata (number, date, vendor, totals) and full line item tables.
- Consolidates all data in easy-to-use Excel format for download or integration.
- Modular node structure, easily extensible for further automation.

## Prerequisites

- Docsumo account with API access enabled.
- n8n instance with Form, HTTP Request, Code, and Excel/Convert to File nodes.
- Working Docsumo API key stored securely in n8n's credential manager.

## Example Use Cases

| Scenario            | Benefit                                  |
|---------------------|------------------------------------------|
| Invoice Automation  | Extract line items and metadata rapidly  |
| Receipts Processing | Parse and digitize business receipts     |
| Bulk Bill Imports   | Batch process bills for analytics        |

## Notes

- **Credentials Security**: Do not store your API key directly in HTTP Request nodes; always use the n8n credentials manager.
- **Sticky Notes**: The workflow includes sticky notes for setup, input, API call, extraction, and output steps to assist template users.
- **Custom Columns**: You can customize header or line item extraction by editing the Code node as needed.
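As an illustration of the row-reconstruction step, here is a minimal Code node sketch that merges invoice-level fields with each line item so every output row is self-describing. The `data` shape and field names shown are assumptions for illustration, not Docsumo's documented response format; adapt them to the actual parsed result you receive.

```javascript
// n8n Code node: combine invoice header fields with each line item
// (one output row per line item, ready for Convert to File).
const doc = $json.data ?? {}; // assumed shape of the parsed Docsumo result

const header = {
  invoiceNumber: doc.invoice_number?.value ?? null,
  invoiceDate: doc.invoice_date?.value ?? null,
  vendor: doc.vendor_name?.value ?? null,
  total: doc.total_amount?.value ?? null,
};

const lineItems = doc.line_items ?? [];
return lineItems.map((item) => ({
  json: { ...header, description: item.description, quantity: item.quantity, amount: item.amount },
}));
```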
by Mike Russell
Boost engagement on your Discord server by automatically sharing new YouTube videos along with AI-generated summaries of their content. This workflow is ideal for content creators and community managers looking to provide value and spark interest through summarized content, making it easier for community members to decide if a video is of interest to them.

Watch this video tutorial to learn more about the template.

## How it works

1. **RSS Feed Trigger**: Monitors your YouTube channel for new uploads using the RSS feed.
2. **Video Captions Retrieval**: Fetches video captions using the YouTube API to get detailed content data.
3. **AI Summary Generation**: Uses an AI model to generate concise summaries from the video captions, highlighting key points.
4. **Discord Notification**: Posts video announcements along with their AI-generated summaries to a specified Discord channel using a webhook.

## Set up steps

1. **Configure YouTube RSS Feed**: Set up the RSS Feed node to detect new video uploads. Add your YouTube channel ID to the URL in the first node: https://www.youtube.com/feeds/videos.xml?channel_id=YOUR_CHANNEL_ID.
2. **Connect OpenAI Account**: To enable AI summary generation, connect your OpenAI account in n8n.
3. **Set Up Discord Webhook**: Create a webhook in your Discord server and configure it in the Discord node.
4. **Design the Message**: Format the Discord message as you like to include the video title, link, and the AI-generated summary (a payload sketch follows below).

## Example

This template empowers you to maintain a highly engaging Discord community, ensuring members receive not only regular updates but also valuable insights into each video's content without needing to watch immediately.
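If you prefer to post via a plain HTTP Request node rather than the Discord node, a minimal sketch of the webhook payload looks like this. Discord webhooks accept a JSON body with `content` and/or `embeds`; the `title`, `link`, and `summary` input fields are assumptions about your upstream data, so rename them to match your workflow.

```javascript
// n8n Code node: build a Discord webhook payload announcing the video.
const { title, link, summary } = $json; // assumed fields from earlier nodes

return [{
  json: {
    content: `📺 New video: **${title}**\n${link}`,
    embeds: [{ title: 'AI Summary', description: summary }],
  },
}];
```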
by Intuz
This n8n template from Intuz provides a complete and automated solution for real-time financial reporting. It instantly syncs new QuickBooks invoices to Google Sheets, using specific invoice data or keywords as triggers to ensure your financial records are always accurate and up-to-date. It uses a webhook to capture every new or updated invoice and logs the essential details into a designated Google Sheet. Perfect for creating custom reports, data backups, or a real-time dashboard of your accounts receivable.

## Use Cases

- **Financial Reporting**: Create a simple, shareable Google Sheet for team members who don't have QuickBooks access.
- **Data Backup**: Maintain a secure, independent log of all your invoices outside of the QuickBooks ecosystem.
- **Custom Dashboards**: Use the Google Sheet as a data source for tools like Google Data Studio or Grafana to build custom financial dashboards.
- **Auditing**: Easily track the history and status of all invoices in a simple, searchable spreadsheet format.

## How it Works

1. **Instant Webhook Trigger**: The workflow activates the moment an invoice is created or updated in QuickBooks. The QuickBooks webhook sends a notification to n8n, kicking off the process in real time.
2. **Fetch Full Invoice Details**: The initial webhook notification only contains the invoice ID. This node uses that ID to make a call back to the QuickBooks API and retrieve the complete invoice data, including customer name, due date, and more.
3. **Format Key Data**: A simple Code node cleans up the data fetched from QuickBooks. It extracts only the fields you need (ID, Domain, Customer Name, and Due Date) and structures them perfectly for the next step. A sketch of this node appears at the end of this listing.
4. **Append or Update in Google Sheets**: The final node connects to your Google Sheet and uses the powerful "Append or Update" operation. If the ID of the invoice doesn't exist in the sheet, it adds a new row. If the ID already exists, it updates the existing row with the latest information. This ensures your Google Sheet is always a perfect mirror of your QuickBooks invoice data, preventing duplicates and keeping everything current.

## Setup Instructions

For this workflow to run successfully, follow these setup steps:

1. **Credentials**:
   - QuickBooks: Connect your QuickBooks account credentials to n8n.
   - Google: Connect your Google account using OAuth2 credentials. Ensure the Google Sheets and Google Drive APIs are enabled.
2. **QuickBooks Webhook Configuration**:
   - Activate the workflow.
   - Copy the Production URL from the Webhook node.
   - In your Intuit Developer Portal, go to the webhooks section for your app.
   - Paste the URL and subscribe to Invoice events (e.g., Create, Update).
3. **Google Sheet Setup**:
   - Create a Google Sheet for your invoice data.
   - Crucially, create the following headers in the first row of your sheet: ID, Domain, Customer Name, Due Date.
4. **Node Configuration**: In the "Append or update row in sheet" node, select your Google Sheet document and the specific sheet name from the dropdown lists. The columns should map automatically if you've set up the headers correctly.

## Connect with us

- Website: https://www.intuz.com/cloud/stack/n8n
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz
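As referenced in step 3 above, here is a minimal sketch of the Format Key Data node. It assumes the standard QuickBooks Online invoice-read response shape (an `Invoice` object carrying `Id`, `domain`, `CustomerRef.name`, and `DueDate`); double-check the field names against your actual API response before use.

```javascript
// n8n Code node: pull the four sheet columns out of the
// full QuickBooks invoice object fetched in the previous step.
const invoice = $json.Invoice ?? $json; // handle wrapped or unwrapped responses

return [{
  json: {
    'ID': invoice.Id,
    'Domain': invoice.domain,
    'Customer Name': invoice.CustomerRef?.name,
    'Due Date': invoice.DueDate,
  },
}];
```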
by Airtop
# Automating LinkedIn Profile Discovery with Verification

## Use Case

Accurately identifying and verifying a person's LinkedIn profile is essential for prospecting, recruiting, or contact enrichment. This automation ensures high accuracy by combining search logic with optional profile validation.

## What This Automation Does

This automation locates and verifies a LinkedIn profile using the following inputs:
- **Person_info**: Any identifying information about the person (e.g., name, company, email).
- **Airtop_profile**: Your Airtop Profile authenticated on LinkedIn, used for verifying the profile.

## How It Works

1. **Extracts a likely LinkedIn URL** by performing a Google search using the provided person info.
2. **Validates the result** (if an Airtop Profile is provided):
   - Visits the LinkedIn profile.
   - Verifies the match by checking the content (e.g., experience, role) against the person info.
3. **Returns** a verified LinkedIn profile URL, or "NA" if not found or not valid.

## Setup Requirements

- Airtop API Key
- Optional but recommended: an Airtop Profile authenticated on LinkedIn.

## Next Steps

- **Combine with Email Lookup**: Use email-to-profile tools upstream to gather inputs.
- **CRM Integration**: Automatically append LinkedIn profiles to contact records.
- **Automate Outreach**: Use the verified URLs for personalized LinkedIn engagement workflows.

Read more about how to find and verify LinkedIn profiles.
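To illustrate the URL-extraction step, here is a minimal Code node sketch that pulls the first LinkedIn profile URL out of raw search-result text and falls back to "NA", mirroring the automation's return convention. The `results` input field is an assumption about the upstream search node.

```javascript
// n8n Code node: extract the first linkedin.com/in/... URL from
// search-result text; return "NA" when nothing matches.
const text = $json.results ?? ''; // assumed field holding search output

const match = text.match(/https?:\/\/(?:[a-z]{2,3}\.)?linkedin\.com\/in\/[A-Za-z0-9\-_%]+/);
return [{ json: { linkedin_url: match ? match[0] : 'NA' } }];
```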
by Anton Vanhoucke
This workflow converts Notion pages to markdown, and then converts that markdown back to Notion blocks. It will triple the content of the last-updated page it finds. This is useless by itself, but you can copy-paste from this workflow to create your own.

## Prerequisites

A Notion account with some pages or databases.

## Setup instructions

Create a Notion credential and share some pages as described here: https://docs.n8n.io/integrations/builtin/credentials/notion/

## How it works

1. The HTTP Request node gets Notion child blocks from a page, because the default n8n Notion node only gets plain text and no links.
2. The first Code node converts the blocks to markdown.
3. The second Code node converts the markdown back to Notion blocks.
4. The last HTTP Request node appends everything to the original Notion page, essentially duplicating it for the purpose of demoing the script.

I hope in the future we get official n8n nodes that extract markdown, or use markdown to write to Notion. There is a community node that also does this, but this template is easier: you can simply copy-paste the nodes from this workflow.
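For reference, the child-blocks call made by the first HTTP Request node corresponds to Notion's public `blocks/{id}/children` endpoint. A minimal Code node sketch of the same fetch, with cursor pagination (the page ID and token are placeholders; use your n8n Notion credential in practice):

```javascript
// n8n Code node: fetch all child blocks of a Notion page,
// following cursor-based pagination until has_more is false.
const pageId = 'YOUR_PAGE_ID'; // placeholder
const headers = {
  Authorization: 'Bearer YOUR_NOTION_TOKEN', // use n8n credentials in practice
  'Notion-Version': '2022-06-28',
};

let cursor;
const blocks = [];
do {
  const url = `https://api.notion.com/v1/blocks/${pageId}/children?page_size=100` +
    (cursor ? `&start_cursor=${cursor}` : '');
  const res = await this.helpers.httpRequest({ url, headers, json: true });
  blocks.push(...res.results);
  cursor = res.has_more ? res.next_cursor : undefined;
} while (cursor);

return blocks.map((b) => ({ json: b }));
```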
by Pat
## Who is this for?

This workflow template is perfect for content creators, researchers, students, or anyone who regularly works with audio files and needs to transcribe and summarize them for easy reference and organization.

## What problem does this workflow solve?

Transcribing audio files and summarizing their content can be time-consuming and tedious when done manually. This workflow automates the process, saving users valuable time and effort while ensuring accurate transcriptions and concise summaries.

## What this workflow does

This template automates the following steps:
1. Monitors a specified Google Drive folder for new audio files
2. Sends the audio file to OpenAI's Whisper API for transcription
3. Passes the transcribed text to GPT-4 for summarization
4. Creates a new page in Notion with the summary

## Setup

To set up this workflow:
1. Connect your Google Drive, OpenAI, and Notion accounts to n8n
2. Configure the Google Drive node with the folder you want to monitor for new audio files
3. Set up the OpenAI node with your API key and desired parameters for Whisper and GPT-4
4. Specify the Notion database where you want the summaries to be stored

## How to customize this workflow

- Adjust the Google Drive folder being monitored
- Modify the OpenAI node parameters to fine-tune the transcription and summarization process
- Change the Notion database or page properties to match your preferred structure

With this AI-powered workflow, you can effortlessly transcribe audio files, generate concise summaries, and store them in a structured manner within Notion. Streamline your audio content processing and organization with this automated template.
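Under the hood, the Whisper step boils down to one multipart call to OpenAI's transcription endpoint (`/v1/audio/transcriptions` with model `whisper-1`). A minimal standalone Node.js sketch for testing outside n8n; the file name and environment variable are placeholders, and inside the workflow the OpenAI node handles this call for you:

```javascript
// Standalone Node.js (18+) sketch: send an audio file to OpenAI's
// transcription endpoint and print the resulting text.
import fs from 'node:fs';

const form = new FormData();
form.append('model', 'whisper-1');
form.append('file', new Blob([fs.readFileSync('meeting.mp3')]), 'meeting.mp3');

const res = await fetch('https://api.openai.com/v1/audio/transcriptions', {
  method: 'POST',
  headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
  body: form,
});
const { text } = await res.json();
console.log(text);
```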
by Airtop
# Scoring LinkedIn Profiles Against Your ICP

## Use Case

This automation scores individual LinkedIn profiles against your Ideal Customer Profile (ICP) based on interest in AI, technical depth, and seniority level. It's ideal for prioritizing leads and understanding how well a person fits your ICP criteria.

## What This Automation Does

Given a LinkedIn profile and an Airtop profile, it:
- Extracts relevant data from the person's profile
- Determines levels of AI interest, seniority, and technical depth
- Calculates an ICP score based on weighted criteria
- Returns the full enriched profile with the score

Input parameters:
- **LinkedIn Profile URL** (e.g., https://linkedin.com/in/janedoe)
- **Airtop Profile** connected to LinkedIn
- **ICP scoring method** in the Airtop node prompt

Output fields in JSON format:
- Full name, job title, employer, company LinkedIn URL, location, number of connections and followers, about section content, and more
- Calculated ICP Score (out of 95)

## How It Works

1. **Form Trigger or Workflow Trigger**: Accepts input from either a form or another workflow.
2. **Parameter Assignment**: Ensures proper variable names for downstream nodes.
3. **Airtop Enrichment Tool**: Extracts and scores the person based on a detailed prompt.
4. **Scoring**: Uses this point system (see the sketch at the end of this listing):
   - AI Interest: beginner (5), intermediate (10), advanced (25), expert (35)
   - Technical Depth: basic (5), intermediate (15), advanced (25), expert (35)
   - Seniority Level: junior (5), mid-level (15), senior (25), executive (30)
5. **Output Formatting**: Cleans and returns the result as JSON.

## Setup Requirements

- IMPORTANT: Enter your ICP scoring method in the prompt field of the Airtop node
- Airtop Profile connected to LinkedIn
- Airtop API credentials configured in n8n
- Optional: a front-end form to collect profile URLs and trigger the automation

## Next Steps

- **Embed in CRM**: Trigger this automation on new leads to auto-score them.
- **Batch Process Leads**: Run it over a list of profile URLs for segmentation.
- **Customize Scoring**: Adjust point weights based on your sales priorities.

Read more about ICP Scoring with Airtop and n8n.
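The point system above is easy to mirror in a Code node if you prefer deterministic scoring over prompt-embedded math. A minimal sketch, assuming the enrichment step returns the three classified levels as lowercase strings (the field names `ai_interest`, `technical_depth`, and `seniority` are illustrative):

```javascript
// n8n Code node: map the three classified levels to points and sum them.
const POINTS = {
  ai_interest:     { beginner: 5, intermediate: 10, advanced: 25, expert: 35 },
  technical_depth: { basic: 5, intermediate: 15, advanced: 25, expert: 35 },
  seniority:       { junior: 5, 'mid-level': 15, senior: 25, executive: 30 },
};

const p = $json; // assumed fields: ai_interest, technical_depth, seniority
const icpScore =
  (POINTS.ai_interest[p.ai_interest] ?? 0) +
  (POINTS.technical_depth[p.technical_depth] ?? 0) +
  (POINTS.seniority[p.seniority] ?? 0);

return [{ json: { ...p, icpScore } }];
```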
by Shahrear
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Automatically transform audio files into professional transcription reports with AI-powered speech recognition, timestamp generation, and formatted Google Docs output.

## What this workflow does

1. Monitors Gmail for incoming audio attachments
2. Downloads and processes audio files using VLM Run AI transcription
3. Generates accurate transcriptions with precise timestamps and segmentation
4. Creates professional reports in Google Docs with formatted output
5. Handles asynchronous processing for long audio files without timeouts

## Setup

**Prerequisites**: Gmail account, VLM Run API credentials, Google Docs access, self-hosted n8n. You need to install the VLM Run community node.

**Quick Setup**:
1. Configure Gmail OAuth2 for email monitoring
2. Add VLM Run API credentials for audio transcription
3. Set up Google Docs OAuth2 for report generation
4. Create target Google Doc for transcription reports
5. Update document URL in workflow nodes
6. Test with sample audio file and activate

## Perfect for

- Meeting recordings and conference calls
- Voice memos and dictation workflows
- Interview transcriptions and journalism
- Podcast episode documentation
- Accessibility compliance and documentation
- Legal proceedings and court recordings
- Educational content and lecture notes
- Customer service call analysis

## Key Benefits

- **Human-level accuracy**: Advanced AI speech recognition with automatic punctuation
- **Timestamp precision**: Segmented transcriptions with exact time markers
- **Multi-format support**: Handles MP3, WAV, M4A, AAC, OGG, FLAC files
- **Asynchronous processing**: No timeouts for long audio files
- **Professional formatting**: Beautifully structured Google Docs reports
- **Automatic workflow**: Zero manual intervention required
- **Saves hours per recording**: Transforms manual transcription into instant results
- **Searchable documentation**: Google Docs integration enables easy content discovery

## How to customize

Extend by adding:
- Speaker identification and diarization
- Integration with project management tools (Notion, Asana, Trello)
- Automatic summary generation from transcripts
- Translation to multiple languages
- Slack notifications for completed transcriptions
- Integration with CRM systems for call logging
- Audio quality enhancement preprocessing
- Custom formatting templates for different use cases
- Automatic keyword extraction and tagging
- Integration with calendar systems for meeting context

This workflow revolutionizes audio documentation by combining cutting-edge AI transcription with professional report generation, making spoken content instantly accessible, searchable, and shareable across your organization.
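The asynchronous-processing behavior described above follows a generic submit-then-poll pattern. A minimal Code node sketch of that pattern under a hypothetical `/jobs/{id}` status endpoint; the VLM Run node handles this internally, and nothing here reflects VLM Run's actual API.

```javascript
// n8n Code node: generic submit-then-poll pattern for long-running jobs.
// All endpoint paths and field names here are hypothetical placeholders.
const jobId = $json.jobId; // assumed to come from the submission step
const statusUrl = `https://api.example.com/jobs/${jobId}`; // placeholder

const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

let job;
for (let attempt = 0; attempt < 60; attempt++) {
  job = await this.helpers.httpRequest({ url: statusUrl, json: true });
  if (job.status === 'completed' || job.status === 'failed') break;
  await sleep(10_000); // wait 10 s between polls
}

return [{ json: job }];
```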
by Akram Kadri
## Who is this for?

This workflow is designed for YouTubers who want to update their video descriptions in bulk without manually editing each one. It's especially useful for creators who include a standard set of links in their descriptions and need to insert a new link between existing ones across multiple videos.

## What problem does this workflow solve?

Manually updating video descriptions for multiple videos can be tedious and time-consuming. If you have a section in your video descriptions that contains important links, adding a new one in a specific position (e.g., between two existing links) can be a challenge. This workflow automates that process, allowing you to insert a specific string between two predefined rows in all of your video descriptions at once.

## What this workflow does

1. Fetches all videos from your YouTube channel.
2. Iterates through each video to retrieve its existing description.
3. Identifies two predefined rows in the description.
4. Inserts a new row between the two specified rows (see the sketch at the end of this listing).
5. Updates the video description with the modified text.

## Setup

1. Connect your YouTube account to n8n and grant necessary permissions.
2. Define your variables in the "Set String to Insert" node:
   - rowBefore: The existing row after which the new row will be inserted.
   - rowToInsert: The new text or link to insert.
   - rowAfter: The existing row before which the new row will be inserted.
3. Run the workflow using the manual trigger.
4. Review the updated descriptions to ensure accuracy.

## How to customize this workflow to your needs

- **Change the insertion criteria** by modifying the rowBefore and rowAfter values.
- **Insert multiple rows** by adjusting the JavaScript code in the Code node.
- **Extend the workflow** by adding conditions (e.g., only updating descriptions of videos with certain tags).
- **Filter specific videos** instead of updating all by modifying the "Get All Videos" node.

This workflow ensures that all your YouTube descriptions stay updated and consistent with minimal effort.
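For reference, the core insertion logic amounts to a few lines like the following Code node sketch. The exact code in the template may differ; the three variable names match the Setup section above, while the `description` input field is an assumption about the fetch step.

```javascript
// n8n Code node: insert rowToInsert between rowBefore and rowAfter
// in the video description, leaving all other text untouched.
const { rowBefore, rowToInsert, rowAfter } = $('Set String to Insert').item.json;
const description = $json.description ?? ''; // assumed field from the fetch step

const target = `${rowBefore}\n${rowAfter}`;
const updated = description.includes(target)
  ? description.replace(target, `${rowBefore}\n${rowToInsert}\n${rowAfter}`)
  : description; // leave unchanged if the anchor rows aren't adjacent

return [{ json: { ...$json, description: updated } }];
```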