by Giacomo Lanzi
Extract the title tag and meta description from URLs for SEO analysis.

**How it works**

The workflow takes records from Airtable, gets the URL from each record, and extracts the title tag (`<title>`) and meta description (`<meta name="description" content="Some content">`) from the related web page. If the title tag and/or meta description isn't available on the web page, the result is left empty.

**Setup**

Set up a Base in Airtable with a table using the following structure: url (field type url), title tag (field type text string), meta desc (field type text field).

A minimum suggested table row is: url (https://example.com), title tag (Title example), meta desc (This is the meta description of the example page).

Connect Airtable to both Airtable nodes in the template and use the following formula to get all the records that are missing a title tag and meta desc:

Formula: `AND(url != "", {title tag} = "", {meta desc} = "")`

Insert the URL to be analyzed in the url field of the table and let the workflow do the rest (the extraction step itself is sketched below).

**Extra**

You can also calculate the length of the title tag and meta desc using a formula field inside Airtable: `LEN({title tag})` or `LEN({meta desc})`.

You can automate the process by calling a Webhook from Airtable. For this, you need an Airtable paid plan.
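For illustration, here is a minimal sketch of the extraction step as an n8n Code node. It is not part of the template itself, and the `html` input field name is an assumption about what your HTTP Request node returns; adjust it to your setup.

```javascript
// Minimal sketch of the title/meta-description extraction in an n8n Code node.
// Assumes the previous HTTP Request node returned the page body in a field named `html`.
const items = $input.all();

return items.map(item => {
  const html = item.json.html || '';

  // <title>...</title>
  const titleMatch = html.match(/<title[^>]*>([\s\S]*?)<\/title>/i);
  // <meta name="description" content="...">
  const descMatch = html.match(
    /<meta[^>]+name=["']description["'][^>]*content=["']([^"']*)["']/i
  );

  return {
    json: {
      url: item.json.url,
      'title tag': titleMatch ? titleMatch[1].trim() : '',
      'meta desc': descMatch ? descMatch[1].trim() : '',
    },
  };
});
```

If a tag is missing, the corresponding field stays empty, which is exactly what the Airtable filter formula above relies on.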
by Agent Studio
**Overview**

This workflow answers user requests sent via Mac Shortcuts. Several Shortcuts call the same webhook, each with a query and a type of query. The types of query are:

- translate to English
- translate to Spanish
- correct grammar (without changing the actual content)
- make the content shorter
- make the content longer

**How it works**

1. Select a text you are writing
2. Launch the Shortcut
3. The text is sent to the webhook
4. Depending on the type of request, a different prompt is used (see the sketch after this section)
5. Each request is sent to an OpenAI node
6. The workflow responds to the request with the response from GPT
7. The Shortcut replaces the selected text with the new one

For a demo and setup instructions:

**How to use it**

1. Activate the workflow
2. Download this Shortcut template
3. Install the Shortcut
4. In step 2 of the Shortcut, change the URL to your Webhook's URL
5. In the Shortcut details, choose "Add Keyboard Shortcut" and pick the key you want to use to launch the Shortcut
6. Go to Settings, then Advanced, and check "Allow Running Scripts"

You are ready to use the Shortcut: select a text and hit the keyboard shortcut you just defined.
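The prompt-selection step could look roughly like the Code node below. This is only a sketch: the `type` and `query` field names, and the type identifiers, are assumptions about what your Shortcut sends in the webhook body.

```javascript
// Sketch: pick a prompt based on the request type sent by the Mac Shortcut.
// The `type`/`query` field names and type ids are assumptions about the webhook payload.
const payload = $input.first().json.body || $input.first().json;
const { type, query } = payload;

const prompts = {
  translate_en: 'Translate the following text to English:',
  translate_es: 'Translate the following text to Spanish:',
  grammar:      'Correct the grammar of the following text without changing its meaning:',
  shorter:      'Make the following text shorter:',
  longer:       'Make the following text longer:',
};

return [{
  json: {
    prompt: `${prompts[type] ?? prompts.grammar}\n\n${query}`,
  },
}];
```

The OpenAI node then simply consumes the `prompt` field, and the final node returns GPT's answer to the webhook so the Shortcut can paste it back.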
by Ramsey Njire
**Who Is This For?**

This workflow is perfect for content creators, marketers, and business professionals who receive regular newsletters and want to effortlessly convert them into engaging LinkedIn posts. By automating the extraction and repurposing process, you can save time and consistently share thoughtful updates with your network.

**What Problem Does This Workflow Solve?**

Manually reading newsletters, extracting the key points, and then formatting that content into professional, engaging LinkedIn posts can be time-consuming and error-prone. This workflow automates those steps by:

- **Filtering Emails:** Uses the Gmail node to process only those emails from a specific sender (e.g., newsletter@example.com).
- **Extracting Content:** Leverages OpenAI to identify and summarize the top news items in your newsletter.
- **Generating Posts:** Crafts concise, insightful LinkedIn posts in a smart, deadpan style with a touch of subtle humor.
- **Publishing:** Posts the generated content directly to LinkedIn.

**What This Workflow Does**

- **Filter Newsletters:** The Gmail node is set up to only handle emails from your chosen sender, ensuring that only relevant newsletters are processed.
- **Extract Key Content:** An OpenAI node analyzes the newsletter text to pull out the most important news items, including headlines and summaries.
- **Split Content:** A Split Out node divides the extracted content so each news item is processed on its own.
- **Generate LinkedIn Posts:** Another OpenAI node takes each news item's details and produces a well-structured LinkedIn post that delivers practical insights and ends with a reflective observation or question.
- **Publish to LinkedIn:** The LinkedIn node publishes the crafted posts directly to your account.

**Setup**

1. Gmail Node: Rename it to "Filter Gmail Newsletter" and configure it to filter emails by your newsletter sender.
2. OpenAI Nodes: Ensure your OpenAI API credentials are set up correctly. Customize the prompt if needed to match your desired tone.
3. LinkedIn Node: Rename it to "Post to LinkedIn" and confirm that your LinkedIn OAuth2 credentials are properly configured.

**How to Customize**

- **OpenAI Prompts:** Adjust the prompts in the OpenAI nodes to fine-tune the post tone and output formatting.
- **Email Filter:** Change the Gmail filter to match the sender of your newsletters.
- **Post Processing:** Optionally, add extra formatting (using Function nodes) to further enhance the readability of the generated LinkedIn posts (see the sketch below).

This template offers an automated, hands-off solution to transform your newsletter content into engaging LinkedIn updates, keeping your audience informed and inspired with minimal effort.
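For the optional post-processing step, a small Code node like the sketch below could tidy each generated post before it reaches the LinkedIn node. The `post` field name is an assumption about how the OpenAI node's output is mapped.

```javascript
// Sketch: light clean-up of a generated LinkedIn post before publishing.
// Assumes the OpenAI node's answer is mapped to a field named `post`.
return $input.all().map(item => {
  let post = (item.json.post || '').trim();

  // Trim trailing spaces per line and collapse runs of blank lines.
  post = post
    .split('\n')
    .map(line => line.trimEnd())
    .join('\n')
    .replace(/\n{3,}/g, '\n\n');

  // LinkedIn posts are limited to 3,000 characters.
  if (post.length > 3000) {
    post = post.slice(0, 2997) + '...';
  }

  return { json: { ...item.json, post } };
});
```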
by Ryan
**Who is this template for?**

This template is for any Microsoft Outlook user who wants a trained AI agent to reason and reply on their behalf. Teach your agent your tone and writing style to replicate your own, or develop a persona for a shared inbox.

**Requirements**

- Outlook with authentication credentials
- OpenAI account with authentication credentials
- A few sample email replies of various lengths and topics

**How it works**

1. Connect your Outlook account.
2. Select (filter) which email sender(s) your trained AI agent will reply to. (Tip: pick a sender with some repeatability, either by topic (e.g., sales) or by individual (coworker@yourcompany.com).)
3. Connect your OpenAI account.
4. Choose your AI model (e.g., gpt-4o-mini).
5. Add a Prompt (User Message) and select "System Message" from the option below.
6. Update the instructions by filling in your name (or persona) and response style, and add full email replies from the topic or individual you want the AI agent to emulate. (Tip: add actual replies from your email Sent folder, including your greeting and sign-off. Paste each email sample between a set of `<example> .... </example>` tags; a sketch of the resulting system message follows below.)
7. Configure the reply (or reply all) to remain within the original email thread.
8. Test it! Send an email from an address your agent is configured to respond to, then check your Sent (or Drafts) folder for the result.

Enjoy all the free time you now have! If you have questions or need assistance, email us at: support@teambisonandbird.com

Note: This template does not include retrieving email addresses out of the message or body of the email.
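As an illustration of step 6, the few-shot samples might be assembled into a system message along these lines. The persona text and sample replies below are placeholders, not part of the template; in practice you would paste your own sent emails between the tags.

```javascript
// Sketch: build a system message with <example> tags around each sample reply.
// The persona text and sample array are placeholders, not part of the template.
const samples = [
  'Hi Sam,\n\nThanks for the update. Let\'s move the call to Thursday.\n\nBest,\nAlex',
  'Hello,\n\nAttached is the revised quote. Let me know if anything is unclear.\n\nRegards,\nAlex',
];

const systemMessage = [
  'You are Alex, replying to emails on my behalf.',
  'Match the tone, structure, and length of the examples below.',
  ...samples.map(sample => `<example>\n${sample}\n</example>`),
].join('\n\n');

return [{ json: { systemMessage } }];
```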
by Oneclick AI Squad
This n8n template demonstrates how to create an intelligent food recipe assistant that accepts requests via Gmail and web forms, processes them using AI chat models (Ollama and Llama 3.2), and delivers personalized recipes back to users. The system combines multiple input methods with advanced AI processing to provide customized cooking instructions and ingredient lists.

**Good to know**

- The system accepts recipe requests through both Gmail and web form submissions
- AI models understand dietary restrictions, cuisine preferences, and cooking skill levels
- Recipe responses include formatted ingredients, step-by-step instructions, and cooking tips
- All requests are processed automatically without manual intervention

**How it works**

*Gmail Recipe Request Workflow*

- Gmail triggers activate when users send emails with recipe requests to the designated email address
- The system extracts recipe requirements, dietary preferences, and cooking constraints from the email content
- User queries are processed through the Ollama Recipe Generator for intelligent recipe creation
- AI-generated recipes are formatted with proper ingredients, instructions, and cooking times (see the formatting sketch below)
- Formatted recipes are sent back to users via Gmail with a professional presentation

*Web Form Recipe Request Workflow*

- Web form submissions trigger when users fill out structured recipe request forms
- Form data includes cuisine type, dietary restrictions, available ingredients, and cooking time preferences
- The Llama 3.2 Chef Model processes structured requests for optimized recipe generation
- Recipes are formatted with clear instructions, ingredient measurements, and cooking techniques
- Users receive formatted recipes via email with additional cooking tips and variations

**How to use**

1. Import the workflow into your n8n instance and configure the Gmail integration for recipe requests
2. Set up the web form with fields for cuisine preferences, dietary restrictions, and cooking skill level
3. Configure the Ollama and Llama 3.2 AI models with appropriate recipe generation prompts
4. Test both Gmail and web form inputs with sample recipe requests
5. Customize the email templates to match your brand and include additional cooking resources

The system scales automatically to handle multiple simultaneous recipe requests.

**Requirements**

- Gmail account for email-based recipe requests and responses
- Ollama installation with the Recipe Generator model
- Llama 3.2 Chef Model access for advanced recipe processing
- n8n instance with Gmail and AI model integrations

**Customising this workflow**

- Recipe automation can be adapted for different cuisines, dietary needs, and cooking skill levels
- Try popular use cases such as meal planning assistance, ingredient substitution suggestions, or nutritional information inclusion
- The workflow can be extended to include recipe image generation, shopping list creation, and cooking video recommendations
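The formatting step could be done in a small Code node like the sketch below. It is illustrative only, and it assumes you prompt the model to return a structured object with `title`, `ingredients`, and `steps` fields; the template's own prompts may use different names.

```javascript
// Sketch: turn a structured recipe object into a simple HTML email body.
// Assumes the AI node was prompted to return title, ingredients[] and steps[].
const { title, ingredients = [], steps = [] } = $input.first().json;

const html = `
  <h2>${title}</h2>
  <h3>Ingredients</h3>
  <ul>${ingredients.map(i => `<li>${i}</li>`).join('')}</ul>
  <h3>Instructions</h3>
  <ol>${steps.map(s => `<li>${s}</li>`).join('')}</ol>
`;

return [{ json: { subject: `Your recipe: ${title}`, html } }];
```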
by Daniel Nolde
**What it is**

In version 1.78, n8n introduced a dedicated node for the OpenRouter service, which lets you use many different LLM models and providers and change models on the fly in an agentic workflow. For prior n8n versions, there is a workaround to make OpenRouter accessible: use the OpenAI node with an OpenRouter-specific Base URL. This simple workflow demonstrates the workaround for versions before 1.78, so that you can use different LLM models dynamically with the existing n8n nodes for OpenAI LLMs and OpenAI credentials.

**What you can do**

- Use any of the OpenRouter models
- Have the model configured or changed dynamically (by some external config, some rule, or a specific chat message); see the sketch below

**Setup steps**

1. Import the workflow
2. Ensure you have registered an account, purchased some credits, and created an API key for OpenRouter.ai
3. Configure the "OpenRouter" credentials with your own credentials, using an OpenAI-type credential, but make sure the credential's "Base URL" field is set to https://openrouter.ai/api/v1 so OpenRouter is used instead of OpenAI
4. Open the "Settings" node and change the model value to any valid model id from the OpenRouter models list, or have the model property set dynamically
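The "dynamic model" idea boils down to computing a model id per execution and reading it in the LLM node with an expression such as `{{ $json.model }}`. A rough sketch as a Code node follows; the routing rule, the `chatInput` field name, and the model ids are examples only, so check the OpenRouter models list for valid ids.

```javascript
// Sketch: choose an OpenRouter model id dynamically, e.g. based on the chat message.
// The routing rule, field name and model ids are illustrative only.
const input = $input.first().json;
const message = input.chatInput || '';

const model = message.length > 2000
  ? 'anthropic/claude-3.5-sonnet'  // longer input: pick a larger-context model
  : 'openai/gpt-4o-mini';          // default, cheaper model

return [{ json: { ...input, model } }];
```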
by Oneclick AI Squad
This n8n template demonstrates how to create an automated customer feedback collection system for restaurants. The workflow triggers when new customer emails are added to an Excel sheet, automatically sends personalized feedback forms, and stores all responses in a separate Excel tracking sheet. Perfect for restaurants wanting to systematically gather customer insights and improve service quality.

**Good to know**

- Each feedback form is personalized with the customer's name and email
- All responses are automatically timestamped and organized in Excel sheets (see the row-mapping sketch below)
- The system handles form validation and ensures complete data capture
- Email notifications keep your team updated on new feedback submissions

**How it works**

*Email Distribution Workflow*

- New customer entries are detected in Excel Sheet-1 (customer database) containing customer names and email addresses
- The system automatically generates personalized feedback forms for each new customer
- Customized feedback emails are sent with embedded forms tailored to restaurant experience evaluation
- Wait nodes ensure proper processing timing before sending emails

*Feedback Collection Workflow*

- Customer form submissions trigger the data collection process
- All feedback responses are captured, including ratings, comments, and contact information
- Data is automatically appended to Excel Sheet-2 (feedback responses) with complete timestamps
- The system handles multiple concurrent submissions without data loss

**Excel Sheet Structure**

*Sheet-1 (Customer Database)*

- Name - Customer's full name
- Email - Customer's email address for form distribution

*Sheet-2 (Feedback Responses)*

- Timestamp - Date and time of form submission
- Name - Customer's full name
- E-Mail - Customer's email address
- Contact Number - Customer's phone number
- How was the cleanliness of the dining area? - Cleanliness rating/feedback
- Did you like the taste of the food? - Food taste evaluation
- What dish did you enjoy the most? - Favorite dish identification
- Was your order accurate and timely? - Service accuracy rating
- Was our staff polite and helpful? - Staff service evaluation
- Was the food presentation appealing? - Food presentation rating
- How would you rate your overall dining experience? - Overall experience score
- Any additional comments or suggestions? - Open-ended feedback field

**How to use**

1. Import the workflow into your n8n instance and configure the Excel integration
2. Set up Sheet-1 with customer names and emails for feedback distribution
3. Configure the feedback form with your restaurant's specific questions and branding
4. Add new customer entries to Sheet-1 to automatically trigger feedback emails
5. Monitor Sheet-2 for incoming responses and analyze customer satisfaction trends

The system scales automatically with your customer database growth.

**Requirements**

- Google Sheets account for data storage and management
- Email service integration (Gmail, SMTP, or similar)
- n8n instance with Google Sheets and email connectors

**Customising this workflow**

- Customer feedback automation can be adapted for different restaurant types and service models
- Try popular use cases such as post-dining follow-ups, seasonal menu feedback, or special event evaluations
- The workflow can be extended to include automated response analysis, sentiment scoring, and management dashboard integration
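To illustrate the timestamping and row mapping, a Code node between the form trigger and the append step could shape each submission into a Sheet-2 row roughly like this. The incoming field names are assumptions about how the feedback form is configured; the remaining question columns follow the same pattern.

```javascript
// Sketch: map a feedback form submission onto the Sheet-2 columns with a timestamp.
// Incoming field names are assumptions about the feedback form configuration;
// the other question columns are mapped the same way.
const answers = $input.first().json;

return [{
  json: {
    'Timestamp': new Date().toISOString(),
    'Name': answers['Name'],
    'E-Mail': answers['E-Mail'],
    'Contact Number': answers['Contact Number'],
    'How would you rate your overall dining experience?':
      answers['How would you rate your overall dining experience?'],
    'Any additional comments or suggestions?':
      answers['Any additional comments or suggestions?'],
  },
}];
```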
by Viktor Klepikovskyi
**Google Sheets UI for Workflow Control**

This n8n template provides a practical and efficient way to manage your n8n workflows using Google Sheets as a user-friendly interface. It demonstrates how to leverage a simple spreadsheet to control inputs, capture outputs, and track the processing status of individual data rows, offering a clear and visual overview of your automation tasks.

**Purpose of This Template**

The primary purpose of this template is to illustrate how Google Sheets can serve as a dynamic UI for your n8n automations. It's designed for n8n users who need:

- A structured method to feed specific data into their workflows.
- The ability to selectively trigger workflow execution based on data status.
- A centralized place to view and store workflow outputs alongside original inputs.
- A simple, no-code solution for managing workflow data without building custom applications.

**Setup Instructions**

To use this template, follow these steps:

1. Create a Google Sheet: Set up a new Google Sheet (see the template here) with three columns: Color, Status, and Number. Populate the Color column with some sample data (e.g., color names) and set the Status for the rows you want to process to READY.
2. Import the n8n Workflow: Import this n8n template into your n8n instance.
3. Configure Google Sheets Nodes: For the first Google Sheets node (Read operation), ensure it's connected to your newly created Google Sheet and configured to read rows where the Status column is READY. You will need to authenticate your Google Sheets account. For the second Google Sheets node (Update operation), ensure it's also connected to the same Google Sheet. The node should automatically map the row_number, Number, and Status fields from the preceding nodes.
4. Execute the Workflow: Run the workflow. Observe how it reads READY rows, processes them (calculates the string length), and updates the Number and Status columns in your Google Sheet to DONE (a sketch of this processing step follows below).
5. Control Execution: To process new data, simply add new rows to your Google Sheet and set their Status to READY. Rerunning the workflow will then only process these new entries.

For more details and context on this approach, you can refer to the related blog post here.
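The per-row processing step in this template is deliberately simple (it measures the length of the Color value). As a sketch, the equivalent Code node between the Read and Update operations would look roughly like this:

```javascript
// Sketch: the per-row processing step.
// Compute the length of the Color value and mark the row as DONE
// so the Google Sheets Update node can write both fields back.
return $input.all().map(item => ({
  json: {
    row_number: item.json.row_number, // used by the Update node to match the row
    Number: String(item.json.Color || '').length,
    Status: 'DONE',
  },
}));
```

In a real automation you would replace the string-length calculation with your own processing while keeping the READY/DONE status handling unchanged.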
by Ria
This workflow demonstrates how to use the getWorkflowStaticData() function to store any kind of variable that persists across workflow executions.
https://docs.n8n.io/code/cookbook/builtin/get-workflow-static-data/

This can be useful, for example, when working with access tokens that expire after a certain time period. Using static data, we can keep a record of that access token and its expiry time and build our workflow logic around it (see the sketch below).

**Important**

Static data only persists across production executions, i.e. executions triggered by Webhooks or Schedule Triggers (not manual executions!). For this, the workflow has to be activated.

**Setup**

1. Configure the HTTP Request node to fetch an access token from your API (optional)
2. Activate the workflow
3. Test the workflow with the webhook production link
4. You can check how the static data is populated in the individual executions

**Feedback**

If you found this useful or want to report some missing information, I'd be happy to hear from you at ria@n8n.io
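A minimal sketch of the token-caching logic inside a Code node is shown below. The token endpoint, request body, and response field names are assumptions; adapt them to your API. Remember that the cached values only survive between production executions of an activated workflow.

```javascript
// Sketch: cache an access token in workflow static data and refresh it only when expired.
// The endpoint URL, request body and response fields are assumptions about your API.
const staticData = $getWorkflowStaticData('global');
const now = Date.now();

if (!staticData.accessToken || now >= (staticData.expiresAt || 0)) {
  const response = await this.helpers.httpRequest({
    method: 'POST',
    url: 'https://api.example.com/oauth/token', // placeholder endpoint
    body: { grant_type: 'client_credentials' },
    json: true,
  });

  staticData.accessToken = response.access_token;
  // Refresh one minute before the reported expiry.
  staticData.expiresAt = now + (response.expires_in - 60) * 1000;
}

return [{ json: { accessToken: staticData.accessToken } }];
```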
by Yang
**What this workflow does**

This workflow automatically turns new technical video uploads into short, engaging Facebook post drafts, complete with a suggested image, and saves the results to Google Sheets for quick review or publishing. It's designed to help you repurpose tutorial or demo videos into ready-to-use social content without any manual writing or design effort.

**What problem is this workflow solving?**

Manually writing Facebook posts for every new tutorial or product video takes time, especially when you want them to be engaging and consistent. This workflow solves that by using AI to watch for new videos, extract meaningful insights, and write posts and create visuals automatically, saving hours of work.

**Who is this for?**

This workflow is ideal for:

- Content creators uploading tutorial videos
- Marketing teams working with how-to or product videos
- Agencies and automation pros building scalable social workflows for clients

**How it works**

1. Trigger: Starts when a new video is uploaded to a specific Google Drive folder.
2. Download & Convert: Downloads the video and converts it to base64 (see the sketch below).
3. Extract Insights: Dumpling AI analyzes the video and extracts structured insights such as the topic, tools mentioned, and key steps.
4. Generate Post: GPT-4o creates a short, friendly Facebook post using those insights, along with an image prompt.
5. Create Visual: Dumpling AI generates an image using the prompt.
6. Save to Sheet: The Facebook post and image URL are saved to a Google Sheet.

**Setup**

1. Create a Google Sheet to store the posts and images.
2. Connect your Google Drive, Google Sheets, Dumpling AI, and OpenAI credentials in n8n.
3. Update the workflow with your Google Drive folder ID and your target Google Sheet ID.
4. (Optional) Edit the prompt used in the GPT node if you want a different tone, style, or structure for the post.

**How to customize the workflow**

- **Change the platform:** Replace "Facebook" in the prompt with LinkedIn, Instagram, or another platform.
- **Use a different image tool:** You can swap Dumpling AI for any other image generation API (e.g. DALL·E, Midjourney via webhook).
- **Add auto-publishing:** Add a Facebook or social media module to publish the generated post directly instead of just saving it to Google Sheets.
- **Tag videos by content type:** Use AI to classify videos into categories and store them in separate tabs or sheets.
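The "convert to base64" step can be done in a Code node roughly as sketched below, assuming the Google Drive download node stored the file under the default `data` binary property. Keep in mind that base64-encoding large videos in memory can be heavy, so this is an illustration rather than the template's exact implementation.

```javascript
// Sketch: read the downloaded video from the item's binary data and encode it as base64.
// Assumes the download node stored the file under the default `data` binary property.
const buffer = await this.helpers.getBinaryDataBuffer(0, 'data');

return [{
  json: {
    fileName: $input.first().binary.data.fileName,
    videoBase64: buffer.toString('base64'),
  },
}];
```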
by Mihai Farcas
This n8n workflow automates the process of saving web articles or links shared in a chat conversation directly into a Notion database, using Google's Gemini AI and Browserless for web scraping.

**Who is this AI automation template for?**

It's useful for anyone wanting to reduce manual copy-pasting and organize web findings seamlessly within Notion. A smarter web clipping tool!

**What this AI automation workflow does**

- Starts when a message is received
- Uses a Google Gemini AI Agent node to understand the context and manage the subsequent steps. It identifies whether a message contains a request to save an article/link.
- If a URL is detected, it uses a tool configured with the Browserless API (via the HTTP Request node) to scrape the content of the web page (see the sketch below).
- Creates a new page in a specified Notion database, populating it with a summary of the scraped content in a specific format, never leaving out any important details. It also saves the original URL, smart tags, publication date, and other metadata extracted by the AI.
- Posts a confirmation message (e.g., to a Discord channel) indicating whether the article was saved successfully or if an error occurred.

**Setup**

1. Import Workflow: Import this template into your n8n instance.
2. Configure Credentials & Notion Database:
   - Notion Database: Create or designate a Notion database (like the example "Knowledge Database") where articles will be saved. Ensure this database has the following properties (fields):
     - Name (Type: Text) - stores the article title
     - URL (Type: URL) - stores the original article link
     - Description (Type: Text) - stores the AI-generated summary
     - Tags (Type: Multi-select) - optional, for categorization
     - Publication Date (Type: Date) - optional, stores the date the article was published
     Ensure the n8n integration has access to this specific database. If you require a different format for the Notion database, note that you will have to update the Notion tool configuration in this n8n workflow accordingly.
   - Notion Credential: Obtain your Notion API key and add it as a Notion credential in n8n. Select this credential in the save_to_notion tool node.
   - Configure save_to_notion Tool: In the save_to_notion tool node within the workflow, set the 'Database ID' field to the ID of the Notion database you prepared above. Map the workflow data (URL, AI summary, etc.) to the corresponding database properties (URL, Description, etc.). In the blocks section of the Notion tool, you can define a custom format for the research page, allowing the AI to fill in the exact details you want extracted from any web page.
   - Google Gemini AI: Obtain your API key from Google AI Studio or Google Cloud Console (if using Vertex AI) and add it as a credential. Select this credential in the "Tools Agent" node.
   - Discord (or other notification service): If using Discord notifications, create a Webhook URL (instructions) or set up a Bot Token. Add the credential in n8n and select it in the discord_notification tool node. Configure the target Channel ID.
   - Browserless/HTTP Request:
     - Cloud: Obtain your API key from Browserless and configure the website_scraper HTTP Request tool node with the correct API endpoint and authentication header.
     - Self-hosted: Ensure your Browserless Docker container is running and accessible by n8n. Configure the website_scraper HTTP Request tool node with your self-hosted Browserless instance URL.
3. Activate Workflow: Save, test, and activate the workflow.
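For orientation, the website_scraper tool is essentially a single HTTP call to Browserless. A Code-node equivalent is sketched below; the base URL, the /content path, and the token query parameter are assumptions about the Browserless cloud API at the time of writing, so double-check them against the Browserless documentation or point the base URL at your self-hosted container.

```javascript
// Sketch: fetch the rendered HTML of a page via a Browserless /content-style endpoint.
// BASE_URL, the path and the token parameter are assumptions; verify against the
// Browserless docs, or use your self-hosted instance URL (e.g. http://browserless:3000).
const BASE_URL = 'https://chrome.browserless.io';
const url = $input.first().json.url;

const html = await this.helpers.httpRequest({
  method: 'POST',
  url: `${BASE_URL}/content?token=YOUR_API_KEY`,
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ url }),
});

return [{ json: { url, html } }];
```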
**How to customize this workflow to your needs**

- **Change AI Model:** Experiment with different AI models supported by n8n (like OpenAI GPT models or Anthropic Claude) in the Agent node if Gemini 2.5 Pro doesn't fit your needs or budget, keeping in mind potential differences in context window size and processing capabilities for large content.
- **Modify Notion Saving:** Adjust the save_to_notion tool node to map different data fields (e.g., change the summary style by modifying the AI prompt, add specific tags, or alter the page content structure) to your Notion database properties.
- **Adjust Scraping:** Modify the prompt/instructions for the website_scraper tool or change the parameters sent to the Browserless API if you need different data extracted from the web pages. You could also swap Browserless for another scraping service/API accessible via the HTTP Request node.
by Niklas Hatje
**Use Case**

In most companies, employees have a lot of great ideas. That was the same for us at n8n. We wanted to make it as easy as possible for everyone to add their ideas to some formatted database: it should be somewhere people already are all the time, where they can add a new idea without much extra effort. Since we're using Slack, this seemed to be the perfect place to easily add ideas and collect them in Notion.

**What this workflow does**

This workflow waits for a webhook call from Slack, which gets fired when users use the /idea command on a bot that you will create as part of this template. It then checks the command, adds the idea to Notion, and notifies the user about the newly added idea (a sketch of the payload handling follows at the end of this section).

**Creating your Slack bot**

1. Visit https://api.slack.com/apps, click on New App and choose a name and workspace.
2. Click on OAuth & Permissions and scroll down to Scopes -> Bot Token Scopes.
3. Add the chat:write scope.
4. Head over to Slash Commands and click on Create New Command.
5. Use /idea as the command.
6. Copy the test URL from the Webhook node into Request URL.
7. Add whatever feels best to the description and usage hint.
8. Go to Install App and click Install.

**Setup**

1. Add a database in Notion with the columns Name and Creator.
2. Add your Notion credentials and add the integration to your Notion page.
3. Fill in the setup node below.
4. Create your Slack app (see the other sticky).
5. Click Test workflow and use the /idea command in Slack.
6. Activate the workflow and exchange the Request URL with the production URL from the webhook.

**How to adjust it to your needs**

- You can adjust the table in Notion and, for example, add different types of ideas or areas that they impact
- You might want to add different templates in Notion to make it easier for users to fill in their ideas with details
- Rename the Slack command as it works best for you

**How to enhance this workflow**

At n8n we use this workflow in combination with some others. E.g. we have the following things on top:

- We additionally have a /bug Slack command that adds a new bug to Linear. Here we're using AI to classify the bugs and move them to the right team. (see this template and this template)
- We also added other types, like /pain, to be less solution-driven
- To make it easier for everyone to give input, we added a Votes column that allows everyone to vote on ideas/pain points in the list
- We're also running a workflow once a week that highlights the most popular new ideas and the most active voters (see here)
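For context, the webhook receives Slack's standard slash-command payload, and the "check the command" step boils down to something like the sketch below. The field names (command, text, user_name) follow Slack's slash-command payload; how they are nested depends on your Webhook node settings, so treat the `body` access as an assumption.

```javascript
// Sketch: validate the Slack slash-command payload and shape it for the Notion node.
// Field names (command, text, user_name) follow Slack's slash-command payload;
// the `body` nesting depends on the Webhook node configuration.
const body = $input.first().json.body || $input.first().json;

if (body.command !== '/idea' || !body.text) {
  // Reply to Slack with a usage hint instead of creating an empty idea.
  return [{ json: { response_type: 'ephemeral', text: 'Usage: /idea <your idea>' } }];
}

return [{
  json: {
    Name: body.text,          // the idea itself
    Creator: body.user_name,  // who submitted it
  },
}];
```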