by Pedro Santos
🤖 AI Agent Web Search using SearchApi & LLM

Who is this for?
This workflow is ideal for anyone conducting online research, including students, researchers, content creators, and professionals looking for accurate, up-to-date, and verifiable information. It also serves as an excellent foundation for building more sophisticated AI-driven applications.

What problem does this workflow solve? / Use case
This workflow automates web searches by enabling an AI agent to efficiently retrieve and summarize external, verifiable information, ensuring accuracy through source citations.

What this workflow does
- Connects an AI agent node to SearchApi.io as an integrated search tool (the underlying API request is sketched below).
- Empowers the AI agent to perform real-time web searches using various SearchApi engines (e.g., Google, Bing).
- Allows the AI agent to dynamically determine search parameters based on user interaction, delivering contextually relevant results.
- Ensures responses include clearly cited sources for validation and further exploration.

Setup
1. Install the SearchApi community node: open Settings → Community Nodes inside your self-hosted n8n instance, fill npm Package Name with @searchapi/n8n-nodes-searchapi, accept the risk prompt, and hit Install. It should now appear as a node when you search for it.
2. API configuration: set up your SearchApi.io credentials in n8n and add your preferred LLM provider credentials (e.g., OpenRouter API).
3. Input requirements: provide the YouTube video ID (e.g., wBuULAoJxok).
4. Connect the LLM integration: configure the summarization chain with your chosen model and parameters for text splitting.

How to customize this workflow to your needs
- Integrate additional nodes to structure or store search results (e.g., saving to databases, Notion, Google Sheets).
- Extend chatbot capabilities to integrate with messaging platforms (Slack, Discord) or email notifications.
- Adjust search parameters and filters within the AI agent node to tailor information retrieval.

Example Usage
- **Input**: User asks, "What are the latest developments in AI regulation?"
- **Output**: AI retrieves, summarizes, and cites recent, authoritative articles and news sources from the web.
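For context, here is a minimal sketch of the kind of request SearchApi serves behind the community node, assuming the public https://www.searchapi.io/api/v1/search endpoint and the Google engine; the node and the AI agent handle all of this for you, so the snippet is purely illustrative.

```javascript
// Minimal sketch (JavaScript, Node 18+). The community node makes this call for
// you — shown only to clarify what the agent's search tool works with.
const params = new URLSearchParams({
  engine: "google",                       // the agent can pick other engines, e.g. "bing"
  q: "latest developments in AI regulation",
  api_key: process.env.SEARCHAPI_API_KEY, // same key as your SearchApi.io credential
});

const res = await fetch(`https://www.searchapi.io/api/v1/search?${params}`);
const data = await res.json();

// Organic results carry the titles, links, and snippets the agent cites.
for (const r of data.organic_results ?? []) {
  console.log(r.title, "-", r.link);
}
```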
by TechDennis
Edit an existing image with OpenAI ImageGen1 via API Request

Transform your creative pipeline by letting n8n call OpenAI ImageGen1's edit image endpoint, automatically replacing or augmenting parts of any image you supply and returning a brand-new version in seconds. Designers, marketers, and product teams can eliminate repetitive manual edits and test more variations, faster.

Who is this for?
- Content creators who need quick, on-brand image tweaks
- Marketers running A/B visual tests at scale
- Developers exploring the new ImageGen1 API inside low-code automations

Use case / problem solved
Opening design software to mask, fill, or swap objects is slow and error-prone. This workflow feeds an input image plus a prompt to OpenAI ImageGen1, receives the edited output, and passes it on to any service you like—perfect for bulk-editing product shots, social visuals, or UI mocks.

What this workflow does
1. Read or receive the source image (Webhook → Binary Data).
2. Call OpenAI ImageGen1 with an HTTP Request node, sending the image and edit prompt (see the sketch below).
3. Parse the JSON response to capture the returned image URL.
4. Download & hand off the edited file (e.g., upload to S3, post to Slack, or store in Drive).

Setup
1. Add your OpenAI API key in the API KEY node.
2. Follow the notes on the workflow for more information.
3. (Optional) Point the final node to your preferred storage or chat tool.

> 📝 A sticky note in the workflow summarizes these steps and links to the OpenAI documentation.

How to customize this workflow
- **Trigger alternatives**: Replace the Chat trigger with Google Drive, Airtable, etc.
- **Chained edits**: Loop the output back for successive prompts.
- **Conditional flows**: Add an If node to branch actions by image size or category.

With renamed nodes, color-coded sticky notes, and a concise setup guide, you'll be editing images via OpenAI ImageGen1 in under five minutes—no code, maximum creativity.
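As a rough guide, the HTTP Request node's call might look like the following sketch, assuming the edit endpoint at /v1/images/edits, the gpt-image-1 model name, a local PNG, and Node.js 18+ globals; adjust the model, fields, and file handling to match the actual node configuration.

```javascript
// Hedged sketch of the edit-image call. Model name and file paths are assumptions.
import { readFile, writeFile } from "node:fs/promises";

const apiKey = process.env.OPENAI_API_KEY; // same key you add in the API KEY node

async function editImage(imagePath, prompt) {
  const form = new FormData();
  form.append("model", "gpt-image-1"); // assumed model name for "ImageGen1"
  form.append("prompt", prompt);       // e.g. "replace the background with a beach"
  form.append(
    "image",
    new Blob([await readFile(imagePath)], { type: "image/png" }),
    "input.png"
  );

  const res = await fetch("https://api.openai.com/v1/images/edits", {
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}` },
    body: form,
  });
  if (!res.ok) throw new Error(`OpenAI error ${res.status}: ${await res.text()}`);

  const { data } = await res.json();
  // Depending on the model, the response carries base64 data or a URL.
  if (data[0].b64_json) {
    await writeFile("edited.png", Buffer.from(data[0].b64_json, "base64"));
  }
  return data[0];
}

await editImage("product.png", "Swap the plain background for a soft gradient");
```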
by KlickTipp
Community Node Disclaimer: This workflow uses KlickTipp community nodes.

How It Works
This workflow automates personalized WhatsApp message template delivery, triggered either by events in KlickTipp or by messages sent to the WhatsApp Business account. When a contact triggers an Outbound, the workflow uses a pre-approved WhatsApp message template to send dynamic, real-time messages through the WhatsApp Business Cloud API. When receiving messages, it checks whether a cancellation should be processed or whether an auto-response is sent.

This setup is ideal for time-sensitive campaigns such as:
- Birthday greetings
- Discount or promo notifications
- Follow-ups on product or service interest

Key Features
- KlickTipp Trigger: Starts the workflow when a specific Outbound is triggered. Typical use case: a subscriber receives an activation tag and triggers an Outbound, which sends a webhook call to trigger WhatsApp messaging.
- WhatsApp Business Cloud – Message Trigger: Listens to messages from the contact and processes answers, either with an auto-responder or by tagging the contact in KlickTipp.
- WhatsApp Business Cloud – Sending Template Messages: Sends WhatsApp message templates using a pre-approved template. Template placeholders are filled with data from KlickTipp custom fields (see the payload sketch below).

Setup Instructions
1. Set up the KlickTipp and WhatsApp nodes in your n8n instance.
2. Authenticate your WhatsApp and KlickTipp accounts.
3. Create the necessary custom fields to match the data structure:

| Field Label                     | Field Type  |
|---------------------------------|-------------|
| Whatsapp_Produkt/Dienstleistung | Single line |
| Whatsapp_Name/Unternehmen       | Single line |
| Whatsapp_Link_Endung            | Single line |

4. Verify and customize field assignments in the workflow to align with your specific form and subscriber list setup.

Testing & Deployment
- Use a real test contact with all required fields filled.
- Trigger the Outbound in KlickTipp using the activation tag and answer the template with a message.
- Run the scenario once in n8n to verify successful delivery of the WhatsApp message template to your test contact, as well as reception of the auto-responder and the subscription and tagging in KlickTipp to stop further messages.

Campaign Expansion Ideas
- Connect the campaign to process keywords like "STOP" from WhatsApp messages.
- Pair WhatsApp with a welcome email series for onboarding.
- Use tags like product_interest_X for precise segmentation.
- A/B test templates with different CTA formats or timings.
- Monitor CTRs via dynamic URLs in WhatsApp templates.

Benefits
- **Multi-channel engagement:** Adds WhatsApp to your marketing toolkit.
- **Dynamic content:** Personalizes messages using contact data.
- **KlickTipp campaign control:** WhatsApp contacts can, for example, signal with messages like "STOP" to receive the corresponding tag in KlickTipp in order to start/end automations.

> 💡 Pro Tip: Customize the domain link ending per campaign or product line. This allows targeted redirects, e.g., meinshop.de/ProduktA or mein…

Resources:
- Send WhatsApp Templates with KlickTipp
- Use KlickTipp Community Node in n8n
- Automate Workflows: KlickTipp Integration in n8n
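For orientation, here is a hedged sketch of the template-message payload the WhatsApp Business Cloud API expects; the phone number ID, Graph API version, template name, language code, and parameter values are placeholders that the n8n node fills from your own field mappings.

```javascript
// Sketch of a WhatsApp Cloud API template message. All concrete values below
// are placeholders — the node substitutes KlickTipp custom fields for you.
const payload = {
  messaging_product: "whatsapp",
  to: "4915112345678",                 // contact's number from KlickTipp
  type: "template",
  template: {
    name: "product_interest_followup", // your pre-approved template name
    language: { code: "de" },
    components: [
      {
        type: "body",
        parameters: [
          { type: "text", text: "{{ Whatsapp_Name/Unternehmen }}" },       // placeholder 1
          { type: "text", text: "{{ Whatsapp_Produkt/Dienstleistung }}" }, // placeholder 2
          { type: "text", text: "{{ Whatsapp_Link_Endung }}" },            // placeholder 3
        ],
      },
    ],
  },
};

await fetch("https://graph.facebook.com/v19.0/<PHONE_NUMBER_ID>/messages", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.WHATSAPP_ACCESS_TOKEN}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify(payload),
});
```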
by Mutasem
Use case
This workflow uses Gmail to send outreach emails to new HubSpot contacts that have yet to be contacted (usually unknown contacts), and records the outreach in HubSpot.

Setup
- Set up HubSpot OAuth2 creds (be careful with scopes — they have to be exact, not less or more. Yes, it's not simple, but it's well documented in the n8n docs. Be smarter than me, read the docs).
- Set up Gmail creds.
- Change the from email and from name in the Record outreach in HubSpot node.

How to adjust this template to your needs
- Change the email message in the Set keys node.
- Think about your criteria to reach out to new contacts. Here we simply filter for contacts with an unknown last-contacted date.
by Hugo
This workflow provides a robust solution for automatically backing up all your n8n workflows to a designated GitHub repository on a daily basis. By leveraging the n8n API and GitHub API, it ensures your workflows are version-controlled and securely stored, safeguarding against data loss and facilitating disaster recovery.

How it works
The automation follows these key steps:
1. Scheduled trigger: The workflow is initiated automatically every day at a pre-configured time.
2. List existing backups: It first connects to your GitHub repository to retrieve a list of already backed-up workflow files. This helps in determining whether a workflow's backup file needs to be created or updated.
3. Retrieve n8n workflows: The workflow then fetches all current workflows directly from your n8n instance using the n8n REST API.
4. Process and prepare: Each retrieved workflow is individually processed. Its data is converted into JSON format, and this JSON content is then encoded to base64, a format suitable for GitHub API file operations.
5. Commit to GitHub: For each n8n workflow:
   - A standardized filename is generated (e.g., workflow-name-tag.json).
   - The workflow checks if a file with this name already exists in the GitHub repository (based on the list fetched in step 2).
   - If the file exists: it updates the existing file with the latest version of the workflow.
   - If it's a new workflow (file doesn't exist): a new file is created in the repository.
   - Each commit is timestamped for clarity.

This process ensures that you always have an up-to-date version of all your n8n workflows stored securely in your GitHub version control system, providing peace of mind and a reliable backup history.

Pre-requisites
Before you can use this template, please ensure you have the following:
- An active n8n instance (self-hosted or cloud).
- A GitHub account.
- A GitHub repository created where you want to store the workflow backups.
- A GitHub Personal Access Token with repo scope (or a fine-grained token with read/write access to the specific backup repository). This token will be used for GitHub API authentication.
- n8n API credentials (API key) for your n8n instance.

Set up steps
Setting up this workflow should take approximately 10-15 minutes if you have your credentials ready.
1. Import the template: Import this workflow into your n8n instance.
2. Configure n8n API credentials: Locate the "Retrieve workflows" node. In the "Credentials" section for "n8n API", create new credentials (or select existing ones). Enter your n8n instance URL and your n8n API key (you can create your n8n API key in the settings of your n8n instance).
3. Configure GitHub credentials: Locate the "List files from repo" node (and subsequently the "Update file" / "Upload file" nodes, which use the same credential). In the "Credentials" section for "GitHub API", create new credentials, select the OAuth2/Personal Access Token authentication method, and enter the GitHub Personal Access Token you generated as per the pre-requisites.
4. Specify repository details: In the "List files from repo", "Update file", and "Upload file" GitHub nodes:
   - Set the Owner: your GitHub username or organization name.
   - Set the Repository: the name of your GitHub repository dedicated to backups.
   - Set the Branch (e.g., main or master) where backups should be stored.
   - (Optional) Specify a Path within the repository if you want backups in a specific folder (e.g., n8n_backups/). Leave blank to store in the root.
5. Adjust schedule (optional): Select the "Schedule Trigger" node and modify the trigger interval (e.g., change the time of day or frequency) as needed. By default, it's set for a daily run.
6. Activate the workflow: Save and activate the workflow.

Explanation of nodes
Here's a detailed breakdown of each node used in this workflow:
- **Schedule trigger** (n8n-nodes-base.scheduleTrigger): Automatically starts the workflow based on a defined schedule (e.g., daily at midnight).
- **List files from repo** (n8n-nodes-base.github): Connects to your specified GitHub repository and lists all files, primarily to check for existing workflow backups.
- **Aggregate** (n8n-nodes-base.aggregate): Consolidates the list of file names obtained from the "List files from repo" node into a single item for easier lookup later in the "Check if file exists" node.
- **Retrieve workflows** (n8n-nodes-base.n8n): Uses the n8n API to fetch a list of all workflows currently present in your n8n instance.
- **Json file** (n8n-nodes-base.convertToFile): Takes the data of each workflow (retrieved by the "Retrieve workflows" node) and converts it into a structured JSON file format.
- **To base64** (n8n-nodes-base.extractFromFile): Converts the binary content of the JSON file (from the "Json file" node) into a base64-encoded string. This encoding is required by the GitHub API for file content.
- **Commit date & file name** (n8n-nodes-base.set): Prepares metadata for the GitHub commit. It generates commitDate (the current date and time for the commit message) and fileName (a standardized file name for the workflow backup, e.g., my-workflow-vps-backups.json, typically using the workflow's name and its first tag). A rough sketch of what this step produces is shown below.
- **Check if file exists** (n8n-nodes-base.if): A conditional node. It checks if the fileName (generated by "Commit date & file name") is present in the list of files aggregated by the "Aggregate" node. This determines if the workflow backup already exists in GitHub.
- **Update file** (n8n-nodes-base.github): If the "Check if file exists" node determines the file does exist, this node updates that existing file in your GitHub repository with the latest workflow content (base64-encoded) and a commit message.
- **Upload file** (n8n-nodes-base.github): If the "Check if file exists" node determines the file does not exist, this node creates and uploads a new file to your GitHub repository with the workflow content and a commit message.

Customization
Here are a few ways you can customize this template to better fit your needs:
- **Backup path**: In the GitHub nodes ("List files from repo", "Update file", "Upload file"), you can specify a Path parameter to store backups in a specific folder within your repository (e.g., workflows/ or daily_backups/).
- **Filename convention**: Modify the "Commit date & file name" node (specifically the expression for fileName) to change how backup files are named. For example, you might want to include the workflow ID or a different date format.
- **Commit messages**: Customize the commit messages in the "Update file" and "Upload file" GitHub nodes to include more specific information if needed.
- **Error handling**: Consider adding error-handling branches (e.g., using the "Error Trigger" node or checking for node execution failures) to notify you if a backup fails for any reason.
- **Filtering workflows**: If you only want to back up specific workflows (e.g., those with a particular tag or name pattern), you can add a "Filter" node after "Retrieve workflows" to include only the desired workflows in the backup process.
- **Backup frequency**: Adjust the "Schedule Trigger" node to change how often the backup runs (e.g., hourly, weekly, or on specific days).

Template was created in n8n v1.92.2
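The template itself prepares this data with Set, Convert to File, and Extract from File nodes; the condensed Code-node sketch below only illustrates what those steps produce (filename, commit date, base64 content). The exact field names, tag handling, and slug rules are assumptions, not the template's actual expressions.

```javascript
// Illustrative n8n Code-node equivalent of "Commit date & file name" + base64 prep.
const items = $input.all();

return items.map((item) => {
  const wf = item.json;                              // one workflow from "Retrieve workflows"
  const tag = wf.tags?.[0]?.name ?? "untagged";      // assumed: first tag, if any
  const slug = `${wf.name}-${tag}`
    .toLowerCase()
    .replace(/[^a-z0-9-]+/g, "-");                   // e.g. "my-workflow-vps-backups"

  return {
    json: {
      fileName: `${slug}.json`,
      commitDate: new Date().toISOString(),
      // GitHub's contents API expects file content as base64:
      content: Buffer.from(JSON.stringify(wf, null, 2)).toString("base64"),
    },
  };
});
```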
by Airtop
Monitor X for Relevant Posts

Use Case
This automation monitors X (formerly Twitter) search pages in real time and extracts high-signal posts that match your categories of interest. It's ideal for community engagement, lead discovery, thought leadership tracking, or competitive analysis.

What This Automation Does
Given a search URL and a list of categories, it:
1. Logs into X using Airtop
2. Opens the specified search URL
3. Scrolls through the results
4. Extracts up to 10 valid, English-language posts
5. Filters and classifies each post by category (or marks it as [NA] if unrelated)
6. Returns the structured results as JSON

Input parameters:
- **airtop_profile** — An Airtop browser profile authenticated on X
- **x_url** — X search URL (e.g., https://x.com/search?q=ai agents&f=live)
- **relevant_categories** — Text-based list of categories to classify posts (e.g., "Web automation use cases", "Thought leadership")

Output: A JSON array of posts (see the example below), each with:
- writer
- time
- text
- url
- category

How It Works
1. Trigger: This workflow is triggered by another workflow (e.g., a community engagement pipeline).
2. Input Setup: Accepts the Airtop profile, search URL, and categories to use for classification.
3. Session: Starts a browser session using the Airtop profile.
4. Window Navigation: Opens the provided X search URL.
5. Extraction: Scrapes up to 10 posts with /status/ in the URL and text in English.
6. Classification: Each post is labeled with a category if relevant, or [NA] otherwise.
7. Filtering: Discards [NA] posts.
8. Output: Returns the list of classified posts.

Setup Requirements
- Airtop profile with an active X login.
- Airtop API key connected in n8n.
- List of category definitions to guide post classification (used in the prompt).

Next Steps
- **Feed into Engagement Workflows**: Pass the results to workflows that reply, retweet, or track posts.
- **Use in Slack Alerts**: Push classified posts into Slack channels for review and reaction.
- **Customize Classifier**: Refine the categorization logic to include sentiment or company mentions.

Read more about Monitoring X for Relevant Posts
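For orientation, the returned array might look like the following; every value below is a made-up placeholder, and only the field names come from the workflow's description.

```javascript
// Illustrative output shape only — handle, timestamp, text, and URL are placeholders.
const examplePosts = [
  {
    writer: "@example_handle",
    time: "2024-01-01T12:00:00Z",
    text: "We automated our browser QA checks end to end…",
    url: "https://x.com/example_handle/status/1234567890123456789",
    category: "Web automation use cases",
  },
];
```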
by bangank36
This workflow converts an exported CSV from Squarespace profiles into a Shopify-compatible format for customer import.

How It Works
Clone this Google Sheets template, which includes two sheets:
- Squarespace Profiles (Input): Go to Squarespace Dashboard → Contacts, click the three-dot icon → select Export all Contacts.
- Shopify Customers (Output): This sheet formats the data to match Shopify's customer import CSV (Shopify Dashboard → Customers → Import customers by CSV).

The workflow can run on-demand or be triggered via webhook.

Via webhook
- Set up the Webhook node to expect a POST request.
- Trigger the webhook using this code (pseudo-code) — replace {webhook-url} with the actual URL:

```javascript
const formData = new FormData();
formData.append('file', blob, 'profiles_export.csv'); // Add file to FormData — "blob" is the exported CSV as a Blob

fetch('{webhook-url}', { // Replace with your target URL
  method: 'POST',
  mode: 'no-cors',
  body: formData
});
```

- The data is processed into the Shopify Customers sheet.

Manual trigger
- Import Squarespace profiles into the sheet.
- Run the workflow to convert and populate the Shopify Customers sheet.
- Once the workflow is done, export the Shopify Customers sheet to CSV and import it into Shopify customers.

Requirements
To use this template, you need:
- Google Sheets API credentials

Google Sheets Setup
Use this sample Google Sheets template to get started quickly.

Who Is This For?
For anyone looking to automate Squarespace contact exports into a Shopify-compatible format—no more manual conversion!

Explore More Templates
Check out my other n8n templates: 👉 n8n.io/creators/bangank36
by Akhil Varma Gadiraju
📥 Gmail Attachment Backup to Google Drive — n8n Workflow

This n8n workflow automatically backs up email attachments from a specific sender in Gmail to a designated folder in Google Drive. It polls Gmail every minute and uploads any new attachments from matching emails to the specified Google Drive folder with a timestamped filename.

📌 Use Case
Primary Purpose: Automatically archive and back up attachments from a specific sender (e.g., test@gmail.com) to Google Drive for safekeeping, audit, or processing.
Ideal For:
- Automating invoice/receipt collection from a vendor
- Archiving reports from a monitored email address
- Creating a searchable historical log of attachments for compliance

🧭 Workflow Overview
Here's how the workflow operates:
1. 🔔 Gmail Trigger — Polls Gmail every minute for new messages from a specific sender (test@gmail.com).
2. 📩 Gmail Get Message — Retrieves the full contents (including attachments) of the matched email.
3. 🧠 Code (JS) — Iterates over all binary attachments in the email and restructures them as individual binary items to upload separately (a sketch of this node follows below).
4. 📤 Google Drive — Uploads each attachment to a target Google Drive folder (Docs) with a timestamp and unique name.
5. 📍 Replace Me (NoOp) — Placeholder node to indicate workflow completion. You can replace this with Slack notifications, logs, or alerts.

🔧 How to Use
Prerequisites
- An n8n instance (self-hosted or cloud)
- A connected Gmail account with OAuth2 credentials
- A connected Google Drive account with OAuth2 credentials
- Permissions for n8n to access your Gmail and Google Drive

Setup Instructions
1. Import the Workflow: Copy and paste the workflow JSON into your n8n editor.
2. Set Up Credentials: Ensure the following credentials exist and are authorized:
   - Gmail (for the Gmail nodes)
   - Google Drive (for the Google Drive node)
3. Configure the Folder: Update the folderId in the Google Drive node if you want to use a different target folder.
4. Activate the Workflow: Enable the workflow in n8n. It will start polling Gmail every minute.

✏️ How to Customize

| Task | How to Customize |
|------|------------------|
| Change sender filter | Modify the sender field in the Gmail Trigger node |
| Adjust polling frequency | Change the pollTimes configuration in the trigger node |
| Change destination folder | Update folderId in the Google Drive node |
| Modify filename format | Edit the name expression in the Google Drive node |
| Add post-upload logic | Replace or extend the Replace Me node with notifications, logs, etc. |
| Process only specific attachments | Add logic in the Code node to filter by filename or MIME type |

📂 Filename Format Example
[MessageID]_[Timestamp]_backup_attachment
This naming convention ensures uniqueness and traceability back to the original message.

✅ Future Improvements
- **Email subject filtering** to narrow down the match
- **Slack/Email notifications** after upload
- **Deduplication check** to avoid reuploading the same files
- Virus scan or file validation before upload

💬 Support
For any issues using this workflow:
- Double-check your credential permissions
- Review n8n logs for Gmail or Google Drive errors
- Visit the n8n community forums
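A minimal sketch of what the Code (JS) node does, assuming the standard n8n Code-node API ($input.all(), json/binary item structure); the property names are illustrative and may differ from the actual template.

```javascript
// Split an email carrying several binary attachments into one item per
// attachment, so the Google Drive node can upload each file separately.
const results = [];

for (const item of $input.all()) {
  const binaries = item.binary ?? {};
  for (const [key, data] of Object.entries(binaries)) {
    results.push({
      json: {
        messageId: item.json.id,   // used later in the filename
        originalProperty: key,
        fileName: data.fileName,
      },
      binary: { data },            // Google Drive node reads the "data" property
    });
  }
}

return results;
```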
by Lucía Maio Brioso
🧑💼 Who is this for?
This workflow is for anyone who receives too many emails and wants to stay informed without drowning in their inbox. If you're constantly checking your Gmail and wish you had someone summarizing messages and sending just the important parts to your phone, this is for you. Especially useful for solopreneurs, customer support, busy professionals, or newsletter addicts.

🧠 What problem is this workflow solving?
Email is powerful, but also overwhelming. Important info gets buried in threads, and staying on top of things can mean hours wasted scanning messages. This workflow turns that chaos into clarity: as soon as a new email arrives, you get a concise AI-generated summary in Telegram — straight to your pocket. No more checking Gmail constantly. No more missing key updates. Just a clean, human-style summary, written in the language you choose.

⚙️ What this workflow does
1. Watches your Gmail inbox for new messages
2. Prepares the content, including sender, subject, and message body
3. Sends it to OpenAI to generate a friendly, casual summary
4. Delivers that summary to your Telegram chat
All in seconds, completely automated.

🛠️ Setup
1. Connect your accounts: Gmail, Telegram, and OpenAI credentials must be added to the respective nodes.
2. Set your Telegram chat ID: Use a bot like @userinfobot to get it.
3. Customize the language in the Set summary language node (default is English).
4. Activate the workflow — and watch it go.

🧩 How to customize this workflow to your needs
You can make this workflow your own in a few easy ways:
- **Summarize only some emails**: Add a Filter node after the Gmail trigger (e.g., only messages from certain senders).
- **Change the tone or detail of summaries**: Tweak the system prompt in the Summary generation agent.
- **Use a different model**: Swap OpenAI's GPT-4o for another provider like Claude or DeepSeek.
- **Translate to your preferred language**: Just change "english" to "español", "français", etc.
by Intuz
This n8n template from Intuz provides a complete and automated solution for real-time financial reporting. It instantly syncs new QuickBooks invoices to Google Sheets, using specific invoice data or keywords as triggers to ensure your financial records are always accurate and up-to-date. It uses a webhook to capture every new or updated invoice and logs the essential details into a designated Google Sheet. Perfect for creating custom reports, data backups, or a real-time dashboard of your accounts receivable.

Use Cases
- **Financial Reporting:** Create a simple, shareable Google Sheet for team members who don't have QuickBooks access.
- **Data Backup:** Maintain a secure, independent log of all your invoices outside of the QuickBooks ecosystem.
- **Custom Dashboards:** Use the Google Sheet as a data source for tools like Google Data Studio or Grafana to build custom financial dashboards.
- **Auditing:** Easily track the history and status of all invoices in a simple, searchable spreadsheet format.

How it Works
1. Instant Webhook Trigger: The workflow activates the moment an invoice is created or updated in QuickBooks. The QuickBooks webhook sends a notification to n8n, kicking off the process in real time.
2. Fetch Full Invoice Details: The initial webhook notification only contains the invoice ID. This node uses that ID to make a call back to the QuickBooks API and retrieve the complete invoice data, including customer name, due date, and more.
3. Format Key Data: A simple Code node cleans up the data fetched from QuickBooks. It extracts only the fields you need—ID, Domain, Customer Name, and Due Date—and structures them perfectly for the next step (a sketch of this node follows below).
4. Append or Update in Google Sheets: The final node connects to your Google Sheet and uses the powerful "Append or Update" operation. If the ID of the invoice doesn't exist in the sheet, it adds a new row. If the ID already exists, it updates the existing row with the latest information. This ensures your Google Sheet is always a perfect mirror of your QuickBooks invoice data, preventing duplicates and keeping everything current.

Setup Instructions
For this workflow to run successfully, follow these setup steps:
1. Credentials:
   - QuickBooks: Connect your QuickBooks account credentials to n8n.
   - Google: Connect your Google account using OAuth2 credentials. Ensure the Google Sheets and Google Drive APIs are enabled.
2. QuickBooks Webhook Configuration:
   - Activate the workflow and copy the Production URL from the Webhook node.
   - In your Intuit Developer Portal, go to the webhooks section for your app.
   - Paste the URL and subscribe to Invoice events (e.g., Create, Update).
3. Google Sheet Setup:
   - Create a Google Sheet for your invoice data.
   - Crucially, create the following headers in the first row of your sheet: ID, Domain, Customer Name, Due Date.
4. Node Configuration:
   - In the Append or update row in sheet node, select your Google Sheet document and the specific sheet name from the dropdown lists.
   - The columns should map automatically if you've set up the headers correctly.

Connect with us
Website: https://www.intuz.com/cloud/stack/n8n
Email: getstarted@intuz.com
LinkedIn: https://www.linkedin.com/company/intuz
Get Started: https://n8n.partnerlinks.io/intuz
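A hedged sketch of the formatting Code node described in step 3; the response shape (Invoice, CustomerRef, DueDate) follows QuickBooks Online conventions but may differ from what your node actually receives, so treat the field paths as assumptions.

```javascript
// Extract just the four columns the sheet needs from each fetched invoice.
return $input.all().map((item) => {
  const invoice = item.json.Invoice ?? item.json; // depending on how the API node wraps it

  return {
    json: {
      ID: invoice.Id,
      Domain: invoice.domain,                      // e.g. "QBO"
      "Customer Name": invoice.CustomerRef?.name,
      "Due Date": invoice.DueDate,
    },
  };
});
```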
by Aitor | 1Node
This n8n workflow automates the generation and delivery of a daily order summary via email. It leverages an AI Agent to fetch and summarize e-commerce order data from the last 24 hours stored in Supabase, providing a concise overview of the daily business operations.

How it works
- **Scheduled Trigger:** The workflow is triggered every day at 8 AM.
- **Sender Email Configuration:** A manual step allows you to set the sender's email address.
- **AI Agent:** An AI Agent node acts as the central intelligence, interacting with various tools to gather and process data.
- **Supabase Data Fetching:** The AI Agent calls the "Get Orders," "Get Order Items," "Get Clients," and "Get Products" tables to retrieve relevant e-commerce data from your Supabase database.
- **OpenAI Chat Model:** An OpenAI Chat Model (GPT-4.1) is integrated to help the AI Agent understand and summarize the fetched data into a human-readable format.
- **Gmail Summary:** Finally, the workflow sends a summarized report to your specified email address using the "Send Gmail Summary" node.

Set up steps
This setup will take approximately 15-20 minutes.
1. Download the workflow: Download this workflow and import it into your n8n instance.
2. Configure the Daily 8am trigger: Ensure the "Daily 8am" trigger is active and set to your desired timezone.
3. Set Sender Email: In the "Set Sender Email" node, manually enter the email address you wish to use as the sender for the daily reports.
4. Configure the AI Agent:
   - Chat Model: Connect your OpenAI Chat Model credential.
   - Memory & Tools: Ensure all the necessary nodes ("Get Orders", "Get Order Items", "Get Clients", "Get Products", "Send Gmail Summary") are correctly linked to the AI Agent. In our workflow we call data from 4 tables in Supabase.
5. Configure Supabase Database Connections: For each of the "Get Orders," "Get Order Items," "Get Clients," and "Get Products" nodes, configure your Supabase credentials to access your e-commerce database, and select the tables (e.g., orders, order_items, clients, products) that you want the AI agent to pull data from in your Supabase schema.
6. Configure Gmail Credentials: In the "Send Gmail Summary" node, connect your Gmail account credentials to allow n8n to send emails on your behalf.
7. Test the workflow: Run the workflow manually to ensure all connections are working correctly and the email summary is generated as expected.

Requirements
- **n8n instance:** An active n8n instance (self-hosted or cloud).
- **Supabase Account:** A Supabase account with your e-commerce order data accessible.
- **OpenAI API Key:** An OpenAI API key for the Chat Model.
- **Gmail Account:** Gmail account credentials to send the daily summaries.

Need help? Feel free to contact us at 1 Node. Get instant access to a library of free resources we created.
by Derek Schatz
Overview
This automated workflow delivers a weekly digest of the most important AI news directly to your inbox. Every Monday at 9 AM, it uses Perplexity AI to research the latest developments and organizes them into four key categories: New Technology, Trending Topics, Top Stories, and AI Security. The workflow then formats this information into a beautifully designed HTML email with summaries, significance explanations, and source links.

What It Does
- Automatically searches for the latest AI news using Perplexity AI
- Categorizes content into four focused areas most relevant to AI enthusiasts and professionals
- Generates comprehensive summaries explaining why each story matters
- Creates a professional HTML email with styled sections and clickable links
- Sends weekly on Monday at 9 AM (customizable schedule)
- Includes error handling with fallback content if news parsing fails

Setup Instructions
1. Import the Workflow
   - Copy the JSON code and import it into your n8n instance.
   - The workflow will appear as “Daily AI News Summary”.
2. Configure the Perplexity API (the request it makes is sketched below)
   - Sign up for a Perplexity API account at perplexity.ai.
   - Create new credentials in n8n: Type: “OpenAI”, Name: “perplexity-credentials”, API Key: your Perplexity API key, Base URL: https://api.perplexity.ai
3. Set Up Email Credentials
   - Configure SMTP credentials in n8n: Name: “email-credentials”, plus your email provider’s SMTP settings.
   - Test the connection to ensure emails can be sent.
4. Customize Email Settings
   - Open the “Send Email Summary” node.
   - Update the toEmail field with your email address.
   - Modify the fromEmail if needed (must match your SMTP credentials).
5. Optional Customizations
   - **Change Schedule:** Modify the “Daily Trigger” node to run at your preferred time.
   - **Adjust Categories:** Edit the Perplexity prompt to focus on different AI topics or change the theme altogether.
   - **Modify Styling:** Update the HTML template in the “Format Email Content” node.
6. Test and Activate
   - Run a test execution to ensure everything works correctly.
   - Activate the workflow to start receiving weekly AI news summaries.

Requirements
- n8n instance (cloud or self-hosted)
- Perplexity API account and key
- SMTP email access (Gmail, Outlook, etc.)
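For reference, a minimal sketch of the research call made through the OpenAI-compatible credential pointed at Perplexity; the model name ("sonar") and the prompt are placeholders, not the template's exact settings.

```javascript
// Illustrative call to Perplexity's OpenAI-compatible chat completions endpoint.
const res = await fetch("https://api.perplexity.ai/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.PERPLEXITY_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "sonar", // placeholder — use whichever model the node is configured with
    messages: [
      {
        role: "user",
        content:
          "Summarize this week's most important AI news in four categories: " +
          "New Technology, Trending Topics, Top Stories, and AI Security. " +
          "Return JSON with title, summary, significance, and source URL per story.",
      },
    ],
  }),
});

const data = await res.json();
console.log(data.choices[0].message.content); // parsed later by the Format Email Content node
```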