by Harshil Agrawal
This workflow allows you to add articles to a Notion reading list by issuing a Discord slash command.

Prerequisites
- A Notion account and credentials, and a reading list similar to this template.
- A Discord account and credentials, and a Discord Slash Command connected to n8n.

Nodes
- Webhook node: triggers the workflow whenever the Discord slash command is issued.
- IF node: checks the type returned by Discord. If the type is not equal to 1, it returns true; otherwise false.
- HTTP Request node: makes an HTTP call to the submitted link and fetches the HTML of the webpage.
- HTML Extract node: extracts the title from the HTML, which is used by the Notion node.
- Notion node: adds the link to your Notion reading list.
- Set nodes: set the reply values for Discord and register the Interaction Endpoint URL.
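As a rough illustration of what the IF node distinguishes (a sketch, not part of the template): Discord verifies an Interaction Endpoint URL by sending a PING interaction with type 1, which must be answered with a PONG (type 1); any other type carries the actual slash-command payload. The field names below follow Discord's interaction object, while the handler and the option name "link" are hypothetical stand-ins for the Webhook/IF/Set nodes.

```typescript
// Minimal sketch of the branching the IF node performs on Discord's payload.
// `interaction` mirrors Discord's interaction object; the handler is hypothetical.
interface DiscordInteraction {
  type: number; // 1 = PING (endpoint verification), 2 = application command
  data?: { name: string; options?: { name: string; value: string }[] };
}

function handleInteraction(interaction: DiscordInteraction) {
  if (interaction.type === 1) {
    // Endpoint verification: Discord expects a PONG response.
    return { type: 1 };
  }
  // Otherwise it is a real command: pass the submitted URL on to the Notion step.
  // "link" is an assumed option name for the article URL.
  const url = interaction.data?.options?.find((o) => o.name === "link")?.value;
  return { type: 4, data: { content: `Added to reading list: ${url ?? "unknown"}` } };
}
```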
by Codez & AI
Overview
This n8n workflow automates the process of extracting published WordPress posts, converting them into a CSV file, and uploading it to Google Drive. It's perfect for content backups, SEO audits, and data migration.

Features
- Fetches all published posts from a WordPress website
- Extracts key post details (ID, Title, Link)
- Converts the extracted data into a CSV file
- Uploads the CSV file to Google Drive for easy access and storage

Use Cases
- **SEO Optimization**: Export post data for keyword analysis and performance tracking
- **Automated Content Backup**: Store WordPress post details in Google Drive. You can add more fields to the CSV file if needed

Workflow Steps
1. Trigger Workflow Manually: The workflow starts when triggered manually in n8n.
2. Retrieve WordPress Posts: The workflow fetches all published posts using the WordPress API and extracts the Post ID, Title, Link, and Rendered Content.
3. Format Data: The retrieved data is structured to ensure correct CSV formatting.
4. Convert to CSV File: The formatted data is transformed into a downloadable CSV file.
5. Upload to Google Drive: The CSV file is automatically uploaded to a specified Google Drive folder for easy access and storage.

How to Use
- Connect your WordPress and Google Drive accounts to n8n.
- Run the workflow manually or set up a scheduled trigger.
- Access the CSV file from your Google Drive folder.
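For context on step 2, the WordPress REST API exposes published posts under /wp-json/wp/v2/posts. The sketch below is an illustration rather than the template's actual node configuration; the site URL is a placeholder.

```typescript
// Sketch: fetch posts and keep the fields the workflow exports (ID, Title, Link).
// https://example-blog.com is a placeholder for your WordPress site;
// published posts are returned by default for unauthenticated requests.
async function fetchPublishedPosts() {
  const res = await fetch("https://example-blog.com/wp-json/wp/v2/posts?per_page=100");
  const posts: Array<{
    id: number;
    link: string;
    title: { rendered: string };
    content: { rendered: string };
  }> = await res.json();

  // Rows in the shape that later becomes the CSV file.
  return posts.map((p) => ({ id: p.id, title: p.title.rendered, link: p.link }));
}
```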
by Yaron Been
Openai Clip Image Generator

Description
Official CLIP models: generate CLIP (clip-vit-large-patch14) text & image embeddings.

Overview
This n8n workflow integrates with the Replicate API to use the openai/clip model, which generates text and image embeddings from your inputs.

Features
- Easy integration with Replicate API
- Automated status checking and result retrieval
- Support for all model parameters
- Error handling and retry logic
- Clean output formatting

Parameters
Optional Parameters
- **text** (string, default: None): Input text to encode
- **image** (string, default: None): Input image to encode

How to Use
- Set up your Replicate API key in the workflow
- Configure the required parameters for your use case
- Run the workflow to generate the embeddings
- Access the generated output from the final node

API Reference
- Model: openai/clip
- API Endpoint: https://api.replicate.com/v1/predictions

Requirements
- Replicate API key
- n8n instance
- Basic understanding of the model's input parameters
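A rough sketch of the kind of request the workflow's HTTP nodes make against the Replicate predictions endpoint. The model version hash is a placeholder you would look up on the openai/clip model page, and polling/error handling are simplified.

```typescript
// Sketch: create a prediction for openai/clip and read back its output.
// REPLICATE_API_TOKEN and the version hash are placeholders.
async function runClip(text: string) {
  const create = await fetch("https://api.replicate.com/v1/predictions", {
    method: "POST",
    headers: {
      Authorization: `Token ${process.env.REPLICATE_API_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      version: "<openai/clip version hash>",
      input: { text }, // or { image: "https://..." } to embed an image
    }),
  });
  let prediction = await create.json();

  // Poll until the prediction finishes (the workflow's "status checking" step).
  while (prediction.status !== "succeeded" && prediction.status !== "failed") {
    await new Promise((r) => setTimeout(r, 1000));
    const poll = await fetch(prediction.urls.get, {
      headers: { Authorization: `Token ${process.env.REPLICATE_API_TOKEN}` },
    });
    prediction = await poll.json();
  }
  return prediction.output;
}
```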
by Yaron Been
Telegram AI Assistant: Summarize Links & Generate Images On Demand

This workflow turns any Telegram chat into a smart assistant. By typing simple commands like /summary or /img, users can trigger powerful AI actions directly from Telegram.

✨ What It Does
This automation listens for specific commands in Telegram messages:
- /help: Sends a help menu explaining available commands.
- /summary <link>: Fetches a webpage, extracts its content, and summarizes it using OpenAI into 10–12 bullet points.
- /img <prompt>: Sends the image prompt to OpenAI and replies that the request has been received (designed for future integration with image APIs).

📦 Features
- ✅ Works instantly in Telegram
- 🧠 Uses OpenAI for text summarization and image prompt processing
- 🌐 Scrapes and cleans raw article text before summarizing
- 📤 Replies directly to the same Telegram thread
- 🔧 Easily expandable to support more commands

🔧 Use Cases
- **Research Summaries**: Quickly condense articles or reports shared in chat.
- **Content Review**: Get team-friendly TL;DRs of long blog posts or product pages.
- **Creative Brainstorming**: Share visual ideas via /img and get quick prompts logged.
- **Customer Support**: Offer instant answers in group chats (with further extension).
- **Daily Digest Bot**: Connect to news feeds and auto-summarize updates.

🚀 Getting Started
1. Clone this workflow and connect your Telegram Bot.
2. Insert your OpenAI credentials.
3. Deploy and test by messaging /summary https://example.com in your Telegram group or DM.
4. Expand with new commands or connect Stability.ai or other services for real image generation.

🔗 Author & Resources
Built by Yaron Been
Follow more automations at nofluff.online
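To make the routing concrete, here is a small sketch of the command parsing the workflow's branching logic performs on an incoming Telegram message text (illustrative only; types and names are not taken from the workflow itself):

```typescript
// Sketch: route a Telegram message text to one of the supported commands.
type Command =
  | { kind: "help" }
  | { kind: "summary"; url: string }
  | { kind: "img"; prompt: string }
  | { kind: "unknown" };

function parseCommand(text: string): Command {
  if (text.startsWith("/help")) return { kind: "help" };
  if (text.startsWith("/summary ")) {
    return { kind: "summary", url: text.slice("/summary ".length).trim() };
  }
  if (text.startsWith("/img ")) {
    return { kind: "img", prompt: text.slice("/img ".length).trim() };
  }
  return { kind: "unknown" };
}

// Example: parseCommand("/summary https://example.com")
// -> { kind: "summary", url: "https://example.com" }
```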
by Sirhexalot
This n8n workflow allows you to reset all user roles in Zammad to specified default roles. It ensures consistency in role management across your Zammad instance.

Features
- Retrieve all active users from Zammad.
- Update each user's roles to predefined default role IDs.
- Exclude specific users by their IDs from the update process.
- Simple configuration for default roles and excluded users.

Usage
1. Import the Workflow: Upload the provided .json file into your n8n instance.
2. Configure Variables:
   - zammad_base_url: Your Zammad instance URL.
   - zammad_api_key: Your Zammad API key.
   - default_roles: List of default role IDs to apply to all users.
   - exclude_zammad_users_by_id: List of user IDs to exclude from the update.
3. Run the Workflow: Execute the workflow to update roles automatically.

Issues and Suggestions
For issues or suggestions, visit the GitHub Repository.
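For orientation, a sketch of the underlying API calls the workflow issues, assuming Zammad's standard REST API (`role_ids` is the field Zammad uses for a user's role assignments; variable names mirror the workflow's configuration, all values are placeholders):

```typescript
// Sketch: reset every active user's roles to the configured defaults,
// skipping excluded user IDs. Pagination and error handling are omitted.
const zammadBaseUrl = "https://your-instance.zammad.com";
const zammadApiKey = "<zammad_api_key>";
const defaultRoles = [2];             // default_roles
const excludedUserIds = new Set([1]); // exclude_zammad_users_by_id

const headers = {
  Authorization: `Token token=${zammadApiKey}`,
  "Content-Type": "application/json",
};

async function resetRoles() {
  const res = await fetch(`${zammadBaseUrl}/api/v1/users?per_page=100`, { headers });
  const users: Array<{ id: number; active: boolean }> = await res.json();

  for (const user of users) {
    if (!user.active || excludedUserIds.has(user.id)) continue;
    await fetch(`${zammadBaseUrl}/api/v1/users/${user.id}`, {
      method: "PUT",
      headers,
      body: JSON.stringify({ role_ids: defaultRoles }),
    });
  }
}
```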
by Sirhexalot
This n8n workflow allows you to update user roles in Zammad based on data from an Excel file. The workflow automates role assignments, ensuring efficient and consistent updates.

Features
- **Excel Integration**: Import user data from an Excel file containing emails and role assignments.
- **Dynamic Updates**: Match Zammad users by email and update their roles.
- **Error Handling**: Continue workflow execution even if some updates fail.
- **Customizable Variables**: Configure Zammad API URL, API key, and Excel file URL.

Usage
1. Import the Workflow: Upload the provided .json file into your n8n instance.
2. Set Variables:
   - zammad_base_url: Your Zammad instance URL.
   - excel_source_url: URL of the Excel file containing user data.
3. Authentication for Zammad: In the "Find Zammad User by email" and "Update User Roles" nodes, create a Header Auth credential with:
   - **Name**: Authorization
   - **Value**: Bearer <put your Zammad API token here>
4. Run the Workflow: Execute the workflow to update user roles based on the Excel data.

Issues and Suggestions
For issues or suggestions, visit the GitHub Repository.
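A sketch of what those two nodes do against the Zammad REST API, assuming the standard user search and update endpoints; the base URL and token are placeholders, and pagination/error handling are omitted:

```typescript
// Sketch: look up a Zammad user by email and update their roles,
// mirroring the "Find Zammad User by email" and "Update User Roles" nodes.
const baseUrl = "https://your-instance.zammad.com"; // zammad_base_url
const headers = {
  Authorization: "Bearer <your Zammad API token>",
  "Content-Type": "application/json",
};

async function updateRolesByEmail(email: string, roleIds: number[]) {
  const search = await fetch(
    `${baseUrl}/api/v1/users/search?query=${encodeURIComponent(email)}&limit=1`,
    { headers }
  );
  const [user] = (await search.json()) as Array<{ id: number }>;
  if (!user) return; // no match: skip this row, the workflow continues on failure

  await fetch(`${baseUrl}/api/v1/users/${user.id}`, {
    method: "PUT",
    headers,
    body: JSON.stringify({ role_ids: roleIds }),
  });
}
```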
by Sascha
Campaign tracking is pivotal; it enables marketers to evaluate the efficacy of various strategies and channels. UTM parameters are particularly essential as they provide granular details about the source, medium, and campaign effectiveness. However, when this data is not automatically integrated into a centralized system, collating and analyzing it manually can become a tedious and error-prone process. Retrieving UTM data from Shopify and storing it in Baserow enables you to do more with this data. For example, you could build a campaign database in Baserow and automatically add campaign revenue to it using this workflow template.

This template will help you:
- Automatically retrieve UTM parameters from Shopify orders using the Shopify Admin API
- Process marketing data through n8n
- Store this data in Baserow, providing you with a dynamic, responsive base for campaign tracking and decision-making

This template will demonstrate the following concepts in n8n:
- use the Schedule trigger node
- use the GraphQL node to call the Shopify Admin API
- split larger incoming datasets into n8n items with the Split node
- transform the data structure with the Set node
- control flow with the If node
- store data in Baserow with the Baserow node

How to get started?
- Create a custom app in Shopify to get the credentials needed to connect n8n to Shopify. This is needed for the Shopify Trigger.
- Create Shopify Access Token API credentials in n8n for the Shopify trigger node.
- Create Header Auth credentials: use X-Shopify-Access-Token as the name and the Access Token from the Shopify app you created as the value. The Header Auth is necessary for the GraphQL nodes.
- You will need a running Baserow instance for this. You can also sign up for a free account at https://baserow.io/

Please make sure to read the notes in the template. For a detailed explanation please check the corresponding video: https://youtu.be/VBeN-3129RM
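To illustrate the GraphQL step, a query of roughly this shape pulls UTM parameters (and order revenue) from the Shopify Admin API. This is a sketch based on Shopify's customerJourneySummary and UTMParameters types, not the exact query shipped in the template; adjust the date filter and API version to your shop.

```typescript
// Sketch: the kind of query the GraphQL node sends to the Shopify Admin API.
const utmQuery = `
  query RecentOrdersWithUtm {
    orders(first: 50, query: "created_at:>=2024-01-01") {
      edges {
        node {
          name
          totalPriceSet { shopMoney { amount currencyCode } }
          customerJourneySummary {
            lastVisit {
              utmParameters { source medium campaign content term }
            }
          }
        }
      }
    }
  }
`;
// Sent as a POST to https://<your-shop>.myshopify.com/admin/api/<version>/graphql.json
// with the X-Shopify-Access-Token header described above.
```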
by Dataki
This workflow demonstrates how to enrich data from a list of companies in a spreadsheet. While this workflow is production-ready if all steps are followed, adding error handling would enhance its robustness.

Important notes
- **Check legal regulations**: This workflow involves scraping, so make sure to check the legal regulations around scraping in your country before getting started. Better safe than sorry!
- **Mind those tokens**: OpenAI tokens can add up fast, so keep an eye on usage unless you want a surprising bill that could knock your socks off! 💸

Main Workflow
Node 1 - Webhook
This node triggers the workflow via a webhook call. You can replace it with any other trigger of your choice, such as a form submission, a new row added in Google Sheets, or a manual trigger.

Node 2 - Get Rows from Google Sheet
This node retrieves the list of companies from your spreadsheet. Here is the Google Sheet Template you can use. The columns in this Google Sheet are:
- **Company**: The name of the company
- **Website**: The website URL of the company
These two fields are required at this step.
- **Business Area**: The business area deduced by OpenAI from the scraped data
- **Offer**: The offer deduced by OpenAI from the scraped data
- **Value Proposition**: The value proposition deduced by OpenAI from the scraped data
- **Business Model**: The business model deduced by OpenAI from the scraped data
- **ICP**: The Ideal Customer Profile deduced by OpenAI from the scraped data
- **Additional Information**: Information related to the scraped data, including:
  - Information Sufficiency: Indicates if the information was sufficient to provide a full analysis. Options: "Sufficient" or "Insufficient"
  - Insufficient Details: If labeled "Insufficient", specifies what information was missing or needed to complete the analysis.
  - Mismatched Content: Indicates whether the page content aligns with that of a typical company page.
  - Suggested Actions: Provides recommendations if the page content is insufficient or mismatched, such as verifying the URL or searching for alternative sources.

Node 3 - Loop Over Items
This node ensures that, in subsequent steps, the website in "extra workflow input" corresponds to the row being processed. You can delete this node, but you'll need to ensure that the "query" sent to the scraping workflow corresponds to the website of the specific company being scraped (rather than just the first row).

Node 4 - AI Agent
This AI agent is configured with a prompt to extract data from the content it receives. The node has three sub-nodes:
- OpenAI Chat Model: The model used is currently gpt-4o-mini.
- Call n8n Workflow: This sub-node calls the workflow that uses ScrapingBee and retrieves the scraped data.
- Structured Output Parser: This parser structures the output for clarity and ease of use, and then adds rows to the Google Sheet.

Node 5 - Update Company Row in Google Sheet
This node updates the specific company's row in Google Sheets with the enriched data.

Scraper Agent Workflow
Node 1 - Tool Called from Agent
This is the trigger for when the AI Agent calls the Scraper. A query is sent with:
- Company name
- Website (the URL of the website)

Node 2 - Set Company URL
This node renames a field, which may seem trivial but is useful for performing transformations on data received from the AI Agent.

Node 3 - ScrapingBee: Scrape Company's Website
This node scrapes data from the URL provided using ScrapingBee. You can use any scraper of your choice, but ScrapingBee is recommended, as it allows you to configure scraper behavior directly. Once configured, copy the provided "curl" command and import it into n8n.

Node 4 - HTML to Markdown
This node converts the scraped HTML data to Markdown, which is then sent to OpenAI. The Markdown format generally uses fewer tokens than HTML.

Improving the Workflow
It's always a pleasure to share workflows, but creators sometimes want to keep some magic to themselves ✨. Here are some ways you can enhance this workflow:
- Handle potential errors
- Configure the scraper tool to scrape other pages on the website. Although this will cost more tokens, it can be useful (e.g., scraping "Pricing" or "About Us" pages in addition to the homepage).
- Instead of Google Sheets, connect directly to your CRM to enrich company data.
- Trigger the workflow from form submissions on your website and send the scraped data about the lead to a Slack or Teams channel.
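As an illustration of what the ScrapingBee node ends up doing, here is a sketch of a call to ScrapingBee's standard HTML API; the API key and target URL are placeholders, and the exact parameters come from the curl command you export from ScrapingBee.

```typescript
// Sketch: fetch a company's homepage HTML through ScrapingBee,
// roughly what the imported curl command configures in the HTTP Request node.
async function scrapeCompanySite(websiteUrl: string) {
  const params = new URLSearchParams({
    api_key: process.env.SCRAPINGBEE_API_KEY ?? "<your key>",
    url: websiteUrl,
    render_js: "false", // flip to "true" for JavaScript-heavy sites (costs more credits)
  });
  const res = await fetch(`https://app.scrapingbee.com/api/v1/?${params}`);
  const html = await res.text();
  // The workflow then converts this HTML to Markdown before passing it to OpenAI,
  // since Markdown generally uses fewer tokens than raw HTML.
  return html;
}
```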
by Marcel Claus-Ahrens
This workflow downloads all files from a specific folder in an S3 bucket and compresses them into a single zip file, so you can download it via n8n or run further processing. Fill in your credentials and settings in the nodes marked with "*".

It might serve well as a blueprint, or as a manual download for S3 folders. Since I found it rather tricky to compress all binary files into one zip file, I figured it might be an interesting template.

Hint: an expression is used in the "Compress" node to collect every binary key so they can be compressed dynamically (see the sketch below).

Enjoy the Workflow! ❤️
https://let-the-work-flow.com
Workflow Automation & Development
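The original expression is not reproduced here; as an assumption, an expression of this kind typically enumerates the item's binary properties so the Compression node receives all of them, along these lines:

```typescript
// Sketch (assumption, not the template's exact expression).
// In n8n, an expression field can access the current item's binary data via $binary,
// so an expression like {{ Object.keys($binary).join(',') }} produces a
// comma-separated list of binary property names for the Compress node.
// The object below just mimics what $binary looks like after downloading three files.
const $binary = { data: {}, data_1: {}, data_2: {} };
const binaryPropertyNames = Object.keys($binary).join(",");
// -> "data,data_1,data_2"
```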
by Anthony
This workflow recognizes receipts and invoices in a folder (make sure your files are in .pdf, .png, or .jpg format). The workflow can be triggered via the "Test workflow" button, and it also monitors the folder for new files, automatically recognizing them.

Video Demo
https://youtu.be/mGPt7fqGQD8

1. n8n import glitch
After import, the trigger node "When clicking 'Test workflow'" might be disconnected. You need to connect it via 2 arrows to the "Google Sheets1" and "Google Drive" nodes. So, the workflow has 2 triggers - via the button, and via the Google Sheets "new file" event - and both of these triggers should be connected to the same 2 nodes. Here is how it should look: https://ocr.oakpdf.com/n8n_fix.png

2. Set up RapidAPI HTTP auth key
Create a new "HTTP header" n8n credential and paste your RapidAPI key from https://rapidapi.com/restyler/api/receipt-and-invoice-ocr-api into it: https://ocr.oakpdf.com/n8n_api_key.png
Make sure the "HTTP Request" node uses this credential.

3. Set up your Google Auth
You need a Google connection to work with your Google Sheets and Google Drive accounts: https://docs.n8n.io/integrations/builtin/credentials/google/oauth-generic/#finish-your-n8n-credential

4. Set up Google Sheets
Copy this Google Sheets document: https://docs.google.com/spreadsheets/d/1G0w-OMdFRrtvzOLPpfFJpsBVNqJ9cfRLMKCVWfrTQBg/edit?usp=sharing

Custom document formats and advanced usage
Email: contact@scrapeninja.net
Linkedin: https://www.linkedin.com/in/anthony-sidashin/
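For reference, RapidAPI-hosted APIs are conventionally authenticated with an X-RapidAPI-Key header. The sketch below shows the shape of such a credential; the header names follow RapidAPI's conventions and the host is an assumption derived from the API's URL slug, so double-check both against the API page.

```typescript
// Sketch: header-auth values used by the "HTTP Request" node.
// In n8n these belong in the "HTTP header" credential rather than being hard-coded.
const rapidApiHeaders = {
  "X-RapidAPI-Key": process.env.RAPIDAPI_KEY ?? "<your RapidAPI key>",
  "X-RapidAPI-Host": "receipt-and-invoice-ocr-api.p.rapidapi.com", // assumed host
};
```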
by Ludwig
Overview
This template helps n8n cloud plan users export all executions to a CSV for easy data analysis, so you can identify which workflows generate the most executions or could be optimized.

How this workflow works
1. Click "Test Workflow" to manually execute the workflow
2. Open the "Convert to CSV" node to access the binary data of the CSV file
3. Download the CSV file

Nodes included:
- n8n node
- Convert to File
- No Operation, do nothing - replace with another node

Set up steps
- Import the workflow to your workspace
- Add your n8n API credential

Benefits of Exporting n8n Cloud Executions to CSV
Exporting n8n Cloud executions to CSV offers significant advantages for enhancing workflow management and data analysis capabilities. Here are three key benefits:

Enhanced Data Analysis
- Comprehensive Insights: Exporting execution data allows for in-depth analysis of workflow performance, helping identify bottlenecks and optimize processes.
- Custom Reporting: CSV files can be easily imported into various data analysis tools (e.g., Excel, Google Sheets, or BI software) to create custom reports and visualizations tailored to specific business needs.

Improved Workflow Monitoring
- Historical Data Review: Accessing historical execution data enables users to track workflow changes and their impacts over time, facilitating better decision-making.
- Error Tracking and Debugging: By reviewing execution logs, users can quickly identify and address errors or failures, ensuring smoother and more reliable workflow operations.

Regulatory Compliance and Auditing
- Audit Trails: Keeping a record of all executions provides a clear audit trail, essential for regulatory compliance and internal audits.
- Data Retention: Exported data ensures that execution records are preserved according to organizational data retention policies, safeguarding against data loss.

By leveraging the capabilities of CSV exports, users can gain valuable insights, streamline workflow management, and ensure robust data handling practices, ultimately driving better performance and efficiency in their n8n Cloud operations.
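Under the hood, the n8n node uses your n8n API credential to call the instance's public REST API. A rough sketch of the equivalent call (the instance URL is a placeholder, and cursor-based pagination is left out for brevity):

```typescript
// Sketch: pull executions from the n8n public API and flatten them into CSV-ready rows.
async function fetchExecutions() {
  const res = await fetch(
    "https://<your-instance>.app.n8n.cloud/api/v1/executions?limit=100",
    { headers: { "X-N8N-API-KEY": process.env.N8N_API_KEY ?? "<your API key>" } }
  );
  const { data } = await res.json();

  // One row per execution; these columns map naturally onto the exported CSV.
  return data.map((e: any) => ({
    id: e.id,
    workflowId: e.workflowId,
    status: e.status,
    startedAt: e.startedAt,
    stoppedAt: e.stoppedAt,
  }));
}
```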
by Zacharia Kimotho
This workflow helps marketers verify and update email data using the EffiBotics Email Verifier API.

Copy this sheet and create a list with emails, as in this example: https://docs.google.com/spreadsheets/d/1rzuojNGTaBvaUEON6cakQRDva3ueGg5kNu9v12aaSP4/edit#gid=0

The trigger checks for any change in the number of rows present in the sheet and keeps the verified emails in Google Sheets up to date. Once you update a cell, the new data is read and the email is checked for validity. The results are then updated in real time on the sheet.

Happy Emailing!