by ARRE
Good to know: This workflow automatically processes product images from Google Drive, generates AI-powered background prompts using your choice of AI model (ChatGPT, Claude, or Groq), creates professional background scenes using Pixelcut.ai, and saves the enhanced images back to your Google Drive. Perfect for e-commerce businesses and product photography workflows.

Who is this for?
➖ E-commerce store owners who need professional product backgrounds
➖ Product photographers looking to automate background generation
➖ Marketing teams creating consistent product imagery
➖ Small businesses wanting to enhance their product photos without expensive studio setups
➖ Anyone who needs to quickly transform transparent product images into commercial-ready photos

What problem is this workflow solving?
This workflow solves the challenge of creating professional product photography backgrounds at scale. Instead of manually editing each product image or setting up expensive photo shoots, it automatically generates contextually appropriate backgrounds for your products using AI. It eliminates the time-consuming process of background creation while maintaining professional quality and consistency across your product catalog.

What this workflow does:
✅ Automatically fetches product images from your Google Drive folder
✅ Downloads transparent/background-free product images
✅ Uses advanced AI models (ChatGPT, Claude, or Groq) to generate intelligent background prompts based on product analysis
✅ Creates professional backgrounds using the Pixelcut.ai API with AI-generated or custom prompts
✅ Saves enhanced product images back to Google Drive with organized naming
✅ Processes multiple images in batch automatically

How it works:
1️⃣ A Google Drive node searches for PNG product images in your specified folder
2️⃣ A binary download node retrieves the actual image files for processing
3️⃣ An optional AI agent analyzes each product using your chosen AI model (OpenAI GPT-4, Claude, or Groq) and generates an appropriate background prompt
4️⃣ The Pixelcut.ai API processes the images and adds professional backgrounds using the AI-generated or manual prompts
5️⃣ Enhanced images are automatically saved back to Google Drive with an "enhanced-" prefix

How to use:
Set up Google Drive OAuth2 credentials in n8n
Create a Pixelcut.ai account and get your API key
Configure your source folder ID in the Google Drive nodes
Set up your output folder ID for enhanced images
Choose and configure your preferred AI model credentials (OpenAI for ChatGPT, Anthropic for Claude, or Groq)
Replace placeholder API keys with your actual credentials
Execute the workflow to process your product images

Requirements:
✅ n8n instance (cloud or self-hosted)
✅ Google Drive account with OAuth2 access
✅ Pixelcut.ai API account and key
✅ Product images in PNG format (transparent backgrounds recommended)
✅ AI API credentials for automatic prompt generation (choose from): OpenAI API (for ChatGPT/GPT-4), Anthropic API (for Claude), Groq API (for fast inference)
✅ Basic understanding of n8n workflows

Customizing this workflow:
🟢 Modify the image format filter to support JPG, WEBP, or other formats
🟢 Switch between different AI models (ChatGPT, Claude, Groq) for prompt generation
🟢 Customize background prompts for different product categories (see the sketch after this list)
🟢 Add a background removal step for products with existing backgrounds
🟢 Switch to different AI background services (Deep-Image.ai, Remove.bg, etc.)
🟢 Configure different AI model parameters for varied prompt creativity
🟢 Add image resizing or quality optimization steps
🟢 Create multiple output folders for different product categories
🟢 Add error handling and retry mechanisms for failed processes
🟢 Implement A/B testing with different AI models for prompt quality comparison
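If you skip the optional AI agent and want per-category prompts instead, a small Code node placed before the Pixelcut.ai call can pick one. The sketch below is illustrative only: the category keywords, prompt texts, and field names are assumptions, not part of the original template; only the "enhanced-" filename prefix comes from the workflow description.

```javascript
// Minimal sketch for an n8n Code node (Run Once for Each Item).
// Assumption: the incoming item carries the original file name as json.name.
const name = ($json.name || '').toLowerCase();

// Hypothetical category-to-prompt mapping; adjust to your own catalog.
const prompts = [
  { match: /mug|cup|bottle/, prompt: 'on a rustic wooden kitchen table, soft morning light' },
  { match: /shoe|sneaker/,   prompt: 'on a concrete studio floor, dramatic side lighting' },
  { match: /cream|serum/,    prompt: 'on a marble surface with eucalyptus leaves, spa mood' },
];
const fallback = 'on a clean studio backdrop with soft shadows';
const chosen = prompts.find(p => p.match.test(name));

return {
  json: {
    ...$json,
    backgroundPrompt: chosen ? chosen.prompt : fallback,
    // "enhanced-" prefix used when saving the result back to Google Drive.
    outputName: `enhanced-${$json.name || 'product.png'}`,
  },
};
```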
by Daniel Shashko
This workflow enables you to automate the daily monitoring of how an AI model (like ChatGPT) responds to specific queries relevant to your market. It identifies mentions of your brand and predefined competitors, logs detailed interactions in Google Sheets, and delivers a comprehensive email report.

Main Use Cases
Monitor how your brand is mentioned by AI in response to relevant user queries.
Track mentions of key competitors to understand the AI's comparative positioning.
Gain insights into the AI's current knowledge and portrayal of your brand and market landscape.
Automate daily intelligence gathering on AI-driven brand perception.

How it works
The workflow operates as a scheduled process, organized into these stages:

Configuration & Scheduling
Triggers daily (or can be run manually). Key variables are defined within the workflow: your brand name (e.g., "YourBrandName"), a list of queries to ask the AI, and a list of competitor names to track in responses.

AI Querying
For each predefined query, the workflow sends a request to the OpenAI ChatGPT API (via an HTTP Request node).

Response Analysis
Each AI response is processed by a Code node to:
Check if your brand name is mentioned (case-insensitive).
Identify if any of the listed competitors are mentioned (case-insensitive).
Extract the core AI response content (limited to 500 characters for brevity in logs/reports).
A minimal sketch of this analysis step appears after this description.

Data Logging to Google Sheets
Detailed results for each query (timestamp, date, the query itself, query index, your brand name, the AI's response, whether your brand was mentioned, and any errors) are appended to a specified Google Sheet.

Email Report Generation
A comprehensive HTML email report is compiled. This report summarizes:
Total queries processed, number of times your brand was mentioned, total competitor mentions, and any errors encountered.
A summary of competitor mentions, listing each competitor and how many times they were mentioned.
A detailed table listing each query, whether your brand was mentioned, and which competitors (if any) were mentioned in the AI's response.

Automated Reporting
The generated HTML email report is sent to the specified recipients, providing a daily snapshot of AI interactions.

Summary Flow:
Schedule/Workflow Trigger → Initialize Brand, Queries, Competitors (in Code node) → For each Query: Query ChatGPT API → Process AI Response (Check for Brand & Competitor Mentions) → Log Results to Google Sheets → Generate Consolidated HTML Email Report → Send Email Notification

Benefits:
Fully automated daily monitoring of AI responses concerning your brand and competitors.
Provides objective insights into how AI models are representing your brand in user interactions.
Delivers actionable competitive intelligence by tracking competitor mentions.
Centralized logging in Google Sheets for historical analysis and trend spotting.
Easily customizable with your specific brand, queries, competitor list, and reporting recipients.
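The mention check itself is plain string matching. Here is a minimal sketch of what that Code node could look like; the incoming field names (`aiResponse`, `brandName`, `competitors`, `query`) are illustrative assumptions, not the template's exact schema.

```javascript
// Minimal sketch of the response-analysis step, for an n8n Code node
// (Run Once for Each Item). Field names are assumptions for illustration.
const response = $json.aiResponse || '';
const brand = $json.brandName || 'YourBrandName';
const competitors = $json.competitors || ['CompetitorA', 'CompetitorB'];

const lower = response.toLowerCase();

// Case-insensitive checks for the brand and each competitor.
const brandMentioned = lower.includes(brand.toLowerCase());
const competitorsMentioned = competitors.filter(c => lower.includes(c.toLowerCase()));

return {
  json: {
    timestamp: new Date().toISOString(),
    query: $json.query,
    brandName: brand,
    brandMentioned,
    competitorsMentioned,
    // Keep only the first 500 characters for the log and the email report.
    response: response.slice(0, 500),
  },
};
```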
by Jonathan
You can still use an app in a workflow even if n8n doesn't have a dedicated node or operation for it. With the HTTP Request node, you can call any API endpoint and use the incoming data in your workflow.

Main use cases:
Connect with apps and services that n8n doesn't have an integration with
Web scraping

How it works
This workflow can be divided into three branches, each serving a distinct purpose:

1. Splitting into Items (HTTP Request - Get Mock Albums):
The workflow initiates with a manual trigger (On clicking 'execute'). It performs an HTTP request to retrieve mock albums data from "https://jsonplaceholder.typicode.com/albums". The returned data is split into items using the Item Lists node, making it easier to work with.

2. Data Scraping (HTTP Request - Get Wikipedia Page and HTML Extract):
Another branch of the workflow fetches a random Wikipedia page using an HTTP request to "https://en.wikipedia.org/wiki/Special:Random". The HTML Extract node then extracts the article title from the fetched Wikipedia page.

3. Handling Pagination (GitHub API request):
The final branch deals with pagination for a GitHub API request. It sends an HTTP request to "https://api.github.com/users/that-one-tom/starred", with parameters like the page number and items per page dynamically set by the Set node. The workflow uses a condition (If - Are we finished?) to check whether there are more pages to retrieve and increments the page number accordingly (Set - Increment Page). This repeats until all pages are fetched, allowing for comprehensive data retrieval. A minimal sketch of this pagination loop is shown after this description.
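The same pagination logic, written as a small standalone script rather than If/Set nodes, looks roughly like this. The endpoint is the one used in the workflow; `per_page` and `page` are standard GitHub API query parameters, and the stop condition (a page shorter than `per_page`) mirrors the "Are we finished?" check.

```javascript
// Standalone sketch (Node 18+, global fetch) of the pagination loop.
async function fetchAllStarred(user, perPage = 30) {
  const all = [];
  let page = 1;
  while (true) {
    const res = await fetch(
      `https://api.github.com/users/${user}/starred?per_page=${perPage}&page=${page}`
    );
    const items = await res.json();
    all.push(...items);
    // "Are we finished?": a short (or empty) page means there is nothing left to fetch.
    if (!Array.isArray(items) || items.length < perPage) break;
    page += 1; // "Increment Page"
  }
  return all;
}

fetchAllStarred('that-one-tom').then(repos => console.log(`${repos.length} starred repos`));
```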
by Nasser
For who?
Content creators
YouTube automation channels
Marketing teams

How it works
1 - Enter the ID of the YouTube channel to trigger the workflow when a new video is posted
2 - Apify scrapes the latest video of the channel
3 - Wait until the dataset is completed in Apify, then fetch it
4 - Check whether metadata has already been generated; if not, generate it with an LLM
5 - Format all the generated data and update the YouTube video

📺 YouTube Video Tutorial:

SETUP
Setup Input (YouTube Channel):
Go to the channel's page on YouTube and look at the URL of the page. The channel ID is the value that comes after channel/ in the URL. Add it after "?channel_id=" (a small example appears at the end of this description). You can also use free tools to retrieve a channel ID.

Setup Output (YouTube Video Update):
Connect your YouTube account to your n8n instance via the Google Cloud Console. You can find tutorials by searching "youtube api OAuth" on Google.

APIs:
For the following third-party integrations, replace ==[YOUR_API_TOKEN]== with your API token or connect your account via Client ID / Secret to your n8n instance:
Apify: https://docs.apify.com/api/v2/getting-started
YouTube: https://docs.n8n.io/integrations/builtin/app-nodes/n8n-nodes-base.youtube/?utm_source=n8n_app&utm_medium=node_settings_modal-credential_link&utm_campaign=n8n-nodes-base.youTube#templates-and-examples

👨‍💻 More Workflows: https://n8n.io/creators/nasser/
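For context, a "?channel_id=" parameter like this typically belongs to YouTube's public RSS feed for a channel, which is a common way to watch for new uploads. The exact node URL in the template may differ; the channel ID below is a made-up placeholder and only the feed URL pattern is real.

```javascript
// Illustrative only: assembling a YouTube channel feed URL from a channel ID.
// Replace the placeholder with the value found after "channel/" in your channel URL.
const channelId = 'UCxxxxxxxxxxxxxxxxxxxxxx'; // hypothetical placeholder
const feedUrl = `https://www.youtube.com/feeds/videos.xml?channel_id=${channelId}`;
console.log(feedUrl);
```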
by Corentin Ribeyre
This template can be used to verify an email address with Icypeas. Be sure to have an active account to use this template.

How it works
This workflow can be divided into three steps:
The workflow initiates with a manual trigger (On clicking 'execute').
It connects to your Icypeas account.
It performs an HTTP request to verify an email address.

Set up steps
You will need a working Icypeas account to run the workflow and get your API key, API secret and User ID.
You will need an email address to perform the verification.
by Ibrahim
Overview
This n8n workflow is designed to extract specific interests from messages in a Telegram chat and retrieve related information using the Facebook Graph API. It aims to provide a streamlined solution for parsing and analyzing user-provided interests within the Telegram platform.

Features
**Interest Extraction:** Automatically identifies and extracts interests from messages that start with the hashtag "#interest".
**Data Retrieval:** Utilizes the Facebook Graph API to retrieve information related to the extracted interests.
**Structured Outputs:** Presents the retrieved data in an organized format for further analysis and review.

Requirements
Operational instance of n8n (self-hosted or cloud version).
Basic understanding of n8n workflows and nodes.

Setup and Configuration
Import Workflow: Load the provided JSON workflow into your n8n instance.
Configure Telegram Trigger Node: Ensure the Telegram trigger node is set up with the appropriate credentials and webhook ID.
Configure and Test Nodes: Adjust node parameters as necessary and test the workflow to ensure proper functionality.

How it Works
Telegram Trigger: Listens for incoming messages in a specified Telegram chat.
Check Message Contents: Verifies that the message begins with the specified hashtag and comes from the designated chat ID.
Extract Message: Extracts the content of the message for further processing.
Split Message: Splits the extracted message to separate the interest from the remaining content (a minimal sketch of this step follows this description).
Connect to Graph API: Uses the Facebook Graph API to search for information related to the extracted interest.
Split Interests into a Table: Organizes the retrieved data into a structured table format.
Get Variables: Maps the retrieved data to a new JSON object containing specific fields related to the interest.
Create a Spreadsheet: Generates a spreadsheet file in CSV format from the retrieved and formatted data.
Send the Spreadsheet File: Sends the generated spreadsheet file back to the original Telegram chat.

Customization
Modify the filtering conditions and fields to suit specific requirements.
Adjust the frequency of the trigger node based on preference.

Best Practices
Regularly test the workflow to ensure consistent performance.
Stay informed about any changes to external APIs that might affect the workflow's functionality.

Contributing
Your feedback and contributions are highly valued. Feel free to adapt, modify, and share enhancements with the n8n community.
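The hashtag check and split are simple string operations. A minimal sketch of what they might look like in an n8n Code node follows; the incoming fields (`message.text`, `message.chat.id`) follow the Telegram update format, while the expected chat ID is a hypothetical placeholder.

```javascript
// Minimal sketch for an n8n Code node placed after the Telegram Trigger
// (Run Once for Each Item). EXPECTED_CHAT_ID is a hypothetical placeholder.
const EXPECTED_CHAT_ID = -1001234567890;

const text = $json.message?.text || '';
const chatId = $json.message?.chat?.id;

// Only handle messages from the designated chat that start with the hashtag.
if (chatId !== EXPECTED_CHAT_ID || !text.startsWith('#interest')) {
  return { json: { skip: true } };
}

// "#interest hiking in the alps" -> interest = "hiking", remainder = "in the alps"
const [, interest = '', ...rest] = text.split(' ');

return {
  json: {
    interest,
    remainder: rest.join(' '),
    chatId,
  },
};
```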
by Milorad Filipović
How It works
It's very important to come prepared to sales calls. This often means a lot of manual research about the people you're meeting with. This workflow delivers the latest news about the businesses you are about to interact with each day.

**Scans Your Calendar**: Each morning, it reviews your Google Calendar for any scheduled meetings or calls with companies.
**Fetches Latest News**: For each identified company, it searches the web for the most recent and relevant news articles using newsapi.org (a minimal request sketch is shown after the setup steps).
**Delivers Insights**: You receive personalized emails via Gmail, each dedicated to a company you're meeting with that day, containing a curated list of news headlines, brief descriptions, and direct links to full articles.

Setup steps
The workflow requires you to have the following accounts set up in their respective nodes:
Google Calendar
Gmail
Besides those, there are a few parameters in the node called Setup that can be used to tweak the workflow:
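For reference, a news lookup for one company via newsapi.org looks roughly like the sketch below. The query parameters shown are standard newsapi.org parameters, but the exact ones used by the template's HTTP Request node may differ; the API key and company name are placeholders.

```javascript
// Standalone sketch (Node 18+, global fetch): fetch recent articles for one company.
const NEWSAPI_KEY = 'YOUR_NEWSAPI_KEY'; // placeholder

async function latestNews(company) {
  const url = new URL('https://newsapi.org/v2/everything');
  url.searchParams.set('q', company);
  url.searchParams.set('sortBy', 'publishedAt');
  url.searchParams.set('pageSize', '5');
  url.searchParams.set('apiKey', NEWSAPI_KEY);

  const res = await fetch(url);
  const data = await res.json();
  // Each article includes title, description, and url: the fields used in the email digest.
  return (data.articles || []).map(a => ({ title: a.title, description: a.description, url: a.url }));
}

latestNews('ExampleCorp').then(console.log);
```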
by Mutasem
Use Case
This workflow enriches new contacts in Intercom. The more complete the Intercom profile, the more useful it is. Once active, this n8n workflow will update contact data (phone, email) as well as location data from ExactBuyer.

Setup
Add a webhook URL in Intercom to call this workflow
Add your ExactBuyer API key
Add your Intercom API key
Activate the workflow

How to adjust this template
ExactBuyer returns plenty of interesting info that could be helpful. Take a look and update this workflow to add the fields you need.
by Mutasem
Use Case
This workflow enriches new contacts in HubSpot. The more complete the HubSpot profile, the more useful it is. Once active, this n8n workflow will update the social profiles, contact data (phone, email) as well as location data from ExactBuyer.

Setup
Add the HubSpot trigger credential (be careful: the scopes must be exactly as listed in the n8n docs)
Add your ExactBuyer API key
Add a HubSpot credential for the update node (again, the scopes must match the n8n docs; this credential is different from the trigger credential)
Activate the workflow

How to adjust this template
ExactBuyer returns plenty of interesting info that could be helpful. Take a look and update this workflow to add the fields you need.
by Joachim Brindeau
What it does
The workflow is a simple yet efficient way to automate the process of indexing your website on Google using the Google Indexing API.

How it works
It works by extracting the URLs from your sitemap, converting the sitemap into JSON, and looping through each URL to submit it for indexing. Here's a brief rundown of the workflow:
The workflow can be triggered manually via the "Execute Workflow" button or scheduled to run at a specific time using the "Schedule Trigger" node.
The sitemap of your website is fetched using the "sitemap_set" node with an HTTP Request to the sitemap URL.
The XML sitemap is then converted into JSON using the "sitemap_convert" node.
The "sitemap_parse" node splits the JSON into individual URLs.
The "url_set" node then prepares each URL to be sent to the Google Indexing API.
A loop is created using the "loop" node to process each URL individually and make a POST request to the Google Indexing API indicating that the URL has been updated (a minimal sketch of this request is shown after this description).
If the POST request is successful and the URL has been updated, the workflow waits for 2 seconds before moving to the next URL.
If the daily limit for the Google Indexing API is reached (200 requests/day by default), an error message is triggered using the "Stop and Error" node.

Before you use the workflow

Activate the Indexing API
Create an account with Google Cloud Platform > Console and then create a new project
Search for the Indexing API in the Library
Activate the API

Create a Service Account and get credentials
Open the Service Accounts page. If prompted, select a project.
Click Create Service Account, then enter a name and description for the service account. You can use the default service account ID or choose a different, unique one. When done, click Create.
On the "Grant users access to this service account" screen, scroll down to the Create key section.
Click Create key. In the side panel that appears, select the JSON format and click Create. Your new public/private key pair is generated and downloaded to your machine. Open the file and copy the private key.
Add the credentials in the url_index node.

Add the user as owner of the site
Beware: for each site, you need to add the service account as an owner of the site in Google Search Console.

Set your sitemap
Open the sitemap_set node and add the URL of your sitemap.

Now you should be able to ensure that Google is always up-to-date with the latest content on your website, improving your website's visibility and SEO rankings. Have fun!
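For reference, the per-URL notification the workflow sends looks roughly like this. The endpoint and request body follow the public Google Indexing API; the access-token handling is simplified here (in the actual node the Google credentials take care of it), and treating HTTP 429 as the daily-quota signal is an illustrative assumption.

```javascript
// Standalone sketch (Node 18+, global fetch) of the "URL updated" notification.
const ACCESS_TOKEN = 'ya29.placeholder'; // placeholder; normally derived from the service-account key

async function notifyUrlUpdated(url) {
  const res = await fetch('https://indexing.googleapis.com/v3/urlNotifications:publish', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${ACCESS_TOKEN}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ url, type: 'URL_UPDATED' }),
  });
  if (res.status === 429) {
    // Daily quota reached (200 requests/day by default); this is where the
    // workflow's "Stop and Error" node would stop the run.
    throw new Error('Google Indexing API daily quota reached');
  }
  return res.json();
}

notifyUrlUpdated('https://example.com/some-page').then(console.log);
```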
by Milorad Filipovic
This workflow translates all PDF documents from a specified Google Drive folder into the desired language. The translated files are automatically uploaded back to the original folder.

Required accounts
1️⃣ Google Drive account
2️⃣ DeepL developer account and API key

How to set it up?
1️⃣ Set your Google Drive folder URL, target language and source language in the configuration node
2️⃣ Connect your Google Drive account in all Google Drive nodes
3️⃣ Set up the HTTP header credentials used by the HTTP nodes in the template (replace yourAuthKey with your DeepL API key; see the sketch below for the header format)
4️⃣ Set your DeepL header credentials in all HTTP nodes
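A rough sketch of the DeepL header credential and a document upload call is shown below. The `DeepL-Auth-Key` header and the `/v2/document` endpoint follow DeepL's public API (free-plan keys use api-free.deepl.com instead of api.deepl.com); the template's own HTTP nodes may structure the calls differently, and the file path and target language here are placeholders.

```javascript
// Standalone sketch (Node 18+): upload a PDF to DeepL for translation.
import { readFile } from 'node:fs/promises';

const DEEPL_API_KEY = 'yourAuthKey'; // the value the setup step asks you to replace

async function uploadForTranslation(path, targetLang = 'DE') {
  const form = new FormData();
  form.append('target_lang', targetLang);
  form.append('file', new Blob([await readFile(path)], { type: 'application/pdf' }), 'document.pdf');

  const res = await fetch('https://api.deepl.com/v2/document', {
    method: 'POST',
    headers: { Authorization: `DeepL-Auth-Key ${DEEPL_API_KEY}` },
    body: form,
  });
  // Returns document_id and document_key, used later to poll status and download the result.
  return res.json();
}

uploadForTranslation('./example.pdf').then(console.log);
```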
by Jonathan
How it works
This workflow watches a selected Google Drive folder for any images added to it. It then takes each image and sends it to the tinypng.com service, which optimises and reduces its size (where possible). Tinypng returns the optimised image, which is then automatically saved in your chosen Google Drive folder.

Setting things up
It's pretty simple to configure and should only take around 5-10 minutes. You only need to set up credentials for Google Drive and Tinypng.com.
For Tinypng.com you can sign up for their free tier API access, which gives you 500 optimisations per month.
Once you have those two things, you just need to choose your 'input' folder to watch for images and your 'output' folder where the optimised images should be stored.
There are a few more optional things you can do, such as choosing the naming of your final image, and there's lots more you could do with the Tinypng API for more advanced image optimisation (a minimal request sketch follows below).
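For reference, the core TinyPNG (Tinify) call looks roughly like the sketch below. The `/shrink` endpoint and the `api:<key>` Basic auth follow TinyPNG's public API; the template's HTTP node may differ in details, and the API key and file paths are placeholders.

```javascript
// Standalone sketch (Node 18+, global fetch): compress one image via the TinyPNG API.
import { readFile, writeFile } from 'node:fs/promises';

const TINYPNG_KEY = 'YOUR_TINYPNG_API_KEY'; // placeholder

async function compress(inputPath, outputPath) {
  const auth = Buffer.from(`api:${TINYPNG_KEY}`).toString('base64');

  // Upload the original image for compression.
  const res = await fetch('https://api.tinify.com/shrink', {
    method: 'POST',
    headers: { Authorization: `Basic ${auth}` },
    body: await readFile(inputPath),
  });
  const result = await res.json();

  // Download the optimised version and save it.
  const optimised = await fetch(result.output.url, {
    headers: { Authorization: `Basic ${auth}` },
  });
  await writeFile(outputPath, Buffer.from(await optimised.arrayBuffer()));
}

compress('./product.png', './product-optimised.png');
```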