by Brian Money
## Overview

This template is designed for Amazon sellers and advertisers who want to automate their campaign performance analysis and bidding strategy. It solves the common challenge of manually reviewing Sponsored Products reports and guessing how to adjust keywords, placements, and budgets. By combining Amazon Advertising reports with OpenAI's GPT-4o, this workflow delivers real-time, personalized optimization instructions — automatically.

## Features

- 📥 Automatically downloads Sponsored Products reports from Google Drive
- 🧠 Uses AI to analyze campaign, keyword, placement, targeting, and budget performance
- 📊 Supports both .csv and .xlsx report formats
- 🔁 Handles multiple ASINs and scales easily across ad accounts
- 📧 Sends structured optimization recommendations to your inbox via Gmail
- 🗂 Built-in logic to normalize filenames and correctly map reports (see the sketch below)
- 🧹 Includes error handling and formatting cleanup for AI-ready input

## Requirements

To use this workflow, you’ll need:

- An Amazon Ads account with access to Sponsored Products reports
- A Google Drive folder where Amazon Ads reports are delivered (manually or via Gmail automation)
- A Gmail account (for sending summaries)
- An OpenAI API key with access to GPT-4o
- Optional: a developer account for the Amazon Ads API to fully automate report generation in the future

## Setup Instructions

1. 📂 Connect your Amazon Ads reports folder in the Google Drive node
2. 🔐 Add your credentials to the OpenAI and Gmail nodes
3. 📝 Schedule five reports in the Amazon Ads Console:
   - Search Term Report → Detailed
   - Targeting Report → Detailed
   - Campaign Report → Summary
   - Placement Report → Summary
   - Budget Report → Summary
   Use “Last 30 Days”, “Daily”, and .xlsx or .csv format
4. 🔁 (Optional) Automate report ingestion using Gmail + Drive workflows
5. 🧪 Test with one account, then replicate across additional ad accounts as needed

⏱️ Setup time: 15–30 minutes
📌 All field-specific guidance is included in workflow notes
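For orientation, the filename normalization step could be approximated with a small Code node like the sketch below. This is an illustration, not the template's actual code: the report-name patterns and the `reportType` field it adds are assumptions you would adapt to how your Amazon Ads Console names its exports.

```javascript
// n8n Code node (illustrative sketch): normalize report filenames and tag each
// item with a report type so downstream nodes can route it. The regex patterns
// are assumptions based on typical Amazon Ads export names - adjust to yours.
const patterns = [
  { type: 'searchTerm', re: /search[ _-]?term/i },
  { type: 'targeting',  re: /targeting/i },
  { type: 'campaign',   re: /campaign/i },
  { type: 'placement',  re: /placement/i },
  { type: 'budget',     re: /budget/i },
];

return $input.all().map((item) => {
  // Lowercase, strip the extension, and collapse separators for stable matching
  const raw = item.json.name || '';
  const normalized = raw
    .toLowerCase()
    .replace(/\.(csv|xlsx)$/, '')
    .replace(/[\s_]+/g, '-');
  const match = patterns.find((p) => p.re.test(normalized));
  return {
    json: {
      ...item.json,
      normalizedName: normalized,
      reportType: match ? match.type : 'unknown',
    },
  };
});
```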
by Abhishek Patoliya
This n8n automation lets you build a complete AI-powered task management system that integrates Telegram, Google Sheets, and GPT-4o mini to help users easily manage to-do lists and receive daily task reminders. Users can interact with the system via Telegram, while the AI assistant (powered by GPT-4o mini) processes commands naturally, updates a central Google Sheet, and ensures scheduled reminders are sent for pending tasks.

## ✨ Key Features

- ✅ Add, list, update, complete, or delete tasks via Telegram
- ✅ AI-powered conversational responses using GPT-4o mini
- ✅ All tasks stored and synced in Google Sheets
- ✅ Daily scheduled task summary and pending reminders sent to Telegram
- ✅ Friendly, human-like assistant responses
- ✅ Fully configurable and easy to set up

## 🛠️ Workflow Functionality Breakdown

### 1. User Interacts on Telegram

Sends commands like:
- add buy groceries
- list tasks
- complete submit report
- delete dentist appointment

### 2. AI-Powered Processing

- A GPT-4o mini agent processes user messages
- Ensures clear, friendly responses
- Determines task intent: add, update, delete, list, complete (see the intent sketch at the end of this section)

### 3. Google Sheets Sync

- Every operation is logged to Google Sheets
- Google Sheets acts as the master task database
- Sheet structure includes: Task, Status (pending or done), Created At (timestamp), Due Date (optional), Notes (optional)

### 4. Scheduled Daily Task Summary

At 9 PM daily, the workflow:
- Fetches tasks from Google Sheets
- Generates a warm, conversational summary via GPT-4o mini
- Sends the summary to the user on Telegram

### 5. Automated Reminders

- Checks for pending tasks due today
- Sends reminder messages to Telegram

## ✅ Prerequisites

Before setting up the workflow, ensure you have:

- ✔️ An n8n instance (Cloud or self-hosted)
- ✔️ A Telegram Bot Token
- ✔️ Access to Google Sheets API (OAuth2 credentials)
- ✔️ An OpenAI API Key with GPT-4o mini access
- ✔️ A Google Sheet structured as per the specification below

## 📝 Google Sheet Structure

Your Google Sheet should have the following columns:

| Column Name | Description |
| ----------- | ----------- |
| Task | Short task description |
| Status | pending or done |
| Created At | Date & time task was created (YYYY-MM-DD HH:mm:ss) |
| Due Date | (Optional) When task is due (YYYY-MM-DD HH:mm:ss) |
| Notes | (Optional) Additional details |

Important: The first row should be the header row with these exact column names.

## 🔧 Setup Instructions

### 1. Telegram Bot Setup

- Create a bot via BotFather
- Obtain the Bot Token
- Connect Telegram Trigger and Telegram Send nodes using your Bot Token in n8n

### 2. Google Sheets API Setup

- Follow the n8n Google Sheets integration guide
- Set up OAuth2 credentials
- Provide access to your task Google Sheet

### 3. OpenAI API Setup

- Obtain an API key from OpenAI
- Ensure GPT-4o mini access is enabled
- Add OpenAI credentials to the relevant nodes

### 4. Sheet Linking

- Replace the Google Sheet ID in the workflow with your own
- Confirm sheet names and column structure match exactly

### 5. Schedule Configuration (Optional)

- Adjust the daily summary time (Schedule Trigger node) as desired

## ⚙️ Configuration Options

- 🔧 Adjust AI prompt instructions for tone/style
- 🔧 Change reminder times in the schedule trigger
- 🔧 Customize Google Sheet columns if needed (update mappings accordingly)
- 🔧 Add multi-user support with chat ID checks (advanced)

## 📂 Files Included

- Full n8n JSON workflow ready to import

## 💡 Tips

- You can extend this with Slack, WhatsApp, or Email reminders
- Combine with Notion, ClickUp, or CRM integrations for more powerful task management
- Consider adding a "Priority" column for advanced sorting

Ready to stay organized with AI-powered task management? Import this workflow, link your accounts, and your Telegram assistant is good to go! 🚀
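To make the intent step concrete, here is a hedged sketch of one way it could be wired: the GPT-4o mini agent is instructed to return a small JSON object, and a Code node turns an "add" intent into a sheet row. The field names (`intent`, `task`, `dueDate`) and the `$json.output` location are illustrative assumptions, not the template's actual schema.

```javascript
// n8n Code node (illustrative): map the agent's structured intent to a sheet row.
// Assumes the agent was prompted to reply with JSON like:
//   { "intent": "add", "task": "buy groceries", "dueDate": null }
const { intent, task, dueDate } = JSON.parse($json.output);

if (intent === 'add') {
  return [{
    json: {
      Task: task,
      Status: 'pending',
      // Format as YYYY-MM-DD HH:mm:ss to match the sheet spec
      'Created At': new Date().toISOString().slice(0, 19).replace('T', ' '),
      'Due Date': dueDate || '',
      Notes: '',
    },
  }];
}

// Other intents (list, complete, delete, update) would branch to the
// matching Google Sheets operation instead of appending a row.
return [{ json: { intent, task } }];
```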
by Zacharia Kimotho
This workflow makes it easier to prepare for meetings and calls by researching your lead right before the call and creating a high-level meeting prep that is sent to your email. This removes the extra steps teams normally take to learn about their leads, do the research, and prepare for upcoming calls.

## How does it work

- The workflow starts when we capture the webhook from cal.com for new bookings. Ensure you have a field on the booking form to collect the lead's LinkedIn profile; this can be optional or mandatory depending on your preferences.
- When a new event is booked, the lead is added to an Airtable CRM for appointments and new bookings. This table contains all the fields needed to enrich and maintain your CRM.
- If the lead has a LinkedIn profile, we research their content and posts on LinkedIn and perform a lead enrichment to gather as much information as we can, then create a new meeting prep from it.

## What you need

- Bright Data API
- Cal.com account/calendar. Other calendars such as Calendly or Google Calendar can be used too, with a few tweaks
- CRM - this can be anything, not just Airtable

## Setting it up

- Create/update your calendar to allow collecting users' LinkedIn profiles/bios
- Add a new webhook in cal.com and subscribe to the desired events (a payload-handling sketch follows below)
- Map the fields from the webhook to match your CRM. If you have no CRM, make a copy of this Airtable CRM and map the fields to your account. We will be using the Base and table ID to make the mapping easier
- Set up your Bright Data API and select LinkedIn as the data source for the scraping. You can enrich more data from the bio as needed
- Update this info in the CRM under the lead enrichment table and map accordingly
- You can update the prompts on the AI models or work with them as is
- Update the Gmail node to send the meeting preps to you, and finally update the CRM with the generated meeting prep

This automated process can save your team a couple of minutes each day otherwise spent on other client fulfillment items. If you would like to learn more about n8n templates like this, feel free to reach out via LinkedIn.

Happy productivity!!
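For reference, here is a minimal sketch of the webhook-handling step in a Code node. The payload shape (`triggerEvent`, `payload.responses`) follows cal.com's webhook format, but the exact key for your LinkedIn question depends on how you named the booking-form field, so treat the field names as assumptions.

```javascript
// n8n Code node (sketch): pull the basics out of a cal.com BOOKING_CREATED webhook.
// Field names under `responses` mirror the slugs of your own booking-form
// questions; a custom "linkedin" question is assumed here.
const body = $json.body || $json;

if (body.triggerEvent !== 'BOOKING_CREATED') {
  return []; // ignore other event types
}

const p = body.payload || {};
const responses = p.responses || {};

return [{
  json: {
    name: responses.name?.value || '',
    email: responses.email?.value || '',
    linkedin: responses.linkedin?.value || '', // custom form field (assumed slug)
    startTime: p.startTime,
    title: p.title,
  },
}];
```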
by lin@davoy.tech
The YogiAI workflow automates sending daily yoga pose reminders and related information via Line Push Messages. This automation leverages data from a Google Sheets database containing yoga pose details such as names, image URLs, and links to ensure users receive personalized and engaging content every day.

## Purpose

- Provide users with daily yoga pose suggestions tailored to their practice.
- Deliver visually appealing and informative content through Line's Flex Messages, including images and clickable links.
- Log user interactions and preferences back into Google Sheets to refine future recommendations.

## Key Features

- **Automated Daily Reminders**: Sends a curated list of yoga poses at a scheduled time (21:30 Bangkok time).
- **Dynamic Content Generation**: Uses AI to rewrite and format messages in a user-friendly manner, complete with emojis and clear instructions.
- **Integration with Google Sheets**: Pulls data from a predefined Google Sheet and logs interactions for continuous improvement.
- **Customizable Messaging**: Ensures JSON outputs are properly formatted for Line's Flex Message API, allowing for interactive and visually rich content.

## Data Source

### Google Sheets Structure

The workflow relies on a Google Sheet structured as follows:

- **PoseName**: The name of the yoga pose.
- **uri**: The image URL representing the pose.
- **url**: A clickable link directing users to more information about the pose.

Sample data layout:

| PoseName | uri | url |
| -------- | --- | --- |
| Supine Angle | https://example.com/SupineAngle-tn146.png | https://example.com/pose/SupineAngle |
| Warrior II | https://example.com/WarriorII-tn146.png | https://example.com/pose/WarriorII |

*Note: Ensure that you update the Google Sheet with your own data. Refer to this sample sheet for reference.*

## Scheduled Trigger

The workflow is triggered daily at 21:30 (9:30 PM) Bangkok Time (Asia/Bangkok). This ensures timely delivery of reminders to users, keeping them engaged with their yoga practice.

## Workflow Process

1. **Data Retrieval**
   - Node: Get PoseName fetches yoga pose details from the specified range in the Google Sheet.
2. **Content Generation**
   - Node: WritePosesToday utilizes Azure OpenAI to craft user-friendly text, complete with emojis and clear instructions.
   - Node: RewritePosesToday formats the AI-generated text specifically for Line messaging, ensuring compatibility and visual appeal.
3. **JSON Formatting**
   - Node: WriteJSONflex generates the JSON structures required for Line's Flex Messages, enabling carousel displays of yoga pose images and links (see the sketch below).
   - Node: Fix JSON ensures all JSON outputs are correctly formatted before being sent via Line.
4. **Message Delivery**
   - Node: Line Push with Flex Bubble sends the final message, including both text and Flex Message carousels, directly to users via Line Push Messages.
5. **Logging Interactions**
   - Nodes: YogaLog & YogaLog2 log each interaction back into Google Sheets to track which poses were sent and how often they appear, refining future recommendations.

## Setup Prerequisites

- **Google Sheets Account**: Set up a Google Sheet with the required structure and populate it with your yoga pose data.
- **Line Developer Account**: Create a Line channel to obtain the necessary credentials for sending push messages.
- **Azure OpenAI Account**: Configure access to Azure OpenAI services for generating and formatting content.

## Intended Audience

This workflow is ideal for:

- **Yoga Instructors**: Seeking to engage students with daily pose suggestions.
- **Fitness Enthusiasts**: Looking to maintain consistency in their yoga practice.
- **Content Creators**: Interested in automating personalized and visually appealing content distribution.
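As a point of reference for the WriteJSONflex step, the sketch below builds a minimal Flex carousel from the sheet rows. It follows Line's documented bubble/carousel structure, but the styling is deliberately bare and the input field names (PoseName, uri, url) come from the sheet layout above; treat it as a starting point rather than the template's exact output.

```javascript
// n8n Code node (sketch): turn sheet rows into a Line Flex Message carousel.
// One bubble per pose: hero image + pose name + a link button.
const bubbles = $input.all().map(({ json }) => ({
  type: 'bubble',
  hero: { type: 'image', url: json.uri, size: 'full', aspectMode: 'cover' },
  body: {
    type: 'box',
    layout: 'vertical',
    contents: [{ type: 'text', text: json.PoseName, weight: 'bold', size: 'lg' }],
  },
  footer: {
    type: 'box',
    layout: 'vertical',
    contents: [{
      type: 'button',
      action: { type: 'uri', label: 'View pose', uri: json.url },
    }],
  },
}));

return [{
  json: {
    type: 'flex',
    altText: 'Your yoga poses for today',
    // Line caps carousels at 12 bubbles
    contents: { type: 'carousel', contents: bubbles.slice(0, 12) },
  },
}];
```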
by Yaron Been
## 🔍 Competitor Review Scraper & Ad Copy Generator (Trustpilot + Bright Data + GPT-4o-mini)

### 📌 Who It's For

Marketers, business owners, and agencies looking to:

- Analyze competitor pain points
- Generate high-impact Facebook ad copy
- Automate manual data processing

### 🧩 How It Works

This n8n-based workflow combines Bright Data, Google Sheets, and OpenAI to scrape, process, and transform Trustpilot reviews into ready-to-use ad copy.

#### 🔹 Step-by-Step Breakdown

1. **Trigger (Manual Form Submission)**
   Input required:
   - Competitor’s Trustpilot URL
   - Review timeframe (30d, 3m, 6m, 12m)
2. **Fetch Reviews**
   - Calls Bright Data’s Dataset API with URL & timeframe
   - Polls until snapshot is ready
3. **Retrieve & Store**
   - Extracts all reviews
   - Saves them into a structured Google Sheet
4. **Filter & Aggregate**
   - Filters to only 1–2 star reviews
   - Summarizes common negative feedback (see the sketch at the end of this section)
5. **Generate Ad Copy**
   - Sends the summary to OpenAI GPT-4o-mini
   - Produces 3 variations of ad copy targeting pain points
6. **Distribute Insights**
   - Sends ad copy + summary via email to the marketing team

### ✅ Requirements

- LLM account
- Google Sheets - copy this sheet: https://docs.google.com/spreadsheets/d/1Zi758ds2_aWzvbDYqwuGiQNaurLgs-leS9wjLWWlbUU/edit?gid=0#gid=0
- Bright Data account

### ⚙️ Setup Instructions

**Step 1: Google Sheets**
- Copy this Google Sheets template
- Do not change column headers

**Step 2: n8n Credential Setup**
- Google Sheets: OAuth2
- Bright Data: Authorization Header
- OpenAI: API Key for GPT-4o-mini

**Step 3: Import Workflow**
- Import the .json file into n8n
- Configure your sheet + dataset ID
- Adjust GPT prompts as needed

**Step 4: Run the Workflow**
- Trigger via form
- Receive ad copy + review insights via email

### 🧠 Tips & Best Practices

- Bright Data snapshots may take time — polling is handled
- Focusing on 1–2 star reviews yields the most actionable pain points
- You can customize GPT-4o-mini prompts for tone or vertical

### 💬 Support & Feedback

Need help or customization?
- 📧 Email: Yaron@nofluff.online
- 📺 YouTube: @YaronBeen
- 🔗 LinkedIn: linkedin.com/in/yaronbeen
- 📚 Bright Data Docs: docs.brightdata.com/introduction
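The filter-and-aggregate step could look something like the following Code node: a hedged sketch that keeps only 1–2 star reviews and condenses them into a prompt-ready summary block. The field names (`rating`, `review_text`) are assumptions about Bright Data's Trustpilot dataset output, so check your snapshot's actual schema.

```javascript
// n8n Code node (sketch): keep 1-2 star reviews and build a compact summary
// block to feed the GPT-4o-mini ad-copy prompt. Field names are assumed.
const reviews = $input.all().map((i) => i.json);

const negative = reviews.filter((r) => Number(r.rating) <= 2);

// Cap the excerpt list so the prompt stays within a reasonable token budget
const excerpts = negative
  .slice(0, 50)
  .map((r, n) => `${n + 1}. (${r.rating}★) ${String(r.review_text || '').slice(0, 300)}`)
  .join('\n');

return [{
  json: {
    negativeCount: negative.length,
    totalCount: reviews.length,
    promptBlock: `Common complaints from ${negative.length} low-star reviews:\n${excerpts}`,
  },
}];
```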
by Oskar
With this workflow you can extract data from resume documents uploaded via a Telegram bot. The workflow transforms the readable content of a PDF resume into structured data using AI nodes and returns a new PDF built from formatted, plain HTML. You can modify this workflow to perform other actions with the structured data (e.g. insert it into a database or create other well-formatted documents).

The functionality of this workflow was presented during the n8n community call on March 7, 2024 - a recording of the presentation is available here.

⚠️ Workflow made for demo purposes. If you want to use it in real life, please make sure the necessary measures for personal data protection are in place.

## How it works?

The user uploads a readable PDF resume document to the Telegram bot. After authentication based on the chat ID parameter, the workflow extracts text from the PDF and passes it into an AI chain with connected sub-nodes: OpenAI Chat Model and Structured Output (JSON) Parser. Then, each extracted section (employment history, projects etc.) is formatted into the desired HTML structure. Finally, the document is converted into a new, structured PDF using Gotenberg.

💡 This workflow requires installed Gotenberg. If you are not familiar with this software, please have a look at my YouTube tutorial. You can also replace the call to Gotenberg with another PDF generation service (such as PDFMonkey or ApiTemplate).

## Set up steps

1. Create a Telegram bot and add its credentials in n8n.
2. Set your chat ID parameter in the Auth node.
3. Adjust the JSON schema in the Structured Output Parser according to your needs (a sample schema is sketched below).
4. Optionally: replace the HTTP call to Gotenberg with a PDF generation service of your choice.

If you like this workflow, please subscribe to my YouTube channel and/or my newsletter.
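For step 3, a minimal Structured Output Parser schema might look like the sketch below. The section names mirror the ones the description mentions (employment history, projects); every other field is an assumption to adapt to the data you want extracted.

```javascript
// Illustrative JSON schema for the Structured Output Parser (paste the object
// into the parser's schema field). Only employmentHistory/projects are named
// in the template description; the remaining fields are assumed.
const resumeSchema = {
  type: 'object',
  properties: {
    fullName: { type: 'string' },
    email: { type: 'string' },
    summary: { type: 'string' },
    employmentHistory: {
      type: 'array',
      items: {
        type: 'object',
        properties: {
          company: { type: 'string' },
          role: { type: 'string' },
          startDate: { type: 'string' },
          endDate: { type: 'string' },
          highlights: { type: 'array', items: { type: 'string' } },
        },
      },
    },
    projects: {
      type: 'array',
      items: {
        type: 'object',
        properties: {
          name: { type: 'string' },
          description: { type: 'string' },
        },
      },
    },
    skills: { type: 'array', items: { type: 'string' } },
  },
  required: ['fullName', 'employmentHistory'],
};
```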
by Custom Workflows AI
## Introduction

The Content SEO Audit Workflow is a powerful automated solution that generates comprehensive SEO audit reports for websites. By combining the crawling capabilities of DataForSEO with the search performance metrics from Google Search Console, this workflow delivers actionable insights into content quality, technical SEO issues, and performance optimization opportunities.

The workflow crawls up to 1,000 pages of a website, analyzes various SEO factors including metadata, content quality, internal linking, and search performance, and then generates a professional, branded HTML report that can be shared directly with clients. The entire process is automated, transforming what would typically be hours of manual analysis into a streamlined workflow that produces consistent, thorough results.

This workflow bridges the gap between technical SEO auditing and practical, client-ready deliverables, making it an invaluable tool for SEO professionals and digital marketing agencies.

## Who is this for?

This workflow is designed for SEO consultants, digital marketing agencies, and content strategists who need to perform comprehensive content audits for clients or their own websites. It's particularly valuable for professionals who:

- Regularly conduct SEO audits as part of their service offerings
- Need to provide branded, professional reports to clients
- Want to automate the time-consuming process of content analysis
- Require data-driven insights to inform content strategy decisions

Users should have basic familiarity with SEO concepts and metrics, as well as a basic understanding of how to set up API credentials in n8n. While no coding knowledge is required to run the workflow, users should be comfortable with configuring workflow parameters and following setup instructions.

## What problem is this workflow solving?

Content audits are essential for SEO strategy but are traditionally labor-intensive and time-consuming. This workflow addresses several key challenges:

1. **Manual Data Collection**: Gathering data from multiple sources (crawlers, Google Search Console, etc.) typically requires hours of work. This workflow automates the entire data collection process.
2. **Inconsistent Analysis**: Manual audits can suffer from inconsistency in methodology. This workflow applies the same comprehensive analysis criteria to every page, ensuring thorough and consistent results.
3. **Report Generation**: Creating professional, client-ready reports often requires additional design work after the analysis is complete. This workflow generates a fully branded HTML report automatically.
4. **Data Integration**: Correlating technical SEO issues with actual search performance metrics is difficult when working with separate tools. This workflow seamlessly integrates crawl data with Google Search Console metrics.
5. **Scale Limitations**: Manual audits become increasingly difficult with larger websites. This workflow can efficiently process up to 1,000 pages without additional effort.

## What this workflow does

### Overview

The Content SEO Audit Workflow crawls a specified website, analyzes its content for various SEO issues, retrieves performance data from Google Search Console, and generates a comprehensive HTML report. The workflow identifies issues in five key categories: status issues (404 errors, redirects), content quality (thin content, readability), metadata SEO (title/description issues), internal linking (orphan pages, excessive click depth), and performance (underperforming content).
The final report includes executive summaries, detailed issue breakdowns, and actionable recommendations, all branded with your company's colors and logo.

### Process

1. **Initial Configuration**: The workflow begins by setting parameters including the target domain, crawl limits, company information, and branding colors.
2. **Website Crawling**: The workflow creates a crawl task in DataForSEO and periodically checks its status until completion.
3. **Data Collection**: Once crawling is complete, the workflow:
   - Retrieves the raw audit data from DataForSEO
   - Extracts all URLs with status code 200 (successful pages)
   - Queries Google Search Console API for each URL to get clicks and impressions data
   - Identifies 404 and 301 pages and retrieves their source links
4. **Data Analysis**: The workflow analyzes the collected data to identify issues including:
   - Technical issues: 404 errors, redirects, canonicalization problems
   - Content issues: thin content, outdated content, readability problems
   - SEO metadata issues: missing/duplicate titles and descriptions, H1 problems
   - Internal linking issues: orphan pages, excessive click depth, low internal links
   - Performance issues: underperforming pages based on GSC data
5. **Report Generation**: Finally, the workflow:
   - Calculates a health score based on the severity and quantity of issues
   - Generates prioritized recommendations
   - Creates a comprehensive HTML report with interactive tables and visualizations
   - Customizes the report with your company's branding
   - Provides the report as a downloadable HTML file

## Setup

To set up this workflow, follow these steps:

1. **Import the workflow**: Download the JSON file and import it into your n8n instance.
2. **Configure DataForSEO credentials**:
   - Create a DataForSEO account at https://app.dataforseo.com/api-access (they offer a free $1 credit for testing)
   - Add a new "Basic Auth" credential in n8n following the HTTP Request Authentication guide
   - Assign this credential to the "Create Task", "Check Task Status", "Get Raw Audit Data", and "Get Source URLs Data" nodes
3. **Configure Google Search Console credentials**:
   - Add a new "Google OAuth2 API" credential following the Google OAuth guide
   - Ensure your Google account has access to the Google Search Console property you want to analyze
   - Assign this credential to the "Query GSC API" node
4. **Update the "Set Fields" node** with:
   - dfs_domain: The website domain you want to audit
   - dfs_max_crawl_pages: Maximum number of pages to crawl (default: 1000)
   - dfs_enable_javascript: Whether to enable JavaScript rendering (default: false)
   - company_name: Your company name for the report branding
   - company_website: Your company website URL
   - company_logo_url: URL to your company logo
   - brand_primary_color: Your primary brand color (hex code)
   - brand_secondary_color: Your secondary brand color (hex code)
   - gsc_property_type: Set to "domain" or "url" depending on your Google Search Console property type
5. **Run the workflow**: Click "Start" and wait for it to complete (approximately 20 minutes for 500 pages).
6. **Download the report**: Once complete, download the HTML file from the "Download Report" node.

## How to customize this workflow to your needs

This workflow can be adapted in several ways to better suit your specific requirements:

1. **Adjust crawl parameters**: Modify the "Set Fields" node to change:
   - The maximum number of pages to crawl (dfs_max_crawl_pages). This workflow supports up to 1000 pages.
   - Whether to enable JavaScript rendering for JavaScript-heavy sites (dfs_enable_javascript)
2. **Customize issue detection thresholds**: In the "Build Report Structure" code node, you can modify (see the sketch at the end of this section):
   - Word count thresholds for thin content detection (currently 1500 words)
   - Click depth thresholds (currently flags pages deeper than 4 clicks)
   - Title and description length parameters (currently 40-60 chars for titles, 70-155 for descriptions)
   - Readability score thresholds (currently flags Flesch-Kincaid scores below 55)
3. **Modify the report design**: In the "Generate HTML Report" code node, you can:
   - Adjust the HTML/CSS to change the report layout and styling
   - Add or remove sections from the report
   - Change the recommendations logic
   - Modify the health score calculation algorithm
4. **Add additional data sources**: You could extend the workflow by:
   - Adding PageSpeed Insights data for performance metrics
   - Incorporating backlink data from other APIs
   - Adding keyword ranking data from rank tracking APIs
5. **Implement automated delivery**: Add nodes after the "Download Report" to:
   - Send the report directly to clients via email
   - Upload it to cloud storage
   - Create a PDF version of the report
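To make the threshold customization concrete, here is a hedged sketch of how those limits might be expressed inside the "Build Report Structure" code node. The constant names and the page-object fields (`wordCount`, `clickDepth`, etc.) are illustrative assumptions; the actual node has its own variable names, but the values match the defaults listed above.

```javascript
// Sketch: issue-detection thresholds as named constants (values from the
// template description). Field names on `page` are assumed, not the node's own.
const THRESHOLDS = {
  thinContentWords: 1500,   // flag pages with fewer words than this
  maxClickDepth: 4,         // flag pages deeper than this many clicks
  titleLength: { min: 40, max: 60 },
  descriptionLength: { min: 70, max: 155 },
  minReadability: 55,       // Flesch-Kincaid score floor
};

function detectIssues(page) {
  const issues = [];
  if (page.wordCount < THRESHOLDS.thinContentWords) issues.push('thin_content');
  if (page.clickDepth > THRESHOLDS.maxClickDepth) issues.push('excessive_click_depth');
  if (page.titleLength < THRESHOLDS.titleLength.min ||
      page.titleLength > THRESHOLDS.titleLength.max) {
    issues.push('title_length');
  }
  if (page.descriptionLength < THRESHOLDS.descriptionLength.min ||
      page.descriptionLength > THRESHOLDS.descriptionLength.max) {
    issues.push('description_length');
  }
  if (page.readabilityScore < THRESHOLDS.minReadability) issues.push('low_readability');
  return issues;
}
```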
by Bela
## Purpose of the workflow

Most scraping workflows get blocked by anti-bot technologies. To avoid this, you can use Scrappey to scrape every website you want.

## How it works

We use test data and make an API call to the Scrappey service, then get the scraped website data back as a result (a request sketch follows below).

## Setup Steps

- Replace YOUR_API_KEY in the "Scrappey API Call" node with your Scrappey API key (register for free)
- Replace the test data with your production data. You can plug in any type of data connector at this point of your workflow.
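For reference, a Scrappey request looks roughly like the sketch below. The endpoint and the `cmd`/`url` body fields follow Scrappey's public examples, and the response is assumed to carry the page inside a `solution` object; verify both against the current Scrappey docs before relying on this shape.

```javascript
// Standalone Node.js (18+) sketch: call Scrappey's API. Body and response
// shapes follow Scrappey's public examples - confirm against their docs.
const apiKey = 'YOUR_API_KEY'; // set via credentials in the real workflow

async function scrape(url) {
  const res = await fetch(`https://publisher.scrappey.com/api/v1?key=${apiKey}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      cmd: 'request.get',
      url, // the page you want scraped
    }),
  });
  const data = await res.json();
  return data.solution?.response; // scraped page content (assumed field name)
}

scrape('https://example.com').then(console.log);
```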
by Shiv Gupta
## Pinterest Keyword-Based Content Scraper with AI Agent & BrightData Automation

### Overview

This n8n workflow automates Pinterest content scraping based on user-provided keywords using BrightData's API and a Claude Sonnet 4 AI agent. The system intelligently processes keywords, initiates scraping jobs, monitors progress, and formats the extracted data into structured outputs.

### Architecture Components

#### 🧠 AI-Powered Controller

- **Claude Sonnet 4 Model**: Processes and understands keywords before initiating scrape
- **AI Agent**: Acts as the intelligent controller coordinating all scraping steps

#### 📥 Data Input

- **Form Trigger**: User-friendly keyword input interface
- **Keywords Field**: Required input field for Pinterest search terms

#### 🚀 Scraping Pipeline

1. Launch Scraping Job: Sends keywords to BrightData API
2. Status Monitoring: Continuously checks scraping progress
3. Data Retrieval: Downloads completed scraped content
4. Data Processing: Formats and structures the raw data
5. Storage: Saves results to Google Sheets

### Workflow Nodes

#### 1. Pinterest Keyword Input

- **Type**: Form Trigger
- **Purpose**: Entry point for user keyword submission
- **Configuration**: Form title: "Pinterest"; required field: "Keywords"

#### 2. Anthropic Chat Model

- **Type**: Language Model (Claude Sonnet 4)
- **Model**: claude-sonnet-4-20250514
- **Purpose**: AI-powered keyword processing and workflow orchestration

#### 3. Keyword-based Scraping Agent

- **Type**: AI Agent
- **Purpose**: Orchestrates the entire scraping process
- **Instructions**:
  - Initiates Pinterest scraping with provided keywords
  - Monitors scraping status until completion
  - Downloads final scraped data
  - Presents raw scraped data as output

#### 4. BrightData Pinterest Scraping

- **Type**: HTTP Request Tool
- **Method**: POST
- **Endpoint**: https://api.brightdata.com/datasets/v3/trigger
- **Parameters**:
  - dataset_id: gd_lk0sjs4d21kdr7cnlv
  - include_errors: true
  - type: discover_new
  - discover_by: keyword
  - limit_per_input: 2
- **Purpose**: Creates new scraping snapshot based on keywords (a request sketch is included at the end of this section)

#### 5. Check Scraping Status

- **Type**: HTTP Request Tool
- **Method**: GET
- **Endpoint**: https://api.brightdata.com/datasets/v3/progress/{snapshot_id}
- **Purpose**: Monitors scraping job progress
- **Returns**: Status values like "running" or "ready"

#### 6. Fetch Pinterest Snapshot Data

- **Type**: HTTP Request Tool
- **Method**: GET
- **Endpoint**: https://api.brightdata.com/datasets/v3/snapshot/{snapshot_id}
- **Purpose**: Downloads completed scraped data
- **Trigger**: Executes when status is "ready"

#### 7. Format & Extract Pinterest Content

- **Type**: Code Node (JavaScript)
- **Purpose**: Parses and structures raw scraped data
- **Extracted Fields**: URL, Post ID, Title, Content, Date Posted, User, Likes & Comments, Media, Image URL, Categories, Hashtags

#### 8. Save Pinterest Data to Google Sheets

- **Type**: Google Sheets Node
- **Operation**: Append
- **Mapped Columns**: Post URL, Title, Content, Image URL
#### 9. Wait for 1 Minute (Disabled)

- **Type**: Code Tool
- **Purpose**: Adds delay between status checks (currently disabled)
- **Duration**: 60 seconds

### Setup Requirements

#### Required Credentials

1. **Anthropic API**
   - Credential ID: ANTHROPIC_CREDENTIAL_ID
   - Required for Claude Sonnet 4 access
2. **BrightData API**
   - API Key: BRIGHT_DATA_API_KEY
   - Required for Pinterest scraping service
3. **Google Sheets OAuth2**
   - Credential ID: GOOGLE_SHEETS_CREDENTIAL_ID
   - Required for data storage

#### Configuration Placeholders

Replace the following placeholders with actual values:

- WEBHOOK_ID_PLACEHOLDER: Form trigger webhook ID
- GOOGLE_SHEET_ID_PLACEHOLDER: Target Google Sheets document ID
- WORKFLOW_VERSION_ID: n8n workflow version
- INSTANCE_ID_PLACEHOLDER: n8n instance identifier
- WORKFLOW_ID_PLACEHOLDER: Unique workflow identifier

### Data Flow

User Input (Keywords)
↓
AI Agent Processing (Claude)
↓
BrightData Scraping Job Creation
↓
Status Monitoring Loop
↓
Data Retrieval (when ready)
↓
Content Formatting & Extraction
↓
Google Sheets Storage

### Output Data Structure

Each scraped Pinterest pin contains:

- **URL**: Direct link to Pinterest pin
- **Post ID**: Unique Pinterest identifier
- **Title**: Pin title/heading
- **Content**: Pin description text
- **Date Posted**: Publication timestamp
- **User**: Pinterest username
- **Engagement**: Likes and comments count
- **Media**: Media type information
- **Image URL**: Direct image link
- **Categories**: Pin categorization tags
- **Hashtags**: Associated hashtags
- **Comments**: User comments text

### Usage Instructions

1. **Initial Setup**:
   - Configure all required API credentials
   - Replace placeholder values with actual IDs
   - Create target Google Sheets document
2. **Running the Workflow**:
   - Access the form trigger URL
   - Enter desired Pinterest keywords
   - Submit the form to initiate scraping
3. **Monitoring Progress**:
   - The AI agent will automatically handle status monitoring
   - No manual intervention required during scraping
4. **Accessing Results**:
   - Structured data will be automatically saved to Google Sheets
   - Each run appends new data to the existing sheet

### Technical Notes

- **Rate Limiting**: BrightData API has built-in rate limiting
- **Data Limits**: Current configuration limits 2 pins per keyword
- **Status Polling**: Automatic status checking until completion
- **Error Handling**: Includes error capture in scraping requests
- **Async Processing**: Supports long-running scraping jobs

### Customization Options

- **Adjust Data Limits**: Modify the limit_per_input parameter
- **Enable Wait Timer**: Activate the disabled wait node for longer jobs
- **Custom Data Fields**: Modify the formatting code for additional fields
- **Alternative Storage**: Replace Google Sheets with other storage options

### Sample Google Sheets Template

Create a copy of the sample sheet structure: https://docs.google.com/spreadsheets/d/SAMPLE_SHEET_ID/edit

Required columns: Post URL, Title, Content, Image URL

### Troubleshooting

- **Authentication Errors**: Verify all API credentials are correctly configured
- **Scraping Failures**: Check BrightData API status and rate limits
- **Data Formatting Issues**: Review the JavaScript formatting code for parsing errors
- **Google Sheets Errors**: Ensure proper OAuth2 permissions and sheet access

For any questions or support, please contact: Email or fill out this form
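To illustrate nodes 4–6, here is a standalone sketch of the trigger-then-poll sequence against the Bright Data endpoints listed above. The endpoints, dataset ID, and query parameters come from this description; the request body and response field names (`snapshot_id`, `status`) follow Bright Data's dataset API conventions but should be verified against their docs.

```javascript
// Standalone Node.js (18+) sketch of the BrightData trigger/poll/fetch cycle.
// Endpoint paths and parameters are taken from the node descriptions above;
// body and response shapes are assumptions to verify against Bright Data docs.
const API_KEY = 'BRIGHT_DATA_API_KEY';
const headers = { Authorization: `Bearer ${API_KEY}`, 'Content-Type': 'application/json' };

async function scrapePinterest(keyword) {
  // 1. Launch the scraping job (node 4)
  const trigger = await fetch(
    'https://api.brightdata.com/datasets/v3/trigger' +
      '?dataset_id=gd_lk0sjs4d21kdr7cnlv&include_errors=true' +
      '&type=discover_new&discover_by=keyword&limit_per_input=2',
    { method: 'POST', headers, body: JSON.stringify([{ keyword }]) },
  );
  const { snapshot_id } = await trigger.json(); // field name assumed per convention

  // 2. Poll until the snapshot is ready (node 5)
  let status = 'running';
  while (status !== 'ready') {
    await new Promise((r) => setTimeout(r, 60_000)); // the disabled wait node's 60s delay
    const progress = await fetch(
      `https://api.brightdata.com/datasets/v3/progress/${snapshot_id}`, { headers },
    );
    ({ status } = await progress.json());
  }

  // 3. Download the scraped pins (node 6)
  const snapshot = await fetch(
    `https://api.brightdata.com/datasets/v3/snapshot/${snapshot_id}`, { headers },
  );
  return snapshot.json();
}

scrapePinterest('home decor').then((pins) => console.log(pins.length, 'pins'));
```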
by Kumar Shivam
## 🧠 AI Blog Generator for Shopify Products using GPT-4o

The AI Blog Generator is an advanced automation workflow powered by n8n, integrating GPT-4o and Google Sheets to generate SEO-rich blog articles for Shopify products. It automates the entire process — from pulling product data, analyzing images for nutritional information, to producing structured HTML content ready for publishing — with zero manual writing.

### 💡 Key Advantages

- 🔗 **Shopify Product Sync**: Automatically pulls product data (title, description, images, etc.) via the Shopify API.
- 🤖 **AI-Powered Nutrition Extraction**: Uses GPT-4o to intelligently analyze product images and extract nutritional information.
- ✍️ **SEO Blog Generation**: GPT-4o generates blog titles, meta descriptions, and complete articles using both product metadata and extracted nutritional info.
- 🗂️ **Structured Content Output**: Produces well-formatted HTML with headers, bullet points, and nutrition tables for seamless Shopify blog integration.
- 📄 **Google Sheets Integration**: Tracks blog creation, manages retries, and prevents duplicate publishing using a centralized Google Sheet.
- 📤 **Shopify Blog API Integration**: Publishes the generated blog to Shopify using a two-step blog + article API call (sketched below).

### ⚙️ How It Works

1. **Manual Trigger**: Initiate the process using a test trigger or a scheduler.
2. **Fetch Products from Shopify**: Retrieves all product details including descriptions and images.
3. **Extract Product Images**: Splits and processes each image individually.
4. **OCR + Nutrition AI**: GPT-4o reads nutrition facts from product images. Skips items without valid info.
5. **Check Existing Logs**: References a Google Sheet to avoid duplicates and determine retry status.
6. **AI Blog Generation**: Creates a blog with headings, bullet points, intro, and a nutrition table.
7. **Shopify Blog + Article Posting**: Uses the Shopify API to publish the blog and its content.
8. **Update Google Sheet**: Logs the blog URL, HTML content, errors, and status for future reference.

### 🛠️ Setup Steps

- **Shopify Node**: Connects to your Shopify store and fetches product data.
- **Split Out Node**: Divides product images for individual OCR processing.
- **OpenAI Node**: Uses GPT-4o to extract nutrition data from images.
- **If Node**: Filters for entries with valid nutrition information.
- **Edit Fields Node**: Formats the product data for AI processing.
- **AI Agent Node**: Generates SEO blog content.
- **Google Sheets Nodes**: Read and update blog creation status.
- **HTTP Request Nodes**: Post the blog and article via Shopify's API.

### 🔐 Credentials Required

- **Shopify Access Token**: For retrieving product data and posting blogs
- **OpenAI API Key**: For GPT-4o-based AI generation and image processing
- **Google Sheets OAuth**: For accessing the log sheet

### 👤 Ideal For

- Ecommerce teams looking to automate content for hundreds of products
- Shopify store owners aiming to boost organic traffic through blogging
- Marketing teams building scalable, AI-driven content workflows

### 💬 Bonus Tip

The workflow is modular. You can easily extend it with internal linking, language translation, or even social media sharing — all within the same n8n flow.
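The "two-step blog + article API call" maps onto Shopify's Admin REST API roughly as below. The endpoint shapes (`blogs.json`, `blogs/{id}/articles.json`) and the `X-Shopify-Access-Token` header are standard Shopify Admin API; the store domain, API version, and payload values here are placeholder assumptions.

```javascript
// Standalone Node.js (18+) sketch of the two-step Shopify posting flow.
// Store domain, API version, and payload values are placeholders.
const SHOP = 'your-store.myshopify.com';
const VERSION = '2024-01'; // pin to a supported Admin API version
const headers = {
  'X-Shopify-Access-Token': 'SHOPIFY_ACCESS_TOKEN',
  'Content-Type': 'application/json',
};

async function publishArticle(blogTitle, articleTitle, bodyHtml) {
  // Step 1: create (or reuse) the blog container
  const blogRes = await fetch(`https://${SHOP}/admin/api/${VERSION}/blogs.json`, {
    method: 'POST',
    headers,
    body: JSON.stringify({ blog: { title: blogTitle } }),
  });
  const { blog } = await blogRes.json();

  // Step 2: post the generated article into that blog
  const articleRes = await fetch(
    `https://${SHOP}/admin/api/${VERSION}/blogs/${blog.id}/articles.json`,
    {
      method: 'POST',
      headers,
      body: JSON.stringify({
        article: { title: articleTitle, body_html: bodyHtml, published: true },
      }),
    },
  );
  const { article } = await articleRes.json();
  return article; // contains id, handle, etc. for logging back to the sheet
}

// Usage: publishArticle('News', 'My post', '<p>Hello</p>').then((a) => console.log(a.id));
```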
by Juan Carlos Cavero Gracia
## Attachments Gmail to Drive and Google Sheets

### Description

Automatically process invoice emails by saving attachments to Google Drive and extracting key invoice data to Google Sheets using AI. This workflow monitors your Gmail for unread emails with attachments, saves PDFs to a specified Google Drive folder, and uses OpenAI's GPT-4o to extract invoice details (date, description, amount) into a structured spreadsheet.

### Use cases

- **Invoice Management**: Automatically organize and track invoices received via email
- **Financial Record Keeping**: Maintain a structured database of all invoice information
- **Document Organization**: Keep digital copies of invoices organized in Google Drive
- **Automated Data Entry**: Eliminate manual data entry for invoice processing

### Resources

- Gmail account
- Google Drive account
- Google Sheets account
- OpenAI API key

### Setup instructions

#### Prerequisites

- Active Gmail, Google Drive, and Google Sheets accounts
- OpenAI API key (GPT-4o model access)
- n8n instance with credentials manager

#### Steps

1. **Gmail and Google Drive Setup**:
   - Connect your Gmail account in n8n credentials
   - Connect your Google Drive account with appropriate permissions
   - Create a destination folder in Google Drive for invoice storage
2. **Google Sheets Setup**:
   - Connect your Google Sheets account
   - Create a spreadsheet with columns: Invoice date, Invoice Description, Total price, and Fichero
   - Copy your spreadsheet ID for configuration
3. **OpenAI Setup**:
   - Add your OpenAI API key to n8n credentials (an extraction sketch is shown at the end of this section)
4. **Configure Email Filter**:
   - Update the email filter node to match your specific sender requirements

### Benefits

- **Time Saving**: Eliminates manual downloading, filing, and data entry
- **Accuracy**: AI-powered data extraction reduces human error
- **Organization**: Consistent file naming and storage structure
- **Searchability**: Creates a searchable database of all invoice information
- **Automation**: Runs every minute to process new emails as they arrive

### Related templates

- Email Parser to CRM
- Document Processing Workflow
- Financial Data Automation
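As an illustration of the extraction step, a Code node could normalize the GPT-4o output into the four sheet columns like this. The AI output shape (`invoiceDate`, `description`, `total`) is an assumption based on a prompt that asks for JSON; the column names match the spreadsheet described above, and the `fileName` field is a hypothetical carrier for the Drive filename.

```javascript
// n8n Code node (sketch): map the AI's JSON answer onto the sheet columns.
// Assumes the GPT-4o node was prompted to return:
//   { "invoiceDate": "2024-05-01", "description": "Hosting", "total": 12.5 }
const ai = JSON.parse($json.message?.content ?? $json.output);

return [{
  json: {
    'Invoice date': ai.invoiceDate || '',
    'Invoice Description': ai.description || '',
    'Total price': Number(ai.total) || 0,
    'Fichero': $json.fileName || '', // Drive filename of the stored PDF (assumed field)
  },
}];
```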
by felipe biava cataneo
## What this template does

This template serves as a chatbot that enables you to ask questions about the content of a PDF directly in Telegram. It checks whether incoming Telegram messages contain a document. If they do, it stores the PDF in a Pinecone vector store. If there's no document, it searches the vector store for information and tries to answer your question.

## Setup

1. Open the Telegram app and search for the BotFather user (@BotFather)
2. Start a chat with the BotFather
3. Type /newbot to create a new bot
4. Follow the prompts to name your bot and get a unique API token
5. Save your access token and username

Once you set up your bot, you can send the PDF and then ask questions about its content (the routing check is sketched below).

## How to adjust it to your needs

- You can exchange the Groq chat model with any model that you like
- Exchange Pinecone with any other vector store tool you like (e.g. Supabase, Postgres or Qdrant)

#Telegram, #Pinecone, #OpenAI, #Groq
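The document-vs-question routing can be expressed as a simple condition on the Telegram update. The sketch below uses the Telegram Bot API's message shape (`message.document` is present only for file uploads); the two `route` values are illustrative labels for the template's two paths.

```javascript
// n8n Code node (sketch): route a Telegram update by whether it carries a file.
// Telegram sets `message.document` only when the user uploaded a document.
const msg = $json.message || {};

if (msg.document && msg.document.mime_type === 'application/pdf') {
  // Path 1: hand the file off for embedding into the Pinecone vector store
  return [{ json: { route: 'ingest', fileId: msg.document.file_id } }];
}

// Path 2: treat the text as a question for the vector-store retrieval chain
return [{ json: { route: 'ask', question: msg.text || '' } }];
```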