by Kyle Morse
Takes your raw, unpolished voice transcripts and transforms them into well-structured LinkedIn posts using AI. Perfect for when you have good ideas but they come out as rambling thoughts.

**The Problem:** You record voice memos with great ideas, but the transcript is full of "ums," incomplete sentences, and scattered thoughts. Turning that into a professional LinkedIn post takes forever.

**The Solution:** Email your raw transcript to this workflow. It combines your unpolished content with examples from your inspiration document (posts you've saved that match your desired style), then uses AI to create a clean, engaging LinkedIn post.

**What actually happens:**
- You email a raw voice transcript to your workflow email
- The workflow pulls style examples from your Google Doc
- AI reformats your scattered thoughts into a coherent 150-300 word LinkedIn post
- You get an email back with the polished content plus a suggested image description
- Copy, paste, and post to LinkedIn

**You provide:** the raw transcript (from your phone's voice recorder or any transcription tool) and a Google Doc with LinkedIn posts you admire for style reference.

**You get:** professional LinkedIn content that sounds like you, but organized and polished.

**Technical requirements:** Anthropic API, email account, Google Doc with example posts.

This is basically an AI writing assistant that knows your voice and preferred style, turning your brain dumps into professional content.
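The AI step can be sketched in Python with the Anthropic SDK. This is a minimal illustration, not the workflow's actual code: the model name and prompt wording are assumptions, and `style_examples` stands in for the posts pulled from your Google Doc.

```python
def build_prompt(transcript: str, style_examples: list[str]) -> str:
    """Combine the raw transcript with saved style examples into one prompt."""
    examples = "\n---\n".join(style_examples)
    return (
        "Rewrite the raw voice transcript below as a polished LinkedIn post "
        "of 150-300 words, matching the style of these example posts. "
        "Also suggest a short image description.\n\n"
        f"Example posts:\n{examples}\n\nRaw transcript:\n{transcript}"
    )

def polish_transcript(transcript: str, style_examples: list[str]) -> str:
    """Send the combined prompt to Claude and return the polished post."""
    import anthropic  # requires the anthropic package and ANTHROPIC_API_KEY

    client = anthropic.Anthropic()
    response = client.messages.create(
        model="claude-sonnet-4-20250514",  # assumed model; any Claude model works
        max_tokens=1024,
        messages=[{"role": "user", "content": build_prompt(transcript, style_examples)}],
    )
    return response.content[0].text
```

The email-in and email-out steps are handled by n8n nodes; only the prompt assembly and model call are sketched here.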
by Dataki
**Workflow updated on 17/06/2024:** added a "Summarize" node to avoid creating a row for each Notion content block in the Supabase table.

**Store Notion's Pages as Vector Documents into Supabase**

This workflow assumes you have a Supabase project with a table that has a vector column. If you don't have one, follow the instructions in the Supabase Langchain Guide.

**Workflow Description**

This workflow automates the process of storing Notion pages as vector documents in a Supabase database with a vector column. The steps are as follows:

- **Notion Page Added Trigger**: Monitors a specified Notion database for newly added pages. You can create a dedicated Notion database where you copy the pages you want to store in Supabase. Node: *Page Added in Notion Database*
- **Retrieve Page Content**: Fetches all block content from the newly added Notion page. Node: *Get Blocks Content*
- **Filter Non-Text Content**: Excludes blocks of type "image" and "video" to focus on textual content. Node: *Filter - Exclude Media Content*
- **Summarize Content**: Concatenates the Notion block content into a single text for embedding. Node: *Summarize - Concatenate Notion's blocks content*
- **Store in Supabase**: Stores the processed documents and their embeddings in a Supabase table with a vector column. Node: *Store Documents in Supabase*
- **Generate Embeddings**: Uses OpenAI's API to generate embeddings for the textual content. Node: *Generate Text Embeddings*
- **Create Metadata and Load Content**: Loads the block content and creates associated metadata, such as page ID and block ID. Node: *Load Block Content & Create Metadata*
- **Split Content into Chunks**: Divides the text into smaller chunks for easier processing and embedding generation. Node: *Token Splitter*
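The Summarize and Token Splitter steps can be sketched in Python. This is an illustrative sketch, not the template's code: the word-based splitter stands in for n8n's token splitter, and the block shape is a simplified stand-in for Notion's API output.

```python
def concatenate_blocks(blocks: list[dict]) -> str:
    """Join the plain text of non-media Notion blocks into one document,
    so only one row per page (not per block) reaches Supabase."""
    texts = [b["text"] for b in blocks if b.get("type") not in ("image", "video")]
    return "\n".join(texts)

def split_into_chunks(text: str, chunk_size: int = 200, overlap: int = 20) -> list[str]:
    """Split text into overlapping word-based chunks for embedding."""
    words = text.split()
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + chunk_size]))
        if start + chunk_size >= len(words):
            break
    return chunks
```

Each chunk would then be embedded (via OpenAI) and inserted into the Supabase table alongside its metadata.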
by Muhammad Farooq Iqbal
This n8n template demonstrates how to create an automated emotional story generation system that produces structured video prompts and generates corresponding images using AI. The workflow creates a complete story with 5 scenes featuring a Pakistani character named Yusra, converts the scenes into Veo 3 video generation prompts, and generates images for each scene.

**Use cases include:**
- Automated story creation for social media content
- Video pre-production with AI-generated storyboards
- Content creation for educational or entertainment purposes
- Multi-scene narrative development with consistent character design

**Good to know:**
- Uses Gemini 2.5 Flash Lite for story generation and prompt conversion
- Uses Gemini 2.0 Flash Exp for image generation
- The image generation model may be geo-restricted in some regions
- The workflow includes automatic Google Drive organization and Google Sheets tracking

**How it works:**
- **Story Creation**: Gemini AI creates a 5-scene emotional story featuring Yusra, a Pakistani girl aged 20-25 in traditional dress
- **Folder Organization**: AI generates a unique folder name with a timestamp for project organization
- **Google Sheets Setup**: Creates a new sheet to track all scenes and their processing status
- **Scene Processing**: Each scene is processed individually with character and action prompts
- **Veo 3 Prompt Conversion**: Converts natural-language scene descriptions into structured JSON optimized for Veo 3 video generation, including detailed scene descriptions, camera movements and angles, lighting and mood settings, style and quality specifications, and aspect ratios and technical parameters
- **Image Generation**: Uses Gemini's image generation model to create visual representations of each scene
- **File Management**: Automatically uploads images to Google Drive and organizes them in project folders
- **Status Tracking**: Updates Google Sheets with processing status and file URLs
- **Automated Workflow**: Includes conditional logic to handle different processing states and file movements
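The Veo 3 prompt conversion step can be pictured as follows. The template's exact JSON schema is not shown here, so every field name and value below is an assumption based on the parameters listed above:

```python
import json

def scene_to_veo3_prompt(scene_description: str) -> str:
    """Wrap a natural-language scene description in a structured prompt.
    All fields besides the description are illustrative assumptions."""
    prompt = {
        "description": scene_description,
        "camera": {"movement": "slow dolly-in", "angle": "eye level"},
        "lighting": "soft golden hour",
        "mood": "warm, emotional",
        "style": "cinematic, photorealistic",
        "aspect_ratio": "16:9",
    }
    return json.dumps(prompt, indent=2)
```

In the workflow this conversion is done by Gemini from a prompt, not by fixed code; the sketch only shows the kind of structured output being produced.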
**How to use:**
- Execute the workflow manually or set up automated triggers
- The system automatically creates a new story with 5 scenes
- Each scene is processed through the AI pipeline
- Generated images are organized in Google Drive folders
- Track progress through the Google Sheets interface
- The workflow handles all file management and status updates automatically

**Requirements:**
- Gemini API access for both text and image generation
- Google Drive for file storage and organization
- Google Sheets for project tracking and management
- An n8n instance with appropriate node access

**Customizing this workflow:**
- Modify the character description in the Story Creator node
- Adjust the number of scenes by changing the story prompt
- Customize the Veo 3 prompt parameters for different video styles
- Add additional AI models or processing steps
- Integrate with other content creation tools
- Modify the folder naming convention or organization structure

**Technical Features:**
- Automated retry logic for failed operations
- Conditional processing based on status flags
- Batch processing for multiple scenes
- Error handling and status tracking
- File organization with timestamp-based naming
- Integration with Google Workspace services

This template is perfect for content creators, educators, or anyone looking to automate story-based content creation with AI assistance.
by Rajeet Nair
**Overview**

This workflow implements a privacy-preserving AI document processing pipeline that detects, masks, and securely manages Personally Identifiable Information (PII) before any AI processing occurs.

Organizations often need to analyze documents such as invoices, forms, contracts, or reports using AI. However, sending documents containing personal data directly to AI models can create serious privacy, compliance, and security risks. This workflow solves that problem by automatically detecting sensitive information, replacing it with secure tokens, and storing the original values in a protected vault database. Only the masked version of the document is sent to the AI model for analysis. If required, a controlled PII re-injection mechanism can restore original values after processing.

The workflow also records all operations in an audit log, making it suitable for environments requiring strong compliance, such as GDPR, financial services, healthcare, or enterprise document processing systems.

**How It Works**

**1. Document Upload**
A webhook receives a document (typically a PDF) and triggers the workflow.

**2. OCR Text Extraction**
The OCR Extract node extracts the text content from the document so it can be analyzed for sensitive information.

**3. PII Detection**
Multiple detectors analyze the text to identify different types of sensitive data:
- Email addresses (regex detection)
- Phone numbers (multi-pattern detection)
- Identification numbers such as PAN, SSN, or bank accounts
- Physical addresses, detected using an AI model

Each detection includes the detected value, its location in the text, and a confidence score.

**4. Detection Consolidation**
All detected PII results are merged into a single dataset. The workflow resolves overlapping detections and removes duplicates to produce a clean list of sensitive values.
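The regex-based detection and consolidation steps can be sketched as follows. The patterns and confidence scores are illustrative assumptions, not the template's actual detectors (which also cover ID numbers and AI-detected addresses):

```python
import re

# Illustrative patterns; the workflow's real detectors are more elaborate.
PATTERNS = {
    "EMAIL": (re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"), 0.95),
    "PHONE": (re.compile(r"\+?\d[\d\s().-]{7,}\d"), 0.80),
}

def detect_pii(text: str) -> list[dict]:
    """Return detections with value, location, and confidence, then
    consolidate by dropping overlapping hits (highest confidence wins)."""
    detections = []
    for pii_type, (pattern, confidence) in PATTERNS.items():
        for match in pattern.finditer(text):
            detections.append({
                "type": pii_type,
                "value": match.group(),
                "start": match.start(),
                "end": match.end(),
                "confidence": confidence,
            })
    detections.sort(key=lambda d: (d["start"], -d["confidence"]))
    merged, last_end = [], -1
    for d in detections:
        if d["start"] >= last_end:
            merged.append(d)
            last_end = d["end"]
    return merged
```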
**5. Tokenization and Secure Vault Storage**
Each detected PII value is replaced with a secure token, for example `<<EMAIL_7F3A>>` or `<<PHONE_A12B>>`. The original values are securely stored in a Postgres vault table. This ensures sensitive data is never exposed to AI models.

**6. Masked AI Processing**
The masked document is sent to an AI model for structured analysis. Possible AI tasks include:
- Document classification
- Data extraction
- Document summarization
- Entity extraction

Since all sensitive data has been tokenized, the AI processes the document without seeing any real personal data.

**7. Controlled PII Re-Injection**
After AI processing, the workflow can optionally restore original values from the vault. The Re-Injection Controller determines which fields are allowed to restore PII based on defined permissions.

**8. Compliance Audit Logging**
All events are recorded in an audit table, including PII detection, token generation, AI processing, and PII restoration. This provides traceability and compliance reporting.

**Setup Instructions**

**1. Configure the Postgres Database**
Create two tables in your database.

PII Vault Table (example structure): token, original_value, type, document_id, created_at. This table securely stores original PII values mapped to tokens.

Audit Log Table (example structure): document_id, pii_types_detected, token_count, ai_access_confirmed, re_injection_events, timestamp, actor. This table records workflow activity for compliance tracking.

**2. Configure AI Model Credentials**
This workflow supports multiple AI models:
- Anthropic Claude (used for AI document processing)
- Ollama local models (used for address detection)

Configure credentials in n8n before running the workflow.

**3. Configure the Webhook Trigger**
The workflow starts when a document is sent to the webhook: `POST /webhook/gdpr-document-upload`. Upload a PDF file to this endpoint to trigger processing.

**4. Configure Alert Notifications (Optional)**
Replace the placeholder alert webhook URL with your monitoring or alerting system.
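The tokenization (step 5) and controlled re-injection (step 7) can be sketched in Python. The in-memory dict stands in for the Postgres vault table; the token format follows the examples above:

```python
import secrets

def tokenize(text: str, detections: list[dict]) -> tuple[str, dict]:
    """Replace each detected value with a token like <<EMAIL_7F3A>>,
    keeping the token -> original value mapping in a vault."""
    vault = {}
    # Replace from the end of the text so earlier offsets stay valid
    for d in sorted(detections, key=lambda d: d["start"], reverse=True):
        token = f"<<{d['type']}_{secrets.token_hex(2).upper()}>>"
        vault[token] = d["value"]
        text = text[:d["start"]] + token + text[d["end"]:]
    return text, vault

def reinject(text: str, vault: dict, allowed_types: set[str]) -> str:
    """Restore only the PII types the re-injection policy permits."""
    for token, value in vault.items():
        pii_type = token[2:].split("_")[0]
        if pii_type in allowed_types:
            text = text.replace(token, value)
    return text
```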
Example targets: a Slack alert, a monitoring system, or incident notification. Alerts are triggered if masking fails.

**Use Cases**

This workflow is useful for many privacy-sensitive automation scenarios:
- **GDPR-Compliant Document Processing**: safely process documents containing personal data without exposing PII to AI models.
- **AI-Powered Document Analysis**: use AI to summarize or extract data from documents while maintaining privacy.
- **Enterprise Data Redaction Pipelines**: automatically detect and tokenize sensitive data before sending documents to downstream systems.
- **Financial Document Processing**: process invoices, contracts, and financial reports securely.
- **Healthcare Document Automation**: analyze patient documents while ensuring sensitive data is protected.

**Requirements**

To run this workflow you need:
- n8n
- A Postgres database
- Anthropic Claude API access
- Ollama (optional, for local AI address detection)
- A webhook endpoint for document uploads

Optional integrations:
- Monitoring or alert system
- Compliance audit database

**Key Features**
- Automated PII detection and tokenization
- AI-safe document processing
- Secure vault storage for sensitive data
- Controlled PII restoration
- Full audit logging
- Works with multiple AI models
- Designed for GDPR and enterprise compliance

**Summary**

This workflow creates a secure bridge between sensitive documents and AI systems. By automatically detecting, masking, and securely storing personal data, it enables organizations to safely apply AI to document processing tasks without exposing sensitive information. The combination of tokenization, secure vault storage, controlled re-injection, and audit logging makes this workflow suitable for privacy-sensitive industries and enterprise automation pipelines.
by Marketing Canopy
**UTM Link Creator & QR Code Generator with Scheduled Google Analytics Reports**

This workflow enables marketers to generate UTM-tagged links, convert them into QR codes, and automate performance tracking in Google Analytics with scheduled reports every 7 days. It helps monitor traffic sources from different marketing channels and optimize campaign performance based on analytics data.

**Prerequisites**

Before implementing this workflow, ensure you have the following:
- **Google Analytics 4 (GA4) account and access**: a GA4 property set up, with access to the GA4 Data API for scheduled performance tracking. Refer to the Google Analytics Data API Overview for more information.
- **Airtable account and API key**: an Airtable base to store UTM links, QR codes, and analytics data, and an Airtable API key from your Account Settings. Detailed instructions are available in the Airtable API Authentication Guide.

**Step-by-Step Guide to Setting Up the Workflow**

**1. Generate UTM Links**
Create a form or interface to input:
- **Base URL** (e.g., https://example.com)
- **Campaign Name** (utm_campaign)
- **Source** (utm_source)
- **Medium** (utm_medium)
- **Term** (optional: utm_term)
- **Content** (optional: utm_content)

Append the UTM parameters to generate a trackable URL.

**2. Store UTM Links & QR Codes in Airtable**
Set up an Airtable base with the following columns: UTM Link, QR Code, Campaign Name, Source, Medium, Date Created. Adjust as needed based on your tracking requirements. For guidance on setting up your Airtable base and using the API, refer to the Airtable Web API documentation.

**3. Convert UTM Links to QR Codes**
Use a QR code generator API (e.g., goqr.me, qrserver.com) to generate QR codes for each UTM link and store them in Airtable.

**4. Schedule Google Analytics Performance Reports (Every 7 Days)**
Use the Google Analytics Data API to pull weekly performance reports based on UTM parameters.
Extract key metrics such as sessions, users, bounce rate, conversions, and revenue (if applicable). Store the data in Airtable for tracking and analysis, and adjust the timeframe as needed. For more details on accessing and using the Google Analytics Data API, consult the Google Analytics Data API Overview.

**Benefits of This Workflow**
- ✅ **Track marketing campaigns**: easily monitor which channels drive traffic.
- ✅ **Automate QR code creation**: seamless integration of UTM links with QR codes.
- ✅ **Scheduled Google Analytics reports**: no manual reporting; everything runs automatically.
- ✅ **Improve data-driven decisions**: optimize ad spend and marketing strategies based on performance insights.
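Steps 1 and 3 above can be sketched in Python: append the UTM parameters to a base URL, then form a QR code image URL with the free qrserver.com API. Verify the exact QR parameters against that service's documentation before relying on them:

```python
from urllib.parse import urlencode, quote

def build_utm_link(base_url: str, campaign: str, source: str, medium: str,
                   term: str = "", content: str = "") -> str:
    """Append UTM parameters to produce a trackable URL."""
    params = {
        "utm_campaign": campaign,
        "utm_source": source,
        "utm_medium": medium,
    }
    if term:
        params["utm_term"] = term
    if content:
        params["utm_content"] = content
    return f"{base_url}?{urlencode(params)}"

def qr_code_url(link: str, size: int = 300) -> str:
    """Build a qrserver.com image URL that renders the link as a QR code."""
    return (f"https://api.qrserver.com/v1/create-qr-code/"
            f"?size={size}x{size}&data={quote(link, safe='')}")
```

Both values (the UTM link and the QR image URL) can then be written to the Airtable columns described in step 2.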
by Max Mitcham
Want to check out all my flows? Follow me on:
- https://maxmitcham.substack.com/
- https://www.linkedin.com/in/max-mitcham/

**Email Manager - Intelligent Gmail Classification**

This automation flow automatically monitors incoming Gmail messages, analyzes their content and context using AI, and intelligently classifies them with appropriate labels for better email organization and prioritization.

⚙️ **How It Works (Step-by-Step):**

📧 **Gmail Monitoring (Trigger)**
Continuously monitors your Gmail inbox:
- Polls for new emails every minute
- Captures all incoming messages automatically
- Triggers the workflow for each new email received

📖 **Email Content Extraction**
Retrieves complete email details:
- Full email body and headers
- Sender information and recipient lists
- Subject line and metadata
- Existing Gmail labels and categories
- Email threading information (replies/forwards)

🔍 **Email History Analysis**
An AI agent checks relationship context:
- Searches for previous emails from the same sender
- Checks the sent folder for prior outbound correspondence
- Determines if this is a first-time contact (cold email)
- Analyzes conversation thread history

🤖 **Intelligent Classification Agent**
Advanced AI categorization using:
- Claude Sonnet 4 for sophisticated email analysis
- Context-aware classification based on email history
- Content analysis for intent and urgency detection
- Header analysis for automated vs. human-sent emails

🏷️ **Smart Label Assignment**
Automatically applies appropriate Gmail labels:
- **To Respond**: requires direct action/reply
- **FYI**: for awareness, no action needed
- **Notification**: service updates, policy changes
- **Marketing**: promotional content and sales pitches
- **Meeting Update**: calendar-related communications
- **Comment**: document/task feedback

📋 **Structured Processing**
Ensures consistent labeling:
- Uses structured output parsing for reliability
- Returns a specific label ID for Gmail integration
- Applies the label to the email automatically
- Maintains classification accuracy

🛠️ **Tools Used:**
- n8n: workflow automation platform
- Gmail API: email monitoring and label management
- Anthropic Claude: advanced email content analysis
- Gmail tools: email history checking and search
- Structured Output Parser: consistent AI responses

📦 **Key Features:**
- Real-time email monitoring and classification
- Context-aware analysis using email history
- Intelligent cold vs. warm email detection
- Multiple classification categories for organization
- Automatic Gmail label application
- Header analysis for automated email detection
- Thread-aware conversation tracking

🚀 **Ideal Use Cases:**
- Busy executives managing high email volumes
- Sales professionals prioritizing prospect communications
- Support teams organizing customer inquiries
- Marketing teams filtering promotional content
- Anyone wanting automated email organization
- Teams needing consistent email prioritization
by Yaron Been
**Telegram AI Assistant: Summarize Links & Generate Images On Demand**

This workflow turns any Telegram chat into a smart assistant. By typing simple commands like /summary or /img, users can trigger powerful AI actions directly from Telegram.

✨ **What It Does**

This automation listens for specific commands in Telegram messages:
- **/help**: sends a help menu explaining the available commands.
- **/summary <link>**: fetches a webpage, extracts its content, and summarizes it using OpenAI into 10-12 bullet points.
- **/img <prompt>**: sends the image prompt to OpenAI and replies that the request has been received (designed for future integration with image APIs).

📦 **Features**
- ✅ Works instantly in Telegram
- 🧠 Uses OpenAI for text summarization and image prompt processing
- 🌐 Scrapes and cleans raw article text before summarizing
- 📤 Replies directly to the same Telegram thread
- 🔧 Easily expandable to support more commands

🔧 **Use Cases**
- **Research summaries**: quickly condense articles or reports shared in chat.
- **Content review**: get team-friendly TL;DRs of long blog posts or product pages.
- **Creative brainstorming**: share visual ideas via /img and get quick prompts logged.
- **Customer support**: offer instant answers in group chats (with further extension).
- **Daily digest bot**: connect to news feeds and auto-summarize updates.

🚀 **Getting Started**
1. Clone this workflow and connect your Telegram bot.
2. Insert your OpenAI credentials.
3. Deploy and test by messaging /summary https://example.com in your Telegram group or DM.
4. Expand with new commands, or connect Stability.ai or other services for real image generation.

🔗 **Author & Resources**
Built by Yaron Been. Follow more automations at nofluff.online
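The command-routing step at the top of the workflow can be sketched in Python. This is an illustrative helper, not the workflow's actual node logic:

```python
def route_command(message_text: str) -> tuple[str, str]:
    """Return (command, argument) for a Telegram message.
    Unknown slash commands return ("unknown", ...); plain text returns ("none", "")."""
    text = message_text.strip()
    if not text.startswith("/"):
        return ("none", "")
    parts = text.split(maxsplit=1)
    command = parts[0].lstrip("/").lower()
    argument = parts[1] if len(parts) > 1 else ""
    if command in ("help", "summary", "img"):
        return (command, argument)
    return ("unknown", argument)
```

Each returned command maps to a branch of the workflow: /help replies with the menu, /summary passes the link to the scrape-and-summarize chain, and /img forwards the prompt.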
by Samir Saci
**Tags**: Supply Chain, Logistics, AI Agents

**Context**
Hey! I'm Samir, a Supply Chain Data Scientist from Paris and the founder of LogiGreen Consulting. We design tools to help companies improve their logistics processes using data analytics, AI, and automation, reducing costs and minimizing environmental impacts.

> Let's use n8n to improve logistics operations!

📬 For business inquiries, you can add me on LinkedIn.

**Who is this template for?**
This workflow template is designed for logistics or manufacturing operations that receive orders by email. The example above illustrates the challenge we want to tackle: using an AI Agent to parse the order information and load it into a Google Sheet. If you want to understand how I built this workflow, check my detailed tutorial: 🎥 Step-by-Step Tutorial

**How does it work?**
The workflow is connected to a Gmail Trigger that opens all emails with "Inbound Order" in their subject. Each email is parsed by an AI Agent equipped with OpenAI's GPT to collect all the order information. The results are recorded in a Google Sheet. These order lines can then be transferred to warehouse teams to prepare order receiving.

**What do I need to get started?**
You'll need:
- Gmail and Google Drive accounts with API credentials so n8n can access them
- An OpenAI API key (GPT-4o) for the chat model
- A Google Sheet with these columns: PO_NUMBER, EXPECTED_DELIVERY DATE, SKU_ID, QUANTITY

**Next Steps**
Follow the sticky notes in the workflow to configure each node and start using AI to support your logistics operations. 🚀

Curious how n8n can transform your logistics operations? 📬 Let's connect on LinkedIn.

**Notes**
An example email is included in the template so you can try the workflow with your own mailbox. This workflow was built using n8n version 1.82.1.

Submitted: March 28, 2025
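The AI Agent's structured output can be pictured as one JSON object per order line, mirroring the sheet columns above. This sketch is illustrative; the validation helper and sample values are assumptions, not part of the template:

```python
def validate_orderlines(orderlines: list[dict]) -> list[dict]:
    """Keep only rows that carry every required sheet column."""
    required = {"PO_NUMBER", "EXPECTED_DELIVERY DATE", "SKU_ID", "QUANTITY"}
    return [row for row in orderlines if required <= row.keys()]

# Example of what the agent might return for a two-line order email
parsed = [
    {"PO_NUMBER": "PO-1042", "EXPECTED_DELIVERY DATE": "2025-04-02",
     "SKU_ID": "SKU-889", "QUANTITY": 120},
    {"PO_NUMBER": "PO-1042", "SKU_ID": "SKU-890"},  # incomplete: dropped
]
```

Valid rows map one-to-one onto appended Google Sheet rows, so the warehouse team sees exactly the columns listed above.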
by Yaron Been
Replace manual task prioritization with intelligent AI reasoning that thinks like a Chief Operating Officer. This workflow automatically fetches your Asana tasks every morning, analyzes them using advanced AI models, and delivers the single most critical task with detailed reasoning, ensuring your team always focuses on what matters most.

✨ **What This Workflow Does:**
- 📋 **Automated Task Collection**: fetches all assigned Asana tasks daily at 9 AM
- 🤖 **AI-Powered Analysis**: uses OpenAI GPT-4 to evaluate urgency, impact, and strategic importance
- 🎯 **Smart Prioritization**: identifies the #1 most critical task with detailed reasoning
- 🧠 **Contextual Memory**: leverages a vector database for historical context and pattern recognition
- 💾 **Structured Storage**: saves prioritized tasks to PostgreSQL with a full audit trail
- 🔄 **Continuous Learning**: builds organizational knowledge over time for better decisions

🔧 **Key Features:**
- Daily automation with zero manual intervention
- Context-aware AI that learns from past prioritization decisions
- Strategic reasoning explaining why each task is prioritized
- Vector-powered memory using Pinecone for intelligent context retrieval
- Clean structured output with task names, priority levels, and detailed justifications
- Database integration for reporting and historical analysis

📋 **Prerequisites:**
- Asana account with API access
- OpenAI API key (GPT-4 recommended)
- PostgreSQL database
- Pinecone account (for vector storage and context)

🎯 **Perfect For:**
- Operations teams managing multiple competing priorities
- Startups needing systematic task management
- Project managers juggling complex workflows
- Leadership teams requiring strategic focus
- Any organization wanting AI-driven operational intelligence

💡 **How It Works:**
1. **Morning Automation**: triggers every day at 9 AM
2. **Data Collection**: pulls all relevant tasks from Asana
3. **AI Analysis**: evaluates each task using COO-level strategic thinking
4. **Context Retrieval**: searches the vector database for similar past tasks
5. **Smart Prioritization**: identifies the single most important task
6. **Structured Output**: delivers the priority level with detailed reasoning
7. **Data Storage**: saves results for reporting and continuous improvement

📦 **What You Get:**
- Complete n8n workflow with all AI components configured
- PostgreSQL database schema for task storage
- Vector database setup for contextual intelligence
- Comprehensive documentation and setup guide
- Sample task data and output examples

💡 **Need Help or Want to Learn More?**
Created by Yaron Been - Automation & AI Specialist
📧 Support: Yaron@nofluff.online
🎥 YouTube Tutorials: https://www.youtube.com/@YaronBeen/videos
💼 LinkedIn: https://www.linkedin.com/in/yaronbeen/

Discover more advanced automation workflows and AI integration tutorials on my channels!

🏷️ **Tags**: AI, OpenAI, Asana, Task Management, COO, Prioritization, Automation, Vector Database, Operations, GPT-4
by Dataki
This workflow demonstrates how to enrich data from a list of companies in a spreadsheet. While this workflow is production-ready if all steps are followed, adding error handling would enhance its robustness.

**Important notes**
- **Check legal regulations**: This workflow involves scraping, so make sure to check the legal regulations around scraping in your country before getting started. Better safe than sorry!
- **Mind those tokens**: OpenAI tokens can add up fast, so keep an eye on usage unless you want a surprising bill that could knock your socks off! 💸

**Main Workflow**

**Node 1 - Webhook**
This node triggers the workflow via a webhook call. You can replace it with any other trigger of your choice, such as a form submission, a new row added in Google Sheets, or a manual trigger.

**Node 2 - Get Rows from Google Sheet**
This node retrieves the list of companies from your spreadsheet. Here is the Google Sheet template you can use. The columns in this Google Sheet are:
- **Company**: The name of the company (required at this step)
- **Website**: The website URL of the company (required at this step)
- **Business Area**: The business area deduced by OpenAI from the scraped data
- **Offer**: The offer deduced by OpenAI from the scraped data
- **Value Proposition**: The value proposition deduced by OpenAI from the scraped data
- **Business Model**: The business model deduced by OpenAI from the scraped data
- **ICP**: The Ideal Customer Profile deduced by OpenAI from the scraped data
- **Additional Information**: Information related to the scraped data, including:
  - Information Sufficiency: indicates whether the information was sufficient to provide a full analysis ("Sufficient" or "Insufficient")
  - Insufficient Details: if labeled "Insufficient," specifies what information was missing or needed to complete the analysis
  - Mismatched Content: indicates whether the page content aligns with that of a typical company page
  - Suggested Actions: provides recommendations if the page content is insufficient or mismatched, such as verifying the URL or searching for alternative sources

**Node 3 - Loop Over Items**
This node ensures that, in subsequent steps, the website in "extra workflow input" corresponds to the row being processed. You can delete this node, but you'll need to ensure that the "query" sent to the scraping workflow corresponds to the website of the specific company being scraped (rather than just the first row).

**Node 4 - AI Agent**
This AI agent is configured with a prompt to extract data from the content it receives. The node has three sub-nodes:
- **OpenAI Chat Model**: The model currently used is gpt-4o-mini.
- **Call n8n Workflow**: This sub-node calls the workflow that uses ScrapingBee and retrieves the scraped data.
- **Structured Output Parser**: This parser structures the output for clarity and ease of use, and then adds rows to the Google Sheet.

**Node 5 - Update Company Row in Google Sheet**
This node updates the specific company's row in Google Sheets with the enriched data.

**Scraper Agent Workflow**

**Node 1 - Tool Called from Agent**
This is the trigger for when the AI Agent calls the scraper. A query is sent with the company name and the website URL.

**Node 2 - Set Company URL**
This node renames a field, which may seem trivial but is useful for performing transformations on the data received from the AI Agent.

**Node 3 - ScrapingBee: Scrape Company's Website**
This node scrapes data from the URL provided, using ScrapingBee. You can use any scraper of your choice, but ScrapingBee is recommended because it allows you to configure scraper behavior directly. Once configured, copy the provided "curl" command and import it into n8n.

**Node 4 - HTML to Markdown**
This node converts the scraped HTML data to Markdown, which is then sent to OpenAI. The Markdown format generally uses fewer tokens than HTML.
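The token saving in the HTML-to-Markdown step comes from discarding markup and keeping only visible text. Real converters (like n8n's Markdown node) preserve structure properly; this stdlib sketch just strips tags to illustrate the size difference:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect only the visible text nodes of an HTML document."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        if data.strip():
            self.parts.append(data.strip())

    def text(self) -> str:
        return "\n".join(self.parts)

def strip_html(html: str) -> str:
    extractor = TextExtractor()
    extractor.feed(html)
    return extractor.text()
```

On a typical homepage, the stripped text is a small fraction of the raw HTML, which translates directly into fewer OpenAI tokens per scraped page.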
**Improving the Workflow**
It's always a pleasure to share workflows, but creators sometimes want to keep some magic to themselves ✨. Here are some ways you can enhance this workflow:
- Handle potential errors.
- Configure the scraper tool to scrape other pages on the website. Although this will cost more tokens, it can be useful (e.g., scraping "Pricing" or "About Us" pages in addition to the homepage).
- Instead of Google Sheets, connect directly to your CRM to enrich company data.
- Trigger the workflow from form submissions on your website and send the scraped data about the lead to a Slack or Teams channel.
by AlQaisi
**Unleashing Creativity: Transforming Children's English Storytelling with Automation and AI**

Check this example: https://t.me/st0ries95

**Summary**
In the realm of children's storytelling, automation is revolutionizing the way captivating tales are created and shared. This article highlights the transformative power of setting up a workflow for AI-powered children's English storytelling on Telegram. By delving into the use cases and steps involved, we uncover how this innovative approach is inspiring young minds and fostering a love for storytelling in children.

**Use case**
The workflow for children's stories is a game-changer for content creators, educators, and parents seeking to engage children through imaginative and educational storytelling. Here's how this workflow is making a difference:
- **Streamlined Content Creation**: By providing a structured framework and automation for story generation, audio creation, and image production, the workflow simplifies the process of crafting captivating children's stories.
- **Enhanced Educational Resources**: Teachers can leverage this workflow to develop interactive educational materials that incorporate storytelling, making learning more engaging for students.
- **Personalized Parental Engagement**: Parents can share personalized stories with their children, nurturing a passion for reading and creativity while strengthening family bonds through shared storytelling experiences.
- **Community Connection**: Organizations and community groups can use the workflow to connect with their audience and promote literacy and creativity by creating and sharing children's stories.
- **Inspiring Imagination**: Through automated creation and sharing of enchanting stories, the workflow aims to spark imagination, inspire young minds, and instill a love for storytelling in children.

**Node Explanation**
- **OpenAI Chat Model**: Utilizes the OpenAI Chat Model to generate text for the children's stories.
- **Schedule Trigger**: Triggers the workflow at set intervals (every 12 hours) to generate new stories.
- **Recursive Character Text Splitter**: Splits text into smaller chunks for processing.
- **OpenAI Chat Model2**: A second OpenAI Chat Model node that generates prompts for image creation.
- **Send Story Text**: Sends the generated story text to a specified Telegram chat.
- **Send Audio for the Story**: Sends audio files of the stories to the Telegram chat.
- **Send Story Picture**: Shares images related to the stories on Telegram.
- **Create a Kids Stories**: Generates captivating short tales for kids using the prompts provided.
- **Generate Audio for the Story**: Converts the generated text into audio files for storytelling.
- **Create a Prompt for DALL-E**: Creates prompts for generating images related to the stories.
- **Generate a Picture for the Story**: Generates pictures based on the prompts for visual storytelling.

By embracing automation in children's storytelling, we unleash creativity, inspire young minds, and create magical experiences that resonate with both storytellers and listeners alike.
by Anthony
This workflow lets you OCR-recognize a folder of receipts or invoices (make sure your files are in .pdf, .png, or .jpg format). The workflow can be triggered via the "Test workflow" button, and it also monitors the folder for new files, automatically recognizing them.

**Video Demo**
https://youtu.be/mGPt7fqGQD8

**1. n8n import glitch**
After import, the trigger node "When clicking 'Test workflow'" might be disconnected. You need to connect it via two arrows to the "Google Sheets1" and "Google Drive" nodes. The workflow has two triggers, via the button and via the Google Sheets "new file" event, and both of these triggers should be connected to those two nodes. Here is how it should look: https://ocr.oakpdf.com/n8n_fix.png

**2. Set up the RapidAPI HTTP auth key**
Create a new "HTTP header" n8n credential and paste into it your RapidAPI key from https://rapidapi.com/restyler/api/receipt-and-invoice-ocr-api. See https://ocr.oakpdf.com/n8n_api_key.png. Make sure the "HTTP Request" node uses this credential.

**3. Set up your Google auth**
You need a Google connection to work with your Google Sheets and Google Drive accounts: https://docs.n8n.io/integrations/builtin/credentials/google/oauth-generic/#finish-your-n8n-credential

**4. Set up Google Sheets**
Copy this Google Sheets document: https://docs.google.com/spreadsheets/d/1G0w-OMdFRrtvzOLPpfFJpsBVNqJ9cfRLMKCVWfrTQBg/edit?usp=sharing

**Custom document formats and advanced usage**
Email: contact@scrapeninja.net
Linkedin: https://www.linkedin.com/in/anthony-sidashin/
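The OCR call made by the "HTTP Request" node can be sketched in Python. The host and endpoint path below are assumptions based on RapidAPI's usual naming conventions for this listing; only the X-RapidAPI-Key / X-RapidAPI-Host header scheme is standard RapidAPI behavior:

```python
import urllib.request

RAPIDAPI_HOST = "receipt-and-invoice-ocr-api.p.rapidapi.com"  # assumed host

def build_ocr_request(pdf_bytes: bytes, api_key: str) -> urllib.request.Request:
    """Build the POST request the n8n HTTP Request node would send,
    authenticated with the RapidAPI header credential."""
    return urllib.request.Request(
        url=f"https://{RAPIDAPI_HOST}/recognize",  # hypothetical path
        data=pdf_bytes,
        headers={
            "X-RapidAPI-Key": api_key,
            "X-RapidAPI-Host": RAPIDAPI_HOST,
            "Content-Type": "application/pdf",
        },
        method="POST",
    )
```

Check the actual endpoint path and request body format on the RapidAPI listing before use; in n8n this is all configured inside the HTTP Request node rather than in code.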