by Ficky
Build a Redis-Powered CRUD App with HTML Frontend

This workflow demonstrates how to use n8n to build a complete, self-contained CRUD (Create, Read, Update, Delete) application without relying on any external server or hosting. It not only acts as the backend, handling all CRUD operations through Webhook endpoints, but also serves a fully functional HTML Single Page Application (SPA) directly via a webhook response. Redis is used as a lightweight data store, providing fast and simple key-value storage with auto-incremented IDs.

Because both the frontend (HTML app) and backend (API endpoints) are managed entirely within a single n8n workflow, you can quickly prototype or deploy small tools without additional infrastructure. This approach is ideal for:

- Rapidly creating no-code or low-code applications
- Running fully browser-based tools served directly from n8n
- Teaching or demonstrating n8n + Redis integration in a single workflow

Features

- Add new items with auto-incremented IDs
- Edit existing items
- Delete a specific item
- Reset all data (clear storage and reset the auto-increment ID)
- Single HTML frontend for demonstration (no framework required)

Setup Instructions

1. Prerequisites

Before importing and running the workflow, make sure you have:

- A running n8n instance (self-hosted or cloud)
- A running Redis server (local or remote)

2. API Path Setup

For the REST API, use a consistent path. For example, if you choose items as the path:

- 2a. **Get All Items** - Method: GET, Endpoint: items
- 2b. **Add Item** - Method: POST, Endpoint: items
- 2c. **Edit Item** - Method: PUT, Endpoint: items
- 2d. **Delete Item** - Method: DELETE, Endpoint: items
- 2e. **Reset Items** - Method: POST, Endpoint: items-reset

3. Configure the API URL

Set the API URL in the SET API URL node. Use your n8n webhook URL, for example: https://yourn8n.com/webhook/items

4. Run the HTML App

Once everything is set:

- Open the webhook URL for the HTML app in a browser.
- The CRUD interface will load and connect to the API endpoints automatically.
- You can now add, edit, delete, or reset items directly from the web interface.

Workflows

1. Render the HTML CRUD App

This webhook serves a self-contained HTML Single Page Application (SPA) for basic CRUD operations. The HTML content is returned directly in the webhook response. This setup is ideal for lightweight, browser-based tools without external hosting.

How to use:

- Open the webhook URL in a browser.
- The CRUD interface will load and connect to the data source via API calls.
- Before using, make sure to edit the api_url in the SET API URL node to match your webhook endpoint.

2a. REST API: Get All Items

This webhook handles retrieving all saved items from Redis. Each item is returned with its corresponding ID and associated data (e.g., name). This endpoint is used by the HTML CRUD App to display the full list of items.

- **Method**: GET
- **Function**: Fetches all items stored in Redis and returns them as a JSON array

2b. REST API: Add Item

This webhook handles the Add Item functionality. It is typically called by the HTML CRUD App when adding a new item.

- **Method**: POST
- **Request Body**: { "name": "item name" }
- **Function**: Generates an auto-incremented ID using Redis and saves the data under that ID

2c. REST API: Edit Item

This webhook handles updating an existing item in Redis.

- **Method**: PUT
- **Request Body**: { "id": 1, "name": "Updated Item Name" }
- **Function**: Finds the item by the given id and updates its data in Redis

2d. REST API: Delete Item

This webhook handles deleting a specific item from Redis.

- **Method**: DELETE
- **Request Body**: { "id": 1 }
- **Function**: Removes the item with the given id from Redis

2e. REST API: Reset Items

This webhook handles resetting all data in the application.

- **Method**: POST
- **Function**: Deletes all stored items from Redis and resets the auto-increment ID
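As a quick illustration, here is a minimal browser-side sketch of how the SPA might call these endpoints; the base URL is a placeholder and should be replaced with your own webhook URL:

```javascript
// Minimal sketch of browser-side calls to the CRUD endpoints described above.
// The base URL is a placeholder - replace it with your own n8n webhook URL.
const API_URL = "https://yourn8n.com/webhook/items";

async function listItems() {
  const res = await fetch(API_URL); // GET -> all items as a JSON array
  return res.json();
}

async function addItem(name) {
  const res = await fetch(API_URL, {
    method: "POST", // POST -> create with an auto-incremented ID
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name }),
  });
  return res.json();
}

async function editItem(id, name) {
  const res = await fetch(API_URL, {
    method: "PUT", // PUT -> update an existing item
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ id, name }),
  });
  return res.json();
}

async function deleteItem(id) {
  const res = await fetch(API_URL, {
    method: "DELETE", // DELETE -> remove the item by id
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ id }),
  });
  return res.json();
}
```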
by David Olusola
JavaScript Master Class - Interactive Code Tutorial

How It Works

This tutorial is designed as a self-paced learning experience where you explore working JavaScript code examples. Unlike traditional tutorials, you learn by examining real implementations and understanding how they work.

The Learning Method:

1. Execute first - see the workflow in action
2. Open each node - this is where the real learning happens!
3. Study the code - read the JavaScript implementations and comments
4. Understand the flow - see how data transforms between nodes
5. Experiment - modify code to test your understanding

The "Game" Concept:

- It's not a real game - it's a gamified learning experience
- Uses RPG elements (XP, levels, achievements) to make learning engaging
- Simulates progression through 3 difficulty levels
- **The main learning happens when you open nodes and read the code!**

Setup Steps

Step 1: Import the Template

1. Copy the JSON template provided
2. Open your n8n instance
3. Create a new workflow
4. Press Ctrl+A (or Cmd+A on Mac) to select all
5. Press Ctrl+V (or Cmd+V) to paste the JSON
6. Click "Save" and name it: JavaScript Master Class - Interactive Tutorial

Step 2: Execute the Workflow

1. Click "Test workflow" or "Execute workflow"
2. Watch it run through all nodes automatically
3. See the final results and progression simulation

Step 3: Start Learning (The Important Part!)

Now the real learning begins - you must open each node manually.

For each Code node:

- Double-click the node to open it
- Read the JavaScript code carefully
- Study the comments - they explain key concepts
- Understand the logic - how input becomes output
- Note the techniques used in each challenge

For each sticky note:

- Read the explanations and context
- Understand the learning objectives
- Note the skills being taught

Learning Path

Level 1: Data Warrior (Beginner)

- Open node: "Level 1: Data Warrior"
- **Focus:** Data deduplication using filter() and findIndex()
- **Key Skills:** Array methods, duplicate detection
- **What to Study:** How the deduplication algorithm works (a minimal sketch of the pattern appears after this section)

Level 2: API Ninja (Intermediate)

- Open node: "Level 2: API Ninja"
- **Focus:** Data transformation and validation
- **Key Skills:** String manipulation, validation logic, error handling
- **What to Study:** How to clean and validate messy API data

Level 3: Automation Master (Advanced)

- Open node: "Final Boss: Automation Master"
- **Focus:** Complex workflow processing
- **Key Skills:** Task orchestration, priority sorting, error handling
- **What to Study:** How to build robust automation systems

Learning Tips

Active Exploration:

- **Don't just run it** - open every single node!
- **Read all comments** - they contain key insights
- **Compare approaches** - see how the complexity increases
- **Try modifications** - change values and see what happens

Study Techniques:

- **Take notes** on patterns you see
- **Copy interesting code** snippets for reference
- **Try to explain** each function to yourself
- **Test your understanding** by modifying the code

Experimentation:

- **Change filter conditions** in Level 1
- **Modify validation rules** in Level 2
- **Adjust workflow logic** in Level 3
- **Break something** and fix it - great for learning!
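Since Level 1 centers on deduplication with filter() and findIndex(), here is a minimal sketch of that pattern; the `email` key is an illustrative assumption, and the actual node may deduplicate on a different field:

```javascript
// Minimal sketch of the filter() + findIndex() deduplication pattern
// taught in Level 1. The "email" key is an illustrative assumption -
// the actual node may deduplicate on a different field.
const records = [
  { name: "Ada",  email: "ada@example.com"  },
  { name: "Ada",  email: "ada@example.com"  }, // duplicate
  { name: "Alan", email: "alan@example.com" },
];

// Keep an item only if its position equals the position of the FIRST
// item with the same key - later duplicates fail this check.
const unique = records.filter(
  (item, index, all) =>
    index === all.findIndex((other) => other.email === item.email)
);

console.log(unique.length); // 2
```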
Important Notes

"Game" Reality Check:

- This is NOT an interactive game where you make choices
- It's a code tutorial with game-like progression themes
- The "game" runs automatically when executed
- **Real learning happens when you manually open and study each node**

Educational Value:

- **Primary learning:** Understanding JavaScript implementations
- **Secondary learning:** n8n workflow patterns
- **Bonus learning:** Problem-solving approaches

Technical Requirements:

- A working n8n instance
- Basic JavaScript knowledge (helpful but not required)
- Willingness to explore and experiment

Success Metrics

You'll know you're learning when you can:

- Explain how each deduplication algorithm works
- Identify the validation patterns used
- Understand the workflow orchestration logic
- Modify the code to handle different scenarios
- Apply these patterns to your own projects

Next Steps

After completing this tutorial:

1. Apply the patterns to your own workflows
2. Experiment with variations
3. Build something using these techniques
4. Share your learnings with the community

Remember: the magic happens when you open each node and study the code!
by Custom Workflows AI
Introduction

The "High-Level Service Page SEO Blueprint Report" workflow is a powerful, AI-driven solution designed to generate comprehensive SEO content strategies for service-based businesses. By analyzing competitor websites and user intent, this workflow creates a detailed blueprint that outlines the optimal structure, content, and conversion elements for a service page.

The workflow leverages the JINA Reader API to extract content from competitor websites and uses Google Gemini AI to perform deep analysis across multiple dimensions: competitor content structure, user intent, strategic opportunities, and conversion optimization. The final output is a professionally formatted Markdown document that provides actionable guidance for creating a high-performing service page that satisfies both user needs and search engine requirements.

This workflow eliminates the time-consuming process of manually analyzing competitors and developing content strategies, providing a data-driven foundation for service page creation that would typically require hours of expert analysis.

Who is this for?

This workflow is designed for digital marketers, SEO specialists, content strategists, and web developers who need to create or optimize service pages for businesses. It's particularly valuable for marketing agencies and freelancers who regularly develop content strategies for clients across various industries.

Users should have a basic understanding of SEO concepts, content marketing, and website structure. While technical SEO knowledge is beneficial, the workflow is designed to provide comprehensive guidance even for those with intermediate-level expertise. The ideal user is someone who wants to streamline their content planning process and ensure their service pages are built on data-driven insights rather than guesswork.

What problem is this workflow solving?

Creating effective service pages that rank well in search engines while converting visitors is a complex challenge that typically requires extensive competitive research, content planning, and conversion optimization expertise. This workflow addresses several key pain points:

1. Time-consuming competitor analysis: Manually analyzing multiple competitor websites to identify content patterns, heading structures, and meta tag strategies can take hours.
2. Difficulty identifying content gaps: Determining which topics competitors are missing that could provide a competitive advantage requires deep analysis and industry knowledge.
3. Balancing SEO and conversion elements: Creating content that satisfies both search engines and user needs while driving conversions is a delicate balance that many struggle to achieve.
4. Lack of a structured approach: Many content creators work without a comprehensive blueprint, leading to inconsistent results and missed opportunities.
5. Difficulty translating analysis into actionable recommendations: Even when analysis is performed, turning those insights into a concrete content plan can be challenging.

This workflow automates these processes, providing a structured, data-driven approach to service page creation that saves hours of research and planning time.

What this workflow does

Overview

The workflow takes a list of competitor URLs and a target keyword as input, then performs a multi-stage analysis to generate a comprehensive service page blueprint. It extracts and analyzes competitor content, evaluates user intent, identifies strategic opportunities, and creates detailed recommendations for page structure, content, and conversion elements. The final output is a professionally formatted Markdown document that serves as a complete roadmap for creating an effective service page.

Process

1. Data Collection: The workflow begins with a form that collects essential information: competitor URLs, target keyword, services offered, brand name, and whether the page is a homepage.
2. Competitor Content Extraction: The workflow processes each competitor URL, using the JINA Reader API to extract the HTML content from each site.
3. Content Structure Analysis: For each competitor site, the workflow extracts and analyzes heading structures, meta tags, schema markup, and recurring phrases (n-grams).
4. Competitor Analysis Report: The AI synthesizes the competitive data to identify patterns in meta titles/descriptions, common outline sections, key heading concepts, and structural elements.
5. User Intent Analysis: The workflow analyzes the target keyword to determine primary and secondary user intents, user personas, and their position in the buyer's journey.
6. Gap Analysis: The AI identifies content overlaps ("table stakes"), content gaps (opportunities), SEO keyword priorities, and potential UX/conversion advantages.
7. Page Outline Generation: Based on the previous analyses, the workflow creates an optimal page structure with H1, H2s, H3s, and potentially H4s, with justifications for each section.
8. UX & Conversion Recommendations: The workflow adds detailed recommendations for calls-to-action, trust signals, copywriting tone, visual elements, and risk-reversal strategies.
9. Final Blueprint Creation: All analyses and recommendations are compiled into a comprehensive, well-structured Markdown document that serves as a complete service page blueprint.

Setup

1. Download or import the "High-Level Service Page SEO Blueprint Report" workflow JSON file into your n8n instance.
2. Create a JINA Reader API key by visiting https://jina.ai/api-dashboard/key-manager. You can claim a free API key that allows up to 1 million tokens.
3. Set up Google Gemini (PaLM) credentials by following the guide at https://docs.n8n.io/integrations/builtin/credentials/googleai/#using-geminipalm-api-key.
4. Update the "Edit Fields" node with:
   - Your JINA Reader API key
   - The "Waiting Time" (set it to 20 seconds if using the free Google Gemini API tier, which is limited to 5 requests per minute)
   - Optionally, a different Gemini model
5. Activate the workflow and start the form trigger.
6. Complete the form with:
   - Competitors (up to 5 direct competitor URLs)
   - Target Keyword (the query related to your service)
   - Services Offered (details of your complete service offerings)
   - Brand Name (your company name)
   - Whether the page is a homepage
7. After processing, download the generated .txt file, which contains the blueprint in Markdown format.

How to customize this workflow to your needs

- Adjust AI parameters: Modify the temperature settings in the Google Gemini Chat Model nodes to control creativity vs. precision in the AI outputs.
- Customize extraction logic: Edit the "Extract HTML Elements" code node to focus on the HTML elements most relevant to your industry or content type (see the sketch below).
- Modify analysis prompts: Customize the prompts in the various analysis nodes to focus on the aspects of SEO or content strategy that matter most for your use case.
- Add industry-specific guidance: Enhance the prompts with industry-specific instructions or examples to make the output more relevant to particular sectors.
- Integrate with content management systems: Extend the workflow to automatically send the blueprint to content management systems, project management tools, or document storage platforms.
- Add competitor scoring: Implement a scoring system to evaluate and rank competitors based on criteria relevant to your strategy.
- Expand the analysis: Add analysis nodes that evaluate other aspects of competitor websites, such as page speed, mobile-friendliness, or backlink profiles.
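For instance, here is a minimal sketch of the kind of logic an "Extract HTML Elements" code node might contain; this is an illustrative assumption rather than the template's actual code, and the input field name is a guess:

```javascript
// Illustrative sketch only - the template's actual "Extract HTML Elements"
// node may differ. Pulls headings and the meta description out of raw HTML
// with regular expressions, as an n8n Code node might.
const html = $input.first().json.data; // raw HTML from the previous node (assumed field name)

const headings = [];
for (const match of html.matchAll(/<(h[1-4])[^>]*>(.*?)<\/\1>/gis)) {
  headings.push({
    level: match[1].toLowerCase(),
    text: match[2].replace(/<[^>]+>/g, "").trim(), // strip nested tags
  });
}

const metaDescription =
  html.match(/<meta\s+name=["']description["']\s+content=["']([^"']*)["']/i)?.[1] ?? null;

return [{ json: { headings, metaDescription } }];
```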
by Airtop
Use Case

Turn any web page into a compelling LinkedIn post, complete with an AI-generated image. This automation is ideal for sharing content like blog posts, case studies, or product updates in a polished and engaging format.

What This Automation Does

Given a page URL and optional user instructions, this automation:

- Scrapes the content of the webpage
- Uses AI to write a clear, educational, LinkedIn-optimized post
- Generates a brand-aligned visual that matches the content
- Sends both to Slack for review and approval
- Handles feedback and revisions via Slack interactions

Input:

- **Page URL** - the link to the webpage (required)
- **Instructions** - optional notes on tone, emphasis, or format

Output:

- LinkedIn post text
- AI-generated visual prompt and image
- Slack message with review/approval options

How It Works

1. Form Submission: The user inputs a web page URL and optional instructions.
2. Web Scraping: Airtop extracts the page content.
3. Post Generation: An AI agent writes a post based on the page and instructions.
4. Visual Generation: Another AI model creates an image prompt, which is sent to a sub-workflow for image rendering.
5. Slack Review Flow:
   - The post and image are sent to Slack for feedback.
   - The user can approve, request revisions, or decline.
   - Revisions trigger the reprocessing steps automatically.
6. Final Post Delivery: The approved post and image are sent back to Slack, ready to publish.

Setup Requirements

- Airtop API key
- OpenAI credentials for post and image prompt generation
- Slack OAuth integration with a review channel
- A sub-workflow for branded image generation

Next Steps

- **Post Directly**: Add LinkedIn publishing to automate the full content workflow.
- **Template Variations**: Offer post style presets (e.g., technical, story-driven, short-form).
- **CRM Sync**: Save approved posts and stats in Airtable or Notion for team use.

Read more about content generation with Airtop.
by Zach @BrightWayAI
Who's it for

Content creators, researchers, educators, and digital marketers who need to discover high-quality YouTube training videos on specific topics. Perfect for building curated learning resource lists, competitive research, or content inspiration.

What it does

This workflow automatically searches YouTube using multiple search queries, filters for quality content, scores videos by relevance, and exports the top results to Google Sheets. It processes hundreds of videos and delivers only the most valuable educational content, ranked by custom relevance criteria.

The workflow searches for videos using 10 different AI automation-related queries (easily customizable), filters out low-quality content like shorts and clickbait, then ranks results based on title keywords, view counts, and engagement metrics.

How it works

1. Multi-query search: Searches YouTube with an array of related queries to get comprehensive coverage
2. Content filtering: Removes shorts, spam, and low-quality videos using regex patterns
3. Quality assessment: Filters videos based on view count, likes, and publication date
4. Relevance scoring: Assigns scores based on title keywords and engagement metrics
5. Result ranking: Sorts videos by relevance score and limits output to the top 50 results
6. Export to Sheets: Delivers clean, organized data to Google Sheets with all metadata

Requirements

- YouTube Data API v3 credentials from Google Cloud Console
- Google Sheets credentials for your n8n workspace
- A Google Sheets document to receive the results

How to set up

1. Enable the YouTube Data API v3 in your Google Cloud Console
2. Add YouTube OAuth2 credentials to your n8n workspace
3. Add Google Sheets credentials to your n8n workspace
4. Create a Google Sheet and update the Google Sheets node with your document ID
5. Customize the search queries in the "Set Query" node for your topic
6. Adjust the filtering criteria in the Filter nodes based on your quality requirements

How to customize the workflow

Search topics: Modify the query array in the "Set Query" node to research any topic:

```javascript
[
  "Python tutorial",
  "JavaScript course",
  "React beginner guide",
  // Add your queries here
]
```

- Quality thresholds: Adjust minimum views, likes, and date ranges in the "Filter for Quality" node
- Relevance scoring: Customize keyword weightings in the "Relevance Score" node to match your priorities (see the sketch below)
- Result limits: Change the number of final results in the "Limit" node (default: 50)
- Output format: Modify the "Set Fields" node to include additional YouTube metadata such as duration, thumbnails, or category information

The workflow is designed to be easily adaptable to any research topic while maintaining high content quality standards.
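As an illustration of the kind of logic the "Relevance Score" node applies, here is a sketch with assumed keyword weights and field names, not the template's exact code:

```javascript
// Hypothetical sketch of a relevance-scoring Code node. The keyword list,
// weights, and field names are assumptions for illustration; tune them
// to your topic and to the actual shape of your items.
const KEYWORD_WEIGHTS = { tutorial: 3, course: 3, "step by step": 2, beginner: 1 };

for (const item of $input.all()) {
  const title = (item.json.title || "").toLowerCase();
  let score = 0;

  // Title keyword component
  for (const [keyword, weight] of Object.entries(KEYWORD_WEIGHTS)) {
    if (title.includes(keyword)) score += weight;
  }

  // Engagement component: log-scaled views plus like ratio
  const views = Number(item.json.viewCount || 0);
  const likes = Number(item.json.likeCount || 0);
  score += Math.log10(views + 1);
  if (views > 0) score += (likes / views) * 10;

  item.json.relevanceScore = Number(score.toFixed(2));
}

return $input.all();
```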
by InfyOm Technologies
What problem does this workflow solve?

If you're using a self-hosted n8n instance, there's no built-in version history or undo for your workflows. If a workflow is accidentally modified or deleted, there's no way to roll back. This backup workflow solves that problem by automatically syncing your workflows to Google Drive, giving you version control and peace of mind.

What does this workflow do?

- Runs on a set schedule (e.g., daily or every 12 hours)
- Fetches all workflows from your self-hosted n8n instance
- Detects changes to avoid duplicate backups
- Creates a dedicated folder for each workflow in Google Drive
- Uploads new or updated workflow files in JSON format
- Keeps the backup history organized by date
- Allows for easy restores by importing backed-up JSON into n8n

Setup Instructions

1. Google Drive Setup

- Connect your Google Drive account using the Google Drive node in n8n.
- Choose or create a root folder (e.g., n8n-workflow-backups) where backups will be stored.

2. n8n API Credentials

Generate a Personal Access Token from your self-hosted n8n instance:

- Go to Settings → API in your n8n dashboard.
- Copy the token and use it in the HTTP Request node headers as: Authorization: Bearer <your_token>

3. Schedule the Workflow

Use the Cron node to schedule this workflow to run at your desired frequency (e.g., once a day or every 12 hours).

How it Works: Step-by-Step Flow

1. Scheduled Trigger: The workflow begins on a timed schedule using the Cron node.
2. Fetch All Workflows: Uses the n8n API (/workflows) to retrieve a list of all existing workflows.
3. Loop Through Workflows: For each workflow, a folder is created in Google Drive using the workflow name, and the workflow's last-updated timestamp is checked against the Google Drive backups.
4. Smart Change Detection: If the workflow has changed since the last backup, a new .json file is uploaded to the corresponding folder. The file is named with the workflow's last-updated date (YYYY-MM-DD-HH-mm-ss.json) to maintain a versioned history. If no change is detected, the workflow is skipped.

Google Drive Folder Organization

Backups are neatly organized by workflow and version:

```
/n8n-workflow-backups/
├── google-drive-backup-KqhdMBHIyAaE7p7v/
│   ├── 2025-07-15-13-03-32.json
│   └── 2025-07-14-03-08-12.json
└── resume-video-avatar-KqhdMBHIyAaE8p8vr/
    └── 2025-07-15-23-05-52.json
```

Each folder is named after the workflow's name + ID and contains timestamped versions.

Customization Options

- Change the backup frequency: Adjust the Cron node to run backups daily, weekly, or even hourly, based on your needs.
- Use a different storage provider: Swap Google Drive for Dropbox, S3, or another cloud provider with minimal changes.
- Add workflow filtering: Back up only workflows that are active or match specific tags by filtering the results from the n8n API.

How to Restore a Workflow from Backup

1. Go to the Google Drive backup folder for the workflow you want to restore.
2. Download the desired .json file (based on the date).
3. Open your self-hosted n8n instance.
4. Click Import Workflow from the sidebar menu.
5. Upload the JSON file to restore the workflow.

> You can choose to overwrite an existing workflow or import it as a new one.

Who can use this?

This template is ideal for:

- Developers running self-hosted n8n
- Teams managing large workflow libraries
- Anyone needing workflow versioning, rollback, or disaster recovery
- Productivity enthusiasts looking for automated backups

Tip: Consider enabling version history in Google Drive so you get even more fine-grained backup recovery options on top of what this workflow provides!

Ready to use? Just plug in your n8n token, connect Google Drive, and schedule your backups. Your workflows are now protected!
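For reference, here is a minimal sketch of the API call the "Fetch All Workflows" step performs, using the header setup described above; the base URL is a placeholder for your instance:

```javascript
// Minimal sketch of listing workflows via the n8n API, as the HTTP
// Request node does. The base URL is a placeholder.
const N8N_BASE_URL = "https://your-n8n-instance.com";

async function fetchAllWorkflows(token) {
  const res = await fetch(`${N8N_BASE_URL}/api/v1/workflows`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) throw new Error(`n8n API returned ${res.status}`);
  const { data } = await res.json();
  // Each entry includes id, name, and updatedAt - enough to build the
  // backup folder name and the timestamped file name described above.
  return data.map((wf) => ({
    id: wf.id,
    name: wf.name,
    updatedAt: wf.updatedAt,
  }));
}
```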
by Intuz
This n8n template delivers a complete AI-powered solution for automated LinkedIn posts, including unique content, custom images, and optimized hashtags.

Use cases are many: generate and schedule tailored LinkedIn content for different scenarios. By feeding the AI specific prompts, you can create targeted posts on chosen topics with matching visuals, maintaining a consistent online presence.

How it works

Maintaining a consistent and engaging presence on LinkedIn can be time-consuming, requiring constant ideation, content creation, and manual posting. This workflow takes that burden off your shoulders, delivering a fully automated solution for generating and publishing high-quality LinkedIn content.

1. Scheduled Content Engine: Each day (or on your chosen schedule), the workflow kicks into gear, ensuring a fresh stream of content.
2. Smart Topic & Content Generation: Using the power of Google Gemini, it intelligently crafts unique content topics and then expands them into full, engaging posts, ensuring your message is always fresh and relevant.
3. Dynamic Image Creation: To make your posts stand out, the workflow leverages an AI image generator (like DALL-E) to produce a custom, eye-catching visual that complements your generated text.
4. SEO-Optimized Hashtag Generation: Google Gemini then analyzes your newly created post and automatically generates a set of relevant, trending, and SEO-friendly hashtags, significantly boosting your content's reach and discoverability.
5. Seamless LinkedIn Publishing: Finally, all these elements - your compelling text, unique image, and powerful hashtags - are merged and automatically published to your LinkedIn profile, establishing you as a thought leader with minimal effort.

How to Use: Quick Start Guide

This guide will get your AI LinkedIn content automation workflow up and running in n8n.

1. Import the Workflow Template: Download the template's JSON file and import it into your n8n instance via "File" > "Import from JSON."
2. Configure Credentials:
   - Google Gemini: Set up and apply your API key credentials to all "Google Gemini Chat Model" nodes.
   - AI Image Generation (e.g., OpenAI): Create and apply API key credentials for your chosen image generation service to the "Generate an Image" node.
   - LinkedIn: Set up and apply OAuth credentials to the "Create a post" node for your LinkedIn account.
3. Customize the Schedule & AI Prompts:
   - Schedule Trigger: Double-click "Schedule Trigger 1" to set how often your workflow runs (e.g., daily, weekly).
   - AI Prompts: Review and edit the prompts within the "Content Topic Generator," "Content Creator," and "Hashtag Generator / SEO" nodes to guide the AI toward your desired content style and topics.
4. Test & Activate:
   - Test run: Click "Execute Workflow" to perform a test run and verify all steps work as expected.
   - Activate: Once satisfied, toggle the workflow's "Active" switch to enable automated posting on your defined schedule.

Requirements

To use this workflow template, you will need:

- n8n Instance: A running n8n instance (cloud or self-hosted) to import and execute the workflow.
- Google Gemini Account: For content topic generation, content creation, and hashtag generation (requires a Google Gemini API key from Google AI Studio).
- AI Image Generation Service Account: For creating images (e.g., an OpenAI DALL-E API key or a similar service used by the "Generate an Image" node).
- LinkedIn Account: For publishing the generated posts (requires LinkedIn OAuth credentials for the n8n connection).
Connect with us

- Website: https://www.intuz.com/cloud/stack/n8n
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz
by Jan Willem Altink
This workflow provides a secure API endpoint to remotely trigger other n8n workflows with custom data and to retrieve information about your existing workflows. It's perfect for users who want to integrate n8n into external systems or programmatically manage their automations.

Example usage: I use this workflow in a Raycast extension I have built, to execute n8n workflows from within Raycast (see GitHub).

How it works

- Receives API calls: A webhook listens for incoming HTTP requests (e.g., POST to trigger, GET to retrieve info).
- Triggers workflows: If the request is to trigger a workflow, it dynamically identifies the target workflow ID (from query parameters) and any input data (from the request body), then executes that workflow. This means you can control any of your workflows without modifying this manager template.
- Retrieves workflow info: Similarly, if the request is to get information, it dynamically uses query parameters (workflowId, mode, includedWorkflows) to fetch details about one or more n8n workflows (e.g., specific, all, active, or inactive; full or summarized data).
- Responds: Sends back a JSON response indicating success/failure or the requested workflow data.

Set it up

1. Configure webhook security: Set up "Header Auth" credentials for the main Webhook node. This is the API key your external services will use.
2. Add n8n API credentials: For the nodes that fetch workflow information (like "Get specific workflowid", "get all active workflows", etc.), connect your n8n API credentials. This allows the workflow to query your n8n instance.
3. Note your webhook URL: Once active, n8n provides a production URL for the webhook (path: workflow-manager). Use this URL to make API calls.
4. Understand the API parameters:
   - To trigger: Use ?workflowId=[ID_OF_WORKFLOW_TO_RUN] and send JSON data in the request body.
   - To get info: Use parameters like ?workflowId=[ID], ?includedWorkflows=[all/active/inactive], and ?mode=[full/summary].
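A minimal sketch of calling this endpoint from an external service follows; the base URL, header name, and key value are placeholders that must match your instance and your Header Auth credential:

```javascript
// Sketch of calling the workflow-manager webhook from outside n8n.
// The base URL, header name, and key value are placeholders.
const BASE = "https://your-n8n-instance.com/webhook/workflow-manager";
const HEADERS = { "x-api-key": "YOUR_HEADER_AUTH_KEY", "Content-Type": "application/json" };

async function demo() {
  // Trigger another workflow with custom input data
  await fetch(`${BASE}?workflowId=123`, {
    method: "POST",
    headers: HEADERS,
    body: JSON.stringify({ customer: "Acme", action: "sync" }),
  });

  // Retrieve a summary of all active workflows
  const res = await fetch(`${BASE}?includedWorkflows=active&mode=summary`, {
    method: "GET",
    headers: HEADERS,
  });
  console.log(await res.json());
}

demo();
```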
by Leonardo Grigorio
Want to see it in action? Watch the full breakdown here: Video Link

Template Description

This n8n workflow empowers you to query structured financial data from Google Sheets or CSV files using AI-generated SQL. Unlike traditional vector database solutions that falter with numerical queries, this template leverages PostgreSQL for efficient data storage and an AI agent to dynamically create optimized SQL queries from natural language inputs.

What It Does

- Retrieves data from Google Sheets or CSV files
- Infers the data schema and builds a PostgreSQL table
- Populates the table with your data
- Uses an AI agent to translate natural language questions into SQL queries
- Returns precise numerical results quickly and efficiently

Why Use This?

- No SQL knowledge required - the AI generates queries for you
- Bypasses the inefficiencies and costs of vector database approaches
- Scales effortlessly without overwhelming the language model
- Fully free and open-source

Setup Requirements

Pre-conditions:

- **PostgreSQL Database**: A running PostgreSQL instance (no extensions required beyond a standard installation).
- **Google Sheets Access**: A publicly accessible or shared Google Sheet URL with structured data (e.g., financial records). Need a starting point? Use this Sample Google Sheet Template.
- **n8n Instance**: A working n8n setup with access to the Google Drive and PostgreSQL nodes.

Step-by-Step Instructions

1. Add your Google Sheets URL:
   - Open the "Google Drive Trigger" node.
   - Replace the placeholder URL with your Google Sheet's link.
   - Verify that the sheet name matches your data source.
2. Configure PostgreSQL:
   - Update the "PostgreSQL" nodes with your database credentials (host, database, user, password).
   - The workflow automatically creates and populates the table based on your data schema.
3. Run the workflow:
   - Execute the workflow manually to set up the database.
   - Once initialized, use the AI agent by asking questions like "How much did I sell last week?" or "What were the total sales for Product X in February?"
4. (Optional) Automate updates: Add a "Schedule Trigger" node to sync your Google Sheets data with PostgreSQL on a regular basis.

How It Works

- **Schema Detection**: The workflow analyzes your Google Sheets or CSV data to infer its structure and create an appropriate PostgreSQL table.
- **AI-Powered Queries**: An optimized AI agent converts your natural language questions into precise SQL queries, ensuring accurate results.
- **Efficient Retrieval**: By using PostgreSQL instead of vector-based methods, this template avoids common pitfalls like slow performance and inaccurate numerical outputs.

Tips for Success

- Ensure your Google Sheet or CSV has consistent column headers for smooth schema detection.
- Test with simple questions first to verify the AI agent's query generation.
- Check out the n8n Template Submission Guidelines for more best practices.
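To illustrate the schema-detection step, here is a simplified sketch of how a Code node might derive a CREATE TABLE statement from the first sheet row; the sample row and type rules are illustrative assumptions, not the template's actual logic:

```javascript
// Simplified sketch of schema inference. The real template's logic may
// differ; the sample row and type rules here are illustrative only.
function inferType(value) {
  if (!isNaN(Number(value)) && value !== "") {
    return Number.isInteger(Number(value)) ? "INTEGER" : "NUMERIC";
  }
  if (!isNaN(Date.parse(value))) return "DATE";
  return "TEXT";
}

function buildCreateTable(tableName, sampleRow) {
  const columns = Object.entries(sampleRow)
    .map(([name, value]) => `"${name.toLowerCase().replace(/\W+/g, "_")}" ${inferType(value)}`)
    .join(", ");
  return `CREATE TABLE IF NOT EXISTS ${tableName} (${columns});`;
}

// Example: a row from a financial sheet
console.log(buildCreateTable("sales", { Date: "2025-02-01", Product: "X", Amount: "149.90" }));
// -> CREATE TABLE IF NOT EXISTS sales ("date" DATE, "product" TEXT, "amount" NUMERIC);
```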
by ARRE
Good to know: This workflow automatically transcribes your favorite podcasts or videos saved in a YouTube playlist and generates a comprehensive, AI-powered summary, so you can quickly understand the main topics and insights without having to watch or listen to the entire episode.

Who is this for?

- Podcast fans who want to save time and get the key points from episodes
- Busy professionals who follow educational or industry videos and need quick takeaways
- Content creators or researchers who organize and review large amounts of video/audio material
- Anyone who wants to efficiently capture and summarize information from YouTube playlists

What problem is this workflow solving?

This workflow solves the challenge of information overload from long-form podcasts and videos. It:

- Automatically transcribes each video or podcast episode in your chosen YouTube playlist
- Uses AI to create a clear, well-structured summary of the content
- Lets you learn and extract valuable information without watching or listening to the entire recording
- Organizes everything in a Google Sheets document for easy tracking and future reference

What this workflow does:

- Fetches all videos from a specified YouTube playlist
- Extracts video titles, URLs, and IDs
- Retrieves and combines transcripts for each video or podcast episode
- Processes transcript data for clarity
- Uses AI to generate a detailed, sectioned summary that covers all main topics and insights
- Automatically logs video titles, transcripts, summaries, and row numbers to a Google Sheets spreadsheet

How it works:

1. Trigger: Start the workflow manually or on a schedule
2. Fetch videos from your chosen YouTube playlist
3. Extract and organize video details (title, URL, ID)
4. Retrieve the transcript for each video or podcast episode
5. Combine the transcript segments into a single script
6. Extract the first sentences for focused summarization
7. An AI agent creates a comprehensive summary of the episode or video
8. Save all data - title, transcript, summary, and row number - to Google Sheets

How to use:

1. Set up YouTube OAuth2 credentials in n8n
2. Configure Google Sheets OAuth2 credentials
3. Set up API credentials for transcript and AI processing
4. Create and link your Google Sheets document
5. Input your playlist ID and adjust any filters as needed
6. Activate the workflow

Requirements:

- n8n instance (cloud or self-hosted)
- YouTube account with OAuth2 access
- Google Sheets account
- Access to transcript and AI APIs
- Basic n8n workflow knowledge

Customizing this workflow:

- Change the YouTube playlist ID to target your preferred podcasts or video series
- Adjust the transcript retrieval process for other APIs or formats
- Customize the AI prompt for different summary styles or focus areas
- Add or remove fields in the Google Sheets output
- Change the workflow trigger or polling frequency
- Switch to a different AI model if desired

This workflow is designed to help you quickly learn from the podcasts and videos you care about, without spending hours consuming the full content.
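As an illustration of the segment-combining step (step 5 above), here is a sketch of how a Code node might merge transcript chunks into one script; the segment shape and field names are assumptions, since transcript APIs vary:

```javascript
// Illustrative sketch of merging transcript segments into one script.
// The segment shape ({ text, offset }) is an assumption - transcript
// APIs vary in the field names they return.
const segments = $input.first().json.segments || [];

const script = segments
  .sort((a, b) => (a.offset ?? 0) - (b.offset ?? 0)) // keep chronological order
  .map((s) => (s.text || "").trim())
  .filter(Boolean)
  .join(" ")
  .replace(/\s+/g, " "); // collapse duplicated whitespace

return [{ json: { script, segmentCount: segments.length } }];
```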
by Gleb D
This n8n workflow template automates the process of collecting and analyzing Twitter (X) posts for any public profile, then generates a clean, AI-powered summary including key metrics, interests, and activity trends.

What It Does

- Accepts a user's full name and date range through a public form.
- Automatically finds the person's X (formerly Twitter) profile using a Google search.
- Uses Bright Data to retrieve full post data from the X.com profile.
- Extracts key post metrics like views, likes, reposts, hashtags, and mentions.
- Uses Google Gemini (PaLM) to generate a personalized summary: tone, themes, popularity, and sentiment.
- Stores both the raw data and the AI summary in a connected Google Sheet for further review or team collaboration.

Step-by-Step Setup

1. Deploy the public form to collect a full name and date range.
2. Build a Google search query using the name to find their X profile.
3. Scrape the search results via Bright Data (Web Unlocker zone).
4. Parse the page content using the HTML node.
5. Use Gemini AI to extract the correct X profile URL.
6. Pull full post data via the Bright Data dataset snapshot API.
7. Transform the post data into clean structured fields: date_posted, description, hashtags, likes, views, quoted_post.date_posted, quoted_post.description, replies, reposts, quotes, and tagged_users.profile_name.
8. Analyze all posts using Google Gemini for interest detection and persona generation.
9. Save the results to a Google Sheet: structured post data plus the AI-written summary.
10. Show success or fallback messages depending on profile detection or scraping status.

How It Works: Workflow Overview

- Trigger: When a user submits the form
- Search & Match: Google search → HTML parse → Gemini filters for the matching X profile
- Data Gathering: Bright Data → poll for snapshot completion → fetch post data
- Transformation: Extract and restructure the key fields via a Code node (see the sketch below)
- AI Summary: Use Gemini to analyze tone, interests, and trends
- Export: Save the results to a Google Sheet
- Fallback: Display a custom error message if no X profile is found

Final Output

A record in your Google Sheet with:

- Clean post-level data
- A profile-level engagement summary
- An AI-written overview including tone, common topics, and post popularity

Credentials Used

- **Bright Data account** (for search and post scraping)
- **Google Gemini (PaLM)**, or Gemini Flash via the OpenAI/Google Vertex API
- **Google Sheets (OAuth2) account** (for result storage)

Community Node Dependency

This workflow uses a custom community node: n8n-nodes-brightdata. Install it via the UI (Settings → Community Nodes → Install).
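A sketch of the field-restructuring Code node described in step 7 follows; the raw snapshot record shape is an assumption, while the target field names are the ones listed above:

```javascript
// Sketch of restructuring Bright Data snapshot records into the clean
// fields listed above. The raw record shape is an assumption.
const out = [];
for (const item of $input.all()) {
  const post = item.json;
  out.push({
    json: {
      date_posted: post.date_posted,
      description: post.description,
      hashtags: post.hashtags ?? [],
      likes: post.likes ?? 0,
      views: post.views ?? 0,
      "quoted_post.date_posted": post.quoted_post?.date_posted ?? null,
      "quoted_post.description": post.quoted_post?.description ?? null,
      replies: post.replies ?? 0,
      reposts: post.reposts ?? 0,
      quotes: post.quotes ?? 0,
      "tagged_users.profile_name": (post.tagged_users ?? [])
        .map((u) => u.profile_name)
        .join(", "),
    },
  });
}
return out;
```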
by Ranjan Dailata
Who this is for

The Real Estate Intelligence Tracker is a powerful automated workflow designed for real estate analysts, investors, proptech startups, and market researchers who need to collect and analyze structured data from real estate listings across the web at scale.

This workflow is tailored for:

- **Real Estate Analysts** - tracking property prices, locations, and market trends
- **Investment Firms** - sourcing high-opportunity listings for portfolio decisions
- **PropTech Developers** - automating listing insights for SaaS platforms
- **Market Researchers** - extracting insights from competitive housing data
- **Growth Teams** - monitoring geographic property trends and pricing fluctuations

What problem is this workflow solving?

Collecting structured real estate listing data from property websites is difficult due to bot protections and unstructured HTML content. Manual data collection is slow and error-prone, and traditional scrapers often get blocked or miss context. This workflow solves:

- Automated bypass of anti-bot protection using the Bright Data Web Unlocker
- Conversion of unstructured HTML content into clean text using a Markdown-to-text LLM pipeline
- Structured extraction of key listing data like price, location, property type, and features using OpenAI
- Aggregation and delivery of insights to Google Sheets, local storage, and webhook-based alerts

What this workflow does

- Convert to Text: Transforms scraped HTML/markdown into clean text using a Basic LLM Chain
- Structured Data Extraction: Uses OpenAI GPT-4o with the Information Extractor node to parse property attributes (price, address, area, type, etc.)
- Aggregate & Merge: Combines data from multiple pages or listings into a cohesive structure
- Outbound Data Handling:
  - **Google Sheets** - appends the structured real estate data for further analysis
  - **Save to Disk** - persists structured JSON/text data locally
  - **Webhook Notification** - sends data alerts or summaries to any third-party platform

Pre-conditions

- You need a Bright Data account and must complete the setup described in the "Setup" section below.
- You need an OpenAI account.

Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by your Web Unlocker token.
4. In n8n, configure the Google Sheets credentials with your own account. Follow this documentation: Set Google Sheet Credential.
5. In n8n, configure the OpenAI account credentials.
6. Ensure the URL and Bright Data zone name are correctly set in the Set URL, Filename and Bright Data Zone node.
7. Set the desired local path in the Write a file to disk node to save the responses.
How to customize this workflow to your needs

Target multiple sites or locations:

- Update the Bright Data URL node dynamically with a list of regional real estate websites
- Loop through different city/state filter URLs

Customize the extracted fields by modifying the Information Extractor prompt to extract fields such as:

- Property size, number of bedrooms/bathrooms
- Days on market
- Nearby amenities or schools
- Agent contact details

Integrate with more destinations:

- Add nodes to export data to Notion, Airtable, HubSpot, or your custom database
- Generate automated reports using PDF generators and email them

Data quality and logging:

- Add validation checks (e.g., missing price or address), as sketched below
- Save intermediate files (markdown, raw HTML, JSON output) to disk for audit purposes
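A minimal sketch of such a validation check in an n8n Code node follows; the field names (price, address) mirror the attributes listed above but are assumptions about the Information Extractor's output shape:

```javascript
// Hypothetical validation step for extracted listings. The field names
// (price, address) are assumptions about the extractor's output shape.
const out = [];
for (const item of $input.all()) {
  const { price, address } = item.json;
  const problems = [];

  if (price == null || Number(price) <= 0) problems.push("missing or invalid price");
  if (!address || String(address).trim() === "") problems.push("missing address");

  // Flag rather than drop, so rejected rows can be saved to disk or a
  // separate sheet for auditing, as suggested above.
  out.push({ json: { ...item.json, isValid: problems.length === 0, problems } });
}
return out;
```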