by Yaron Been
# AI-Powered YouTube Video Summary Distributor: From Channel to Community!

## Workflow Overview

This sophisticated n8n automation transforms YouTube content discovery into a seamless, multi-platform intelligence-sharing process. By intelligently connecting YouTube RSS, AI summarization, and content distribution platforms, the workflow:

**Discovers New Content**
- Monitors YouTube channels via RSS feed
- Captures the latest video uploads
- Tracks content in real time

**AI-Powered Summarization**
- Extracts video metadata
- Generates concise, meaningful summaries
- Leverages GPT-4o for intelligent content analysis

**Intelligent Distribution**
- Logs summaries in Google Sheets
- Sends summaries to Slack for review
- Publishes approved content to Reddit

## Detailed Setup Instructions

### 1. YouTube Data API Configuration

Prerequisites:
- Google Cloud Console account
- YouTube Data API v3 enabled

Setup steps:
1. Go to the Google Cloud Console
2. Create a new project
3. Enable YouTube Data API v3
4. Create credentials (API key)
5. Store the API key securely in n8n credentials
6. Obtain the channel RSS feed URL (the feed format is shown in the example below)

### 2. OpenAI API Setup

Prerequisites:
- OpenAI account
- Active API subscription

Configuration:
1. Visit the OpenAI Platform
2. Generate an API key
3. Select the GPT-4o model
4. Configure the API key in n8n credentials
5. Set up billing and usage limits

### 3. Slack Integration

Prerequisites:
- Slack workspace
- Slack app permissions

Setup process:
1. Create a Slack app in your workspace
2. Configure OAuth scopes for sending messages
3. Install the app to the workspace
4. Obtain a webhook or OAuth token
5. Configure it in the n8n Slack node

### 4. Reddit API Configuration

Prerequisites:
- Reddit account
- Reddit application created

Steps:
1. Go to Reddit's app preferences
2. Create a new application
3. Obtain the client ID and secret
4. Configure OAuth2 credentials in n8n
5. Select the target subreddit

## Workflow Customization

**Channel modification**
- Replace the YouTube RSS feed URL in the trigger node
- Adjust the channel_id parameter
- Modify the extraction logic if needed

**Subreddit customization**
- Change the subreddit parameter in the Reddit node
- Adjust title and text formatting

**AI summarization tuning**
- Modify the system message in the Summarizer Agent
- Adjust the prompt for different content types
- Implement custom filtering

## Key Customization Points

- Modify the RSS feed URL
- Change the target subreddit
- Adjust the AI summarization prompt
- Add custom filtering logic
- Implement multi-channel support

## Technical Requirements

- n8n v0.220.0 or higher
- YouTube Data API v3
- OpenAI API access
- Slack workspace
- Reddit application
- Stable internet connection

## Potential Use Cases

- Content creator content tracking
- Research and trend analysis
- Social media content distribution
- Automated content curation
- Community engagement

## Security Considerations

- Use environment variables for API keys
- Implement proper OAuth2 authentication
- Respect platform usage guidelines
- Maintain user privacy

## Future Enhancement Roadmap

- Multi-language support
- Advanced content filtering
- Sentiment analysis integration
- Expanded platform distribution
- Customizable summarization parameters

## Workflow Visualization

[YouTube RSS Trigger] → [Extract Channel ID] → [Fetch Video Details] → [AI Summarization] → [Google Sheets Logging] → [Slack Approval] → [Reddit Publishing]

## Hashtag Performance Boost

#YouTubeAutomation #AIContentDistribution #WorkflowInnovation #ContentCuration #AIMarketing #DigitalMediaTech #AutomatedSummaries #CrossPlatformContent

## Connect With Me

Ready to revolutionize your content workflow?

- Email: Yaron@nofluff.online
- YouTube: @YaronBeen
- LinkedIn: Yaron Been

Transform your content strategy with intelligent, automated workflows!
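### Example: channel RSS feed format

As a quick check before wiring the feed into the RSS trigger node (step 6 of the YouTube setup above), the sketch below fetches a channel's public RSS feed and lists the latest video IDs. It is a minimal sketch, not part of the workflow itself: the channel ID is a placeholder, the regex parsing is for illustration only, and it assumes a Node 18+ ES module or Deno runtime.

```typescript
// Minimal sketch: verify a channel's public RSS feed before wiring it into the RSS trigger node.
// CHANNEL_ID is a placeholder; replace it with the "UC..." id of the channel you want to monitor.
const CHANNEL_ID = "UC_x5XG1OV2P6uZZ5FSM9Ttw"; // example id from the YouTube API docs
const feedUrl = `https://www.youtube.com/feeds/videos.xml?channel_id=${CHANNEL_ID}`;

const res = await fetch(feedUrl);
const xml = await res.text();

// Each <entry> carries one upload; yt:videoId and title are what the summarizer nodes consume.
const videoIds = [...xml.matchAll(/<yt:videoId>([^<]+)<\/yt:videoId>/g)].map((m) => m[1]);
console.log(`Latest uploads for ${CHANNEL_ID}:`, videoIds.slice(0, 5));
```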
Note: Always test and customize the workflow to fit your specific use case and comply with platform guidelines.
by Yaron Been
This cutting-edge n8n automation is a powerful digital marketing tool designed to streamline the process of transforming Google Drive videos into Facebook advertising assets. By intelligently connecting cloud storage, video upload, and ad creation platforms, this workflow:

**Discovers Marketing Content**
- Automatically scans Google Drive
- Identifies video marketing materials
- Eliminates manual content searching

**Seamless Video Distribution**
- Downloads selected video files
- Uploads them directly to Facebook
- Prepares videos for advertising

**Instant Ad Creative Generation**
- Creates Facebook ad creatives
- Leverages the uploaded video content
- Accelerates marketing campaign setup

**Automated Platform Integration**
- Connects Google Drive and Facebook
- Reduces manual intervention
- Speeds up content deployment

## Key Benefits

- **Full Automation**: Zero-touch video marketing
- **Smart Content Management**: Effortless video distribution
- **Rapid Campaign Setup**: Quick ad creative generation
- **Multi-Platform Synchronization**: Seamless content flow

## Workflow Architecture

**Stage 1: Content Discovery**
- **Manual Trigger**: Workflow initiation
- **Google Drive Integration**: Video file scanning
- **Intelligent File Selection**: Identifies MP4 video files and prepares them for marketing use

**Stage 2: Video Preparation**
- Automatic download
- File validation
- Marketing-ready formatting

**Stage 3: Facebook Upload** (see the upload sketch below)
- Direct video upload
- Ad account integration
- Seamless platform transfer

**Stage 4: Ad Creative Generation**
- Automated creative setup
- Video-based ad creation
- Instant marketing asset preparation

## Potential Use Cases

- **Digital marketing teams**: Rapid content deployment
- **Social media managers**: Streamlined ad creation
- **Content creators**: Efficient video marketing
- **Small business owners**: Simplified advertising workflow
- **Marketing agencies**: Scalable content distribution

## Setup Requirements

**Google Drive**
- Connected Google account
- Configured video folder
- Appropriate sharing settings

**Facebook Ads**
- Ad account credentials
- Page ID configuration
- API access token

**n8n Installation**
- Cloud or self-hosted instance
- Workflow configuration
- API credential management

## Future Enhancement Suggestions

- AI-powered video selection
- Performance tracking integration
- Campaign launch notifications
- Multi-platform ad deployment
- Intelligent content routing

## Technical Considerations

- Implement robust error handling
- Use secure API authentication
- Maintain flexible file processing
- Ensure compliance with platform guidelines

## Ethical Guidelines

- Respect copyright and usage rights
- Maintain transparent marketing practices
- Ensure appropriate content selection
- Provide clear advertising disclosures

## Hashtag Performance Boost

#MarketingAutomation #VideoAdvertising #FacebookAds #DigitalMarketing #ContentMarketing #AIMarketing #WorkflowAutomation #SocialMediaStrategy #AdTech #MarketingInnovation

## Workflow Visualization

[Manual Trigger] → [List Drive Videos] → [Download Video] → [Upload to Facebook] → [Create Ad Creative]

## Connect With Me

Ready to revolutionize your digital marketing?

- Email: Yaron@nofluff.online
- YouTube: @YaronBeen
- LinkedIn: Yaron Been

Transform your marketing workflow with intelligent, automated solutions!
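### Example: Facebook video upload step (sketch)

For orientation, here is a hedged sketch of what the "Upload to Facebook" stage does outside n8n, assuming the Graph API advideos edge accepts a publicly reachable file_url. The ad account ID, access token, and video URL are placeholders; check the current Marketing API documentation for the exact parameters before relying on this shape.

```typescript
// Hedged sketch of the "Upload to Facebook" step, assuming the Graph API advideos edge
// and a publicly reachable file_url. AD_ACCOUNT_ID, the access token, and fileUrl are
// placeholders; verify parameters against the current Marketing API docs.
const AD_ACCOUNT_ID = "1234567890";
const ACCESS_TOKEN = process.env.FB_ACCESS_TOKEN ?? "";
const fileUrl = "https://example.com/path/to/video.mp4"; // hypothetical direct download link

const res = await fetch(`https://graph.facebook.com/v19.0/act_${AD_ACCOUNT_ID}/advideos`, {
  method: "POST",
  body: new URLSearchParams({ file_url: fileUrl, access_token: ACCESS_TOKEN }),
});
console.log(await res.json()); // expect an object with the new video id on success
```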
by Yaron Been
# Automated Startup Intelligence: CrunchBase Updates to Email Digest Workflow!

## Workflow Overview

This cutting-edge n8n automation is a sophisticated startup intelligence tool designed to transform market research into actionable insights. By intelligently connecting CrunchBase, AI processing, and Gmail, this workflow:

**Discovers Startup Updates**
- Automatically retrieves the latest company information
- Tracks recent organizational changes
- Eliminates manual market research efforts

**Intelligent Data Processing**
- Filters and extracts key company details
- Generates AI-powered summaries
- Ensures comprehensive market intelligence

**Smart Summarization**
- Uses AI to create readable company updates
- Transforms complex data into digestible insights
- Provides professional, context-rich summaries

**Seamless Email Distribution**
- Automatically sends daily update digests
- Delivers insights directly to your inbox
- Enables rapid market awareness

## Key Benefits

- **Full Automation**: Zero-touch startup research
- **Smart Filtering**: Targeted company insights
- **Comprehensive Tracking**: Detailed market intelligence
- **Multi-Source Synchronization**: Seamless data flow

## Workflow Architecture

**Stage 1: Company Discovery**
- **Manual/Scheduled Trigger**: Market scanning
- **CrunchBase API Integration**
- **Intelligent Filtering**: Recent updates, specific time frames, key organizational information

**Stage 2: Data Extraction** (a digest-building sketch follows below)
- Comprehensive metadata parsing
- Key information retrieval
- Structured data preparation

**Stage 3: AI Summarization**
- OpenAI GPT processing
- Professional summary generation
- Contextual insight creation

**Stage 4: Email Distribution**
- Gmail integration
- Automated update digest
- Personalized delivery

## Potential Use Cases

- **Venture capitalists**: Startup ecosystem tracking
- **Market researchers**: Industry trend analysis
- **Startup founders**: Competitive intelligence
- **Business strategists**: Market opportunity identification
- **Investors**: Real-time company insights

## Setup Requirements

**CrunchBase API**
- API credentials
- Configured access permissions
- Company update tracking setup

**OpenAI API**
- GPT model access
- Summarization configuration
- API key management

**Gmail Account**
- Connected email
- Digest email configuration
- Appropriate sending permissions

**n8n Installation**
- Cloud or self-hosted instance
- Workflow configuration
- API credential management

## Future Enhancement Suggestions

- Advanced company trend analysis
- Multi-source intelligence gathering
- Customizable alert mechanisms
- Expanded industry tracking
- Machine learning insights generation

## Technical Considerations

- Implement robust error handling
- Use secure API authentication
- Maintain flexible data processing
- Ensure compliance with API usage guidelines

## Ethical Guidelines

- Respect business privacy
- Use data for legitimate research
- Maintain transparent information gathering
- Provide proper attribution

## Hashtag Performance Boost

#StartupIntelligence #MarketResearch #AIWorkflow #CompanyUpdates #BusinessIntelligence #TechInnovation #DataAutomation #StartupEcosystem #InvestorInsights #TrendTracking

## Workflow Visualization

[Manual/Scheduled Trigger] → [Fetch Crunchbase Updates] → [Extract Company Details] → [AI Summarization] → [Send Email Digest]

## Connect With Me

Ready to revolutionize your startup intelligence?

- Email: Yaron@nofluff.online
- YouTube: @YaronBeen
- LinkedIn: Yaron Been

Transform your market research with intelligent, automated workflows!
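### Example: building the daily digest (sketch)

The filtering and formatting that happens between the Crunchbase fetch and the Gmail node can be pictured like this. The record shape below is a hypothetical simplification for illustration, not the actual Crunchbase response schema; adapt the field names to whatever your extraction step produces.

```typescript
// Hypothetical record shape after the "Extract Company Details" step; the field names
// are illustrative assumptions, not the actual Crunchbase schema.
interface CompanyUpdate {
  name: string;
  description: string;
  updatedAt: string; // ISO timestamp
  url: string;
}

// Keep only records touched in the last 24 hours and format them for the email digest body.
function buildDigest(records: CompanyUpdate[], now: Date = new Date()): string {
  const dayAgo = now.getTime() - 24 * 60 * 60 * 1000;
  return records
    .filter((r) => new Date(r.updatedAt).getTime() >= dayAgo)
    .map((r) => `- ${r.name}: ${r.description} (${r.url})`)
    .join("\n");
}

console.log(
  buildDigest([
    { name: "ExampleAI", description: "Announced a new funding round", updatedAt: new Date().toISOString(), url: "https://example.com" },
  ]),
);
```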
by Yaron Been
## Workflow Overview

This advanced n8n automation is a sophisticated content intelligence tool that transforms YouTube video discovery into a seamless, multi-platform content distribution system. By leveraging RSS, AI, and multiple communication platforms, this workflow:

**Discovers New Content**
- Monitors YouTube channels via RSS feed
- Captures new video uploads automatically
- Extracts critical video metadata

**Generates Intelligent Summaries**
- Leverages OpenAI's GPT models to analyze video descriptions
- Creates concise, engaging video summaries
- Ensures high-quality, contextually accurate content

**Collaborative Approval Process**
- Sends summaries to Slack for human review
- Allows team members to approve or reject content
- Maintains rigorous quality control

**Multi-Platform Distribution**
- Logs summaries in Google Sheets for internal tracking
- Posts approved summaries to Discord
- Extends content reach with minimal manual effort

## Key Benefits

- **Full Automation**: From video upload to Discord post
- **Smart Summarization**: AI-powered content distillation
- **Human Oversight**: Slack approval ensures quality
- **Comprehensive Tracking**: Google Sheets documentation
- **Multi-Platform Sharing**: Seamless content distribution

## Workflow Architecture

**Stage 1: Content Discovery**
- **RSS Trigger**: Monitors the YouTube channel for new videos
- **Metadata Extraction**: Parses video URLs and IDs (see the extraction sketch below)
- **YouTube API Integration**: Retrieves detailed video information

**Stage 2: AI-Powered Summarization**
- **GPT Model**: Generates concise, relevant summaries
- **Contextual Understanding**: Analyzes video descriptions
- **Adaptive Summarization**: Handles various content types

**Stage 3: Collaborative Approval**
- **Slack Notification**: Sends the summary for human review
- **Interactive Approval**: Team can approve or reject content
- **Quality Control Mechanism**: Prevents inappropriate or low-quality posts

**Stage 4: Multi-Platform Distribution**
- **Google Sheets Logging**: Maintains a comprehensive content archive
- **Discord Posting**: Shares approved summaries with a wider audience

## Potential Use Cases

- Content creators tracking channel performance
- Marketing teams automating content distribution
- Social media managers expanding online presence
- Community managers engaging across platforms
- Researchers monitoring specific YouTube channels

## Setup Requirements

**YouTube Data API Credentials**
- Google Cloud API key
- Channel RSS feed URL

**OpenAI API Access**
- OpenAI account
- API key for the GPT model
- Preferred GPT model (GPT-4o, GPT-3.5)

**Slack Workspace**
- Slack app with appropriate permissions
- Designated approval channel

**Discord Server**
- Discord application credentials
- Target channel for posting summaries

**n8n Installation**
- n8n platform (cloud or self-hosted)
- Import the workflow configuration
- Configure API credentials

## Future Enhancements

- Multi-channel support
- Advanced filtering mechanisms
- Sentiment analysis integration
- Expanded platform distribution
- Customizable summarization parameters

## Technical Considerations

- Implement robust error handling
- Use exponential backoff for API calls
- Ensure secure credential management
- Maintain flexible parsing strategies

## Ethical Guidelines

- Respect content creators' intellectual property
- Provide proper attribution
- Ensure summaries add value
- Maintain transparency in content distribution

## Connect With Me

Want to revolutionize your content workflow?

- Email: Yaron@nofluff.online
- YouTube: @YaronBeen
- LinkedIn: Yaron Been

Transform your content strategy with intelligent, automated workflows!
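### Example: extracting the video ID (sketch)

A minimal sketch of the metadata-extraction stage: pulling the video ID out of the link the RSS trigger provides, so the YouTube API node can fetch full details. The two URL shapes shown are the common watch and short-link formats; adjust the parsing if your feed emits something different.

```typescript
// Sketch of the metadata-extraction stage: pull the video id out of the link field
// emitted by the RSS trigger so the YouTube API node can fetch full details.
function extractVideoId(link: string): string | null {
  const url = new URL(link);
  return url.searchParams.get("v") ?? url.pathname.split("/").pop() ?? null;
}

console.log(extractVideoId("https://www.youtube.com/watch?v=dQw4w9WgXcQ")); // "dQw4w9WgXcQ"
console.log(extractVideoId("https://youtu.be/dQw4w9WgXcQ"));                 // "dQw4w9WgXcQ"
```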
by Yaron Been
## Workflow Overview

This cutting-edge n8n workflow is a powerful automation tool designed to revolutionize how content creators and marketers engage with YouTube channels. By leveraging AI and the YouTube Data API, this workflow automatically:

**Discovers New Content**
- Monitors a specific YouTube channel
- Retrieves the latest video in real time
- Checks for new uploads at regular intervals

**Generates Intelligent Comments**
- Uses advanced AI (OpenAI's GPT models) to analyze video metadata
- Crafts contextually relevant, human-like comments
- Ensures each comment feels organic and engaging

**Seamless Deployment**
- Automatically posts the AI-generated comment directly on the video
- Eliminates manual interaction
- Increases potential channel visibility and engagement

## Key Benefits

- **Full Automation**: No manual comment writing required
- **Smart Contextual Comments**: AI understands video content
- **Time-Saving**: Instant engagement without human intervention
- **Potential Increased Visibility**: Regular, intelligent interactions

## Setup Requirements

**YouTube Data API Credentials**
- Obtain a Google Cloud API key
- Configure the channel ID you want to target
- Set up OAuth2 authentication for comment posting

**OpenAI API Access**
- Create an OpenAI account
- Generate an API key for comment generation
- Select your preferred GPT model (GPT-4o, GPT-3.5, etc.)

**n8n Installation**
- Install n8n (cloud or self-hosted)
- Import the workflow configuration
- Configure API credentials
- Set up scheduling preferences

## Potential Use Cases

- Content creators monitoring competitor channels
- Marketing teams maintaining an online presence
- Social media managers automating engagement
- Researchers tracking specific YouTube channels

## Future Enhancements

- Logging comment history
- Dynamic OAuth2 token management
- Multi-channel support
- Sentiment analysis for comment generation

## Connect With Me

Got questions? Want to dive deeper?

- Email: Yaron@nofluff.online
- YouTube: @YaronBeen
- LinkedIn: Yaron Been

**Unlock the power of AI-driven YouTube engagement: automate, optimize, and amplify your online presence!**

# Automate YouTube Engagement with GPT-4o Generated Comments

## Workflow Overview

This n8n automation leverages AI to streamline YouTube channel engagement, providing a sophisticated solution for content interaction. By combining the YouTube Data API and OpenAI's GPT-4o, the workflow:

**Intelligent Content Discovery**
- Dynamically monitors specified YouTube channels
- Detects new video uploads in real time
- Offers configurable monitoring intervals

**AI-Powered Comment Generation**
- Utilizes GPT-4o for contextual analysis
- Generates nuanced, platform-appropriate comments
- Ensures authentic, relevant interactions

**Automated Engagement**
- Seamlessly posts AI-crafted comments
- Enhances channel visibility
- Reduces manual social media management

## Key Benefits

- **Advanced Automation**: AI-driven engagement
- **Contextual Intelligence**: GPT-4o powered insights
- **Efficiency Optimization**: Instant, scalable interactions
- **Strategic Visibility**: Consistent, meaningful channel presence

## Detailed Setup Instructions

### Prerequisites

- n8n instance (cloud or self-hosted)
- YouTube Data API access
- OpenAI API key
- Target YouTube channel(s)

### Configuration Steps

**YouTube Data API Setup**
1. Create a Google Cloud project
2. Enable YouTube Data API v3
3. Generate OAuth2 credentials
4. Store credentials securely in n8n

**OpenAI API Configuration**
1. Create an OpenAI account
2. Generate an API key
3. Select the GPT-4o model
4. Configure the API key in n8n credentials

**Workflow Customization**
1. Replace placeholder channel IDs
2. Adjust the monitoring frequency
3. Customize the AI prompt for comment generation
4. Configure OAuth2 authentication

## Workflow Customization Options

- Modify the AI prompt to match specific content styles
- Add keyword filters for video selection
- Implement multi-channel support
- Create custom engagement rules

## Potential Use Cases

- Content creator audience engagement
- Brand social media management
- Community interaction automation
- Research and monitoring

## Ethical Considerations

- Maintain transparency about AI-generated comments
- Respect platform guidelines
- Avoid spam or misleading interactions
- Ensure comments add genuine value

## Future Enhancement Roadmap

- Advanced sentiment analysis
- Multi-language support
- Engagement performance tracking
- Adaptive comment generation

## Security Best Practices

- Never hardcode API keys
- Use n8n's credential management
- Implement secure OAuth2 authentication
- Regularly rotate API credentials

## Technical Requirements

- n8n v0.220.0 or higher
- YouTube Data API v3
- OpenAI API access
- Stable internet connection

## Workflow Architecture

[YouTube Channel Trigger] → [Fetch Latest Video] → [AI Comment Generation] → [Post Comment] (see the posting sketch below)

#YouTubeAutomation #AIEngagement #ContentMarketing #SocialMediaTech #GPT4Automation #WorkflowInnovation #AIComments #DigitalMarketing

## Connect With Me

Exploring AI-powered social media automation?

- Email: Yaron@nofluff.online
- YouTube: @YaronBeen
- LinkedIn: Yaron Been

Transform your YouTube engagement with intelligent, responsible automation!

Note: This workflow template is a starting point. Always customize and test thoroughly in your specific environment.
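### Example: posting the comment (sketch)

For orientation, here is a hedged sketch of the final "Post Comment" step as a direct call to the YouTube Data API commentThreads.insert endpoint. The OAuth token must carry the youtube.force-ssl scope; the video ID and comment text are placeholders that would normally come from the earlier nodes.

```typescript
// Hedged sketch of the "Post Comment" step via the YouTube Data API commentThreads.insert
// endpoint. OAUTH_TOKEN needs the youtube.force-ssl scope; videoId and commentText are
// placeholders normally supplied by the preceding workflow nodes.
const OAUTH_TOKEN = process.env.YT_OAUTH_TOKEN ?? "";
const videoId = "dQw4w9WgXcQ";
const commentText = "Great breakdown of the topic, thanks for sharing!";

const res = await fetch("https://www.googleapis.com/youtube/v3/commentThreads?part=snippet", {
  method: "POST",
  headers: { Authorization: `Bearer ${OAUTH_TOKEN}`, "Content-Type": "application/json" },
  body: JSON.stringify({
    snippet: { videoId, topLevelComment: { snippet: { textOriginal: commentText } } },
  }),
});
console.log(await res.json()); // the created comment thread resource, or an error payload
```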
by Manu
In Grist, when I mark a row as confirmed (via a toggle), a webhook notifies n8n, and this workflow creates derived records in the destination table.

## Design decisions

**Confirmation-based**: The source table has a boolean column "Confirmed" that triggers the transfer. This way there is a manual check involved, and triggering the workflow is a conscious step.

**Runs once**: If the destination table already contains an entry, we will not re-create or update it, as it might already have been changed manually (sketched below).

## Setup

1. Create a boolean column Confirmed in the source table
2. Add a webhook in the Grist settings
3. Add Grist API credentials in n8n
4. Set the document ID and source table ID/name in the 'get existing' node
5. Set the docID, the destination table ID/name, and the columns and values you want in the Create Row node
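A hedged sketch of the "runs once" guard described above, using the Grist REST API to check the destination table before creating a derived row. The document ID, table name, and the SourceRef column are placeholders for your own setup; verify the filter syntax against the Grist API documentation for your version.

```typescript
// Hedged sketch of the "runs once" guard: query the destination table and only create a
// derived row when nothing references the confirmed source row yet. DOC_ID, TABLE_ID and
// the SourceRef column are placeholders.
const GRIST_URL = "https://docs.getgrist.com";
const DOC_ID = "yourDocId";
const TABLE_ID = "Destination";
const API_KEY = process.env.GRIST_API_KEY ?? "";
const sourceRowId = 42; // row id delivered by the Grist webhook payload

const headers = { Authorization: `Bearer ${API_KEY}`, "Content-Type": "application/json" };
const filter = encodeURIComponent(JSON.stringify({ SourceRef: [sourceRowId] }));

const existing = await fetch(
  `${GRIST_URL}/api/docs/${DOC_ID}/tables/${TABLE_ID}/records?filter=${filter}`,
  { headers },
).then((r) => r.json());

if (!existing.records?.length) {
  await fetch(`${GRIST_URL}/api/docs/${DOC_ID}/tables/${TABLE_ID}/records`, {
    method: "POST",
    headers,
    body: JSON.stringify({ records: [{ fields: { SourceRef: sourceRowId, Status: "Derived" } }] }),
  });
}
```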
by ist00dent
This n8n template empowers you to instantly summarize long pieces of text by sending a simple webhook request. By integrating with ApyHub's summarization API, you can distil complex articles, reports, or messages into concise summaries, significantly boosting efficiency across various domains.

## How it works

- **Receive Content Webhook**: This node acts as the entry point, listening for incoming POST requests. It expects a JSON body containing:
  - content: the long text you want to summarize.
  - summary_length (optional): the desired length of the summary (e.g., 'short', 'medium', 'long'). Defaults to 'medium'.
  - It also expects a header containing your apy-token for the ApyHub API.
- **Start Summarization Job**: This node sends a POST request to ApyHub's summarization endpoint (api.apyhub.com/sharpapi/api/v1/content/summarize). It passes the content and summary_length from the webhook body, along with your apy-token from the headers. ApyHub processes the text asynchronously, and this node immediately returns a job_id.
- **Get Summarization Result**: Since ApyHub's summarization is an asynchronous process, this node is crucial. It polls ApyHub's job status endpoint (api.apyhub.com/sharpapi/api/v1/content/summarize/job/status/{{job_id}}) using the job_id obtained from the previous step. It continues to check the status until the summarization is finished, at which point it retrieves the final summarized text.
- **Respond with Summarized Content**: This node sends the final, distilled summarized text back to the service that initiated the webhook.

## Who is it for?

This workflow is extremely useful for:

- **Content Creators & Marketers**: Quickly summarize articles for social media snippets, email newsletters, or blog post intros.
- **Researchers & Students**: Efficiently get the gist of academic papers, reports, or long documents without reading every word.
- **Customer Support & Sales Teams**: Summarize customer inquiries, long email chains, or call transcripts to quickly understand key issues or discussion points.
- **News Aggregators & Media Monitoring**: Automatically generate summaries of news articles from various sources for quick consumption.
- **Business Professionals**: Condense lengthy reports, meeting minutes, or project updates into digestible summaries for busy stakeholders.
- **Legal & Compliance**: Summarize legal documents or regulatory texts to highlight critical clauses or changes.
- **Anyone Dealing with Information Overload**: Save time and extract key information from overwhelming amounts of text.

## Data Structure

When you trigger the webhook, send a **POST** request with a **JSON** body and an apy-token in the headers:

    {
      "content": "Your very long text goes here. This could be an article, a report, a transcript, or any other textual content you want to summarize. The longer the text, the more valuable summarization becomes!",
      "summary_length": "medium"
    }

summary_length is optional and can be "short", "medium", or "long".

Headers:

    apy-token: YOUR_APYHUB_API_KEY

Note: You'll need to obtain an API key from ApyHub to use their API services. They typically offer a free tier for testing.

The workflow will return a JSON response similar to this (the summary content will vary based on input):

    {
      "summary": "Max Verstappen believes the Las Vegas Grand Prix is '99% show and 1% sporting event', not looking forward to the razzmatazz. Other drivers, like Fernando Alonso, were more equivocal about the hype, acknowledging the investment and spectacle. Lewis Hamilton praised the city's energy but emphasized it's 'a business, ultimately', believing there will still be good racing.",
      "status": "finished",
      "result_file_id": "..."
    }

result_file_id: ApyHub might provide a file ID for larger results.

## Setup Instructions

1. **Get an ApyHub API Key**: Go to https://apyhub.com/ and sign up to get your API key.
2. **Import Workflow**: In your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
3. **Configure Webhook Path**: Double-click the Receive Content Webhook node. In the 'Path' field, set a unique and descriptive path (e.g., /summarize-content).
4. **Activate Workflow**: Save and activate the workflow (an example call is shown below).

## Tips

This content summarizer is a powerful component. Here's how to supercharge it and make it an indispensable part of your automation arsenal:

- **Integrate with document/file storage**:
  - Google Drive/Dropbox/OneDrive: Automatically summarize documents uploaded to these services. Add a Watch New Files trigger (if available for your service) or a Cron node to regularly check for new files. Then read the file content, pass it to this summarizer, and save the summary back to a designated folder or as a comment on the original file.
  - CRM/CMS systems: Pull long notes, customer interactions, or article drafts from your CRM/CMS, summarize them, and update the records with the concise version.
- **Email processing & triage**: Use an Email node to trigger the workflow when new emails arrive. Extract the email body, summarize it, and then:
  - Send a shortened summary as a notification to Slack or Telegram.
  - Add the summary to a task management tool (e.g., Trello, Asana) for quicker triaging.
  - Create a summary for an email digest.
- **Slack/Discord bot integration**: Create a Slack/Discord command (using a custom webhook or a dedicated Slack/Discord node) where users can paste long text. The bot then sends the summarized version back to the channel.
- **Dynamic summary length & options**: Allow the user to specify summary_length (short, medium, long) in the webhook body, as already implemented. Explore ApyHub's documentation for more parameters (if any) and pass them dynamically.
- **Error handling & user feedback**: Add an IF node after Get Summarization Result to check for status: 'failed' or error messages. If an error occurs, send a helpful message back to the webhook caller or an internal alert. For very long texts that might exceed API limits, add a Function node to truncate the input content if it's too long, and notify the user.
- **Multi-language support (if ApyHub offers it)**: If ApyHub supports summarization in multiple languages, extend the webhook to accept a language parameter and pass it to the API.
- **Web scraping & article summaries**: Combine this with an HTTP Request node to scrape content from a web page (e.g., a news article). Then pass the extracted article text to this summarizer to get quick insights.
- **Data storage & archiving**: Store the original content alongside its summary in a database (e.g., PostgreSQL, MongoDB) or a simple spreadsheet (Google Sheets, Airtable). This creates a searchable, summarized archive of your content.
- **Automated report generation**: If you receive daily/weekly reports, use this workflow to summarize key sections, then compile these summaries into a concise digest or dashboard using a Merge node and send it out automatically.
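### Example call to the webhook

Once the workflow is active, a call to it can look like the sketch below. The host and webhook path are placeholders (the path matches the /summarize-content example from the setup instructions), and the ApyHub key is read from an environment variable and forwarded in the apy-token header, exactly as the workflow expects.

```typescript
// Example call to the activated workflow; host and path are placeholders for your instance.
const res = await fetch("https://your-n8n-instance.example.com/webhook/summarize-content", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "apy-token": process.env.APYHUB_API_KEY ?? "",
  },
  body: JSON.stringify({
    content: "Your very long text goes here...",
    summary_length: "short", // optional: "short", "medium", or "long"
  }),
});
console.log(await res.json()); // e.g. { summary: "...", status: "finished", ... }
```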
by Ranjan Dailata
## Who is this for?

Google SERP Tracker + Trends and Recommendations is an AI-powered n8n workflow that extracts Google search results via Bright Data, parses them into structured JSON using Google Gemini, and generates actionable recommendations and search trends. It outputs CSV reports and sends real-time webhook notifications.

This workflow is ideal for:

- **SEO agencies** needing automated rank and trend tracking
- **Growth marketers** seeking daily/weekly search-based insights
- **Product teams** monitoring brand or competitor visibility
- **Market researchers** performing search behavior analysis
- **No-code builders** automating search intelligence workflows

## What problem is this workflow solving?

Traditional tracking of search engine rankings and search trends is often fragmented and manual. Analyzing SERP changes and trends requires:

- Manual extraction or unstable scrapers
- Dealing with unstructured or cluttered HTML data
- Working without actionable insights or recommendations

This workflow solves the problem by:

- Automating real-time Google SERP data extraction using Bright Data
- Structuring unstructured search data using the Google Gemini LLM
- Generating actionable recommendations and trends
- Exporting CSV reports automatically to disk for downstream use
- Notifying external systems via webhook

## What this workflow does

1. Accepts the search input, zone name, and webhook notification URL
2. Uses Bright Data to extract Google search results (a request sketch follows below)
3. Uses the Google Gemini LLM to parse the SERP data into structured JSON
4. Loops over the structured results to extract recommendations and trends
5. Saves both as .csv files, for example:
   - Google_SERP_Recommendations_Response_2025-06-10T23-01-50-650Z.csv
   - Google_SERP_Trends_Response_2025-06-10T23-01-38-915Z.csv
6. Sends a webhook with the summary or file reference

## LLM Usage

Google Gemini handles:

- Parsing Google Search HTML into structured JSON
- Summarizing recommendation data
- Deriving trends from the extracted SERP metadata

## Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure a Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to "Bearer XXXXXXXXXXXXXX", replacing XXXXXXXXXXXXXX with your Web Unlocker token.
4. Obtain a Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set input fields with the search criteria, Bright Data zone name, and webhook notification URL.

## How to customize this workflow to your needs

**Input customization**
- Set your target keyword/phrase in the search field
- Add your webhook_notification_url for external triggers or notifications

**SERP source**
- Extend the Bright Data search logic to include other engines such as Bing or DuckDuckGo

**Output format**
- Edit the .csv structure in the Convert to File nodes if you want to include or exclude specific columns

**LLM prompt tuning**
- The Gemini LLM prompt inside the Recommendation or Trends extractor nodes can be fine-tuned for domain-specific insight (e.g., SEO vs. eCommerce focus)
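### Example: Bright Data extraction step (sketch)

To make the extraction step concrete, here is a hedged sketch of a direct call to Bright Data's Web Unlocker request endpoint with the configured zone. The zone name, bearer token, and query are placeholders; confirm the endpoint and parameters against your Bright Data dashboard before relying on them.

```typescript
// Hedged sketch of the SERP extraction step, assuming Bright Data's Web Unlocker
// /request endpoint. ZONE, the bearer token, and the query are placeholders.
const ZONE = "your_web_unlocker_zone";
const TOKEN = process.env.BRIGHT_DATA_TOKEN ?? "";
const query = "best project management tools";

const res = await fetch("https://api.brightdata.com/request", {
  method: "POST",
  headers: { Authorization: `Bearer ${TOKEN}`, "Content-Type": "application/json" },
  body: JSON.stringify({
    zone: ZONE,
    url: `https://www.google.com/search?q=${encodeURIComponent(query)}`,
    format: "raw", // raw HTML, which the Gemini node then parses into structured JSON
  }),
});
console.log((await res.text()).slice(0, 500)); // preview of the returned SERP HTML
```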
by Amit Mehta
## How it Works

This workflow reads sheet details from a source Google Spreadsheet, creates a new spreadsheet, replicates the sheet structure, enriches the content by reading data, and writes it into the corresponding sheets in the new spreadsheet. The process is looped for every sheet, providing an automated way to duplicate and transform structured data.

## Use Case

- Automate duplication and data enrichment for multi-sheet Google Spreadsheets
- Replicate templates across new documents with consistent formatting
- Data team workflows requiring repetitive structured Google Sheets setup

## Setup Instructions

### 1. Required Google Sheets

- You must have a source spreadsheet with multiple sheets.
- The destination spreadsheet will be created automatically.

### 2. API Credentials

- **Google Sheets OAuth2**: connect to both read and write spreadsheets.
- **HTTP Request Auth**: if external API headers are needed.

### 3. Configure Fields in Write Sheet

Ensure you define appropriate columns and mapping for the destination sheet.

## Workflow Logic

1. **Manual Trigger**: Starts the flow on user demand.
2. **Create New Spreadsheet**: Generates a blank spreadsheet.
3. **HTTP Request**: Retrieves all sheet names from the source spreadsheet.
4. **JavaScript Code**: Extracts titles and metadata from the HTTP response (sketched below).
5. **Loop Over Sheets**: Iterates through each sheet retrieved.
6. **Delete Default Sheet**: Removes the placeholder 'Sheet1'.
7. **Create Sheets**: Replicates each original sheet in the new document.
8. **Read Spreadsheet1**: Pulls data from the matching original sheet.
9. **Write Sheet**: Appends the data to the newly created sheets.

## Node Descriptions

| Node Name | Description |
|-----------|-------------|
| Manual Trigger | Starts the workflow manually by the user. |
| Create New Spreadsheet | Creates a new Google Spreadsheet for output. |
| HTTP Request | Fetches metadata from the source spreadsheet, including sheet names. |
| Code | Processes sheet metadata into a list for iteration. |
| Loop Over Items | Loops over each sheet to replicate and populate. |
| Google Sheets2 | Deletes the default 'Sheet1' from the new spreadsheet. |
| Create Sheets | Creates a new sheet matching each source sheet. |
| Read Spreadsheet1 | Reads data from the source sheet. |
| Write sheet | Writes the data into the corresponding new sheet. |

## Customization Tips

- Adjust the Google Sheet title to be dynamic or user-input driven
- Add filtering logic before writing data
- Append custom audit columns like 'Timestamp' or 'Processed By'
- Enable logging or Slack alerts after each sheet is created

## Required Files

| File Name | Purpose |
|-----------|---------|
| My_workflow_4.json | Main workflow JSON file for sheet duplication and enrichment |

## Testing Tips

- Test with a spreadsheet containing 2-3 simple sheets
- Validate whether all sheets are duplicated
- Check if columns and data structure remain intact
- Watch for authentication issues in Google Sheets nodes

## Suggested Tags & Categories

#GoogleSheets #Automation #DataEnrichment #Workflow #Spreadsheet
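### Example: extracting sheet titles in the Code node (sketch)

The Code node's job, turning the HTTP Request metadata into one item per sheet, can be sketched as below. The response shape follows the Google Sheets API v4 spreadsheets.get format; the exact item structure you return should match what your Loop Over Items node expects.

```typescript
// Sketch of the Code node's logic: turn the spreadsheets.get response into one item per
// sheet for the loop. The response shape (sheets[].properties) follows the Sheets API v4.
interface SheetsGetResponse {
  sheets: { properties: { sheetId: number; title: string; index: number } }[];
}

function extractSheets(response: SheetsGetResponse) {
  return response.sheets.map((s) => ({
    json: { title: s.properties.title, sheetId: s.properties.sheetId, index: s.properties.index },
  }));
}

const sample: SheetsGetResponse = {
  sheets: [
    { properties: { sheetId: 0, title: "Orders", index: 0 } },
    { properties: { sheetId: 123, title: "Customers", index: 1 } },
  ],
};
console.log(extractSheets(sample));
```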
by Eric
## Why use this

You need to delete (many) posts on a WordPress website and also delete the featured image associated with each post. This automation cuts hours of rote work down to a fraction.

## How it works

1. Set your WordPress URL in the manual trigger node.
2. Set your WP post search parameters (the WP API returns 10 posts by default; you could also set up pagination to scale this automation beyond 10 posts per execution).
3. Decide on (and build) your filter/approval process (see the REST sketch below).

## What you can expect

- This automation is set up to run on the 10 oldest pending posts, oldest first.
- If you remove the 'Filter' node from the workflow, another 10 posts will be returned from WP after each run.

## Notes on Filter/Approval

This is arbitrary and depends on your own use case. Maybe you have an editor who needs to approve the post deletion. You might want to get approval by email, Slack message, or a ticketing system. Or maybe you just want to monitor the process and spare specific posts from deletion.

I used the Filter node to only grab the first item (itemIndex < 1), which in this case was the oldest pending post.

This could also be expanded into two separate workflows:

1. One triggered when a pending post is created, which sends an approval request.
2. A second triggered by the approval/rejection, which either publishes or deletes the post depending on the approval result.

This would require another HTTP request, similar to the DELETE post request, that publishes the post instead.
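As a reference for what the HTTP Request nodes do, here is a hedged sketch of the same flow against the WordPress REST API using an application password. The site URL and credentials are placeholders; force=true permanently deletes instead of trashing, so test on a staging site first.

```typescript
// Hedged sketch of the deletion flow against the WordPress REST API, using an
// application password. WP_URL and the credentials are placeholders.
const WP_URL = "https://example.com";
const headers = { Authorization: "Basic " + btoa("USERNAME:APPLICATION_PASSWORD") };

// 10 oldest pending posts, oldest first (mirrors the search parameters in the trigger node).
const posts = await fetch(
  `${WP_URL}/wp-json/wp/v2/posts?status=pending&orderby=date&order=asc&per_page=10`,
  { headers },
).then((r) => r.json());

for (const post of posts) {
  if (post.featured_media) {
    // delete the featured image attachment first; force=true skips the trash
    await fetch(`${WP_URL}/wp-json/wp/v2/media/${post.featured_media}?force=true`, {
      method: "DELETE",
      headers,
    });
  }
  await fetch(`${WP_URL}/wp-json/wp/v2/posts/${post.id}?force=true`, { method: "DELETE", headers });
}
```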
by Alex
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## How It Works

This template orchestrates a multi-step workflow that constructs a comprehensive four-zone automation matrix (Green, Yellow, Red, and White) grounded in the Human Agency Scale (HAS).

When a user sends a job title via Telegram, the workflow routes both text and voice messages appropriately: voice messages are transcribed via OpenAI's Whisper, while text inputs bypass transcription (sketched below). Both streams merge into a single data flow. The AI Agent node, powered by GPT-4, analyzes the user's profession and core tasks. It also leverages live context by calling the Tavily search tool, ensuring the analysis incorporates up-to-date information. After the evaluation, the workflow formats the completed matrix, with detailed task examples and rationales for each zone, and returns it to the user via Telegram.

## Setup Instructions

1. Create an OpenAI credential in n8n (model: GPT-4.1 mini).
2. Add a Tavily credential with your API key (a free plan is available).
3. Configure a Telegram Bot credential with your bot's API token.
4. Import this JSON as a new workflow in n8n and map credentials in each node.
5. Activate the workflow, test by sending sample job titles, and adjust node timeouts and webhook settings as needed.

## Requirements

- n8n v1.0.0 or higher
- Active OpenAI API key (GPT-4.1 mini access)
- Tavily API key for web context search
- Telegram Bot token with a correctly configured webhook
- Stable internet connectivity

## Audience & Problem

This template is designed for consultants, HR professionals, and analysts who need a scalable, standardized approach to evaluate which routine tasks in a given profession can be automated, which require human oversight, and which should remain manual to preserve strategic judgment, creativity, and expertise.
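The text-versus-voice routing decision at the start of the workflow can be sketched like this. The field names follow the Telegram Bot API message object; the branch labels are illustrative stand-ins for the actual n8n switch outputs.

```typescript
// Sketch of the routing decision: voice notes go to Whisper transcription first,
// plain text goes straight to the AI Agent. Field names follow the Telegram Bot API.
interface TelegramMessage {
  text?: string;
  voice?: { file_id: string; duration: number };
}

function routeMessage(msg: TelegramMessage): { branch: "transcribe" | "direct"; payload: string } {
  if (msg.voice) return { branch: "transcribe", payload: msg.voice.file_id }; // send to Whisper
  return { branch: "direct", payload: msg.text ?? "" };                        // text goes to the agent
}

console.log(routeMessage({ text: "Financial analyst" }));
console.log(routeMessage({ voice: { file_id: "AwACAgQAAxkBAAI...", duration: 4 } }));
```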
by Joseph LePage
Transform your local n8n instance into a powerful chat interface using any local and private Ollama model, with zero cloud dependencies. This workflow creates a structured chat experience that processes messages locally through a language model chain and returns formatted responses.

## How it works

- Chat messages trigger the workflow
- Messages are processed through Llama 3.2 via Ollama (or any other Ollama-compatible model)
- Responses are formatted as structured JSON
- Error handling ensures robust operation

## Set up steps

1. Install n8n and Ollama
2. Download the Llama 3.2 model (or another model)
3. Configure Ollama API credentials
4. Import and activate the workflow (see the request sketch below)

This template provides a foundation for building AI-powered chat applications while maintaining full control over your data and infrastructure.
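For a sense of what the workflow asks the local model to do, here is a hedged sketch of a structured-JSON chat request against a default local Ollama install. The model name assumes Llama 3.2 has been pulled; swap in any other Ollama-compatible model you prefer.

```typescript
// Hedged sketch of a non-streaming chat request to a local Ollama instance, constrained
// to JSON output to mirror the workflow's structured responses. Assumes the default
// localhost:11434 install and that the llama3.2 model has been pulled.
const res = await fetch("http://localhost:11434/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3.2",
    stream: false,
    format: "json", // ask Ollama for valid JSON output
    messages: [{ role: "user", content: 'Reply as {"answer": string}. What is n8n?' }],
  }),
});
const data = await res.json();
console.log(data.message?.content); // the model's JSON-formatted reply
```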