by Khairul Muhtadin
Stop wasting hours watching long videos. This n8n workflow acts as your personal "TL;DW" (Too Long; Didn't Watch) assistant. It automatically pulls YouTube transcripts using Decodo, analyzes them with Google Gemini, and sends a detailed summary straight to your Telegram.

## Why You Need This

- **Save Time:** Turn a 2-hour video into a 5-minute read (95% faster).
- **Don't Miss a Thing:** Captures key points, chapters, tools mentioned, and quotes that you might miss while skimming.
- **Instant Results:** Get a structured summary in Telegram within 30-60 seconds.
- **Multi-Language:** Works with any video language that has YouTube captions.

## Who Is This For?

- **Creators & Marketers:** Spy on competitor strategies and extract tools without watching endless footage.
- **Students:** Turn lecture recordings into instant study notes.
- **Busy Pros:** Digest conference talks and webinars on the go.

## How It Works

1. **Send Link:** You message a YouTube link to your Telegram bot.
2. **Scrape:** The bot uses the Decodo API to grab the video transcript and metadata (views, chapters, etc.).
3. **Analyze:** Google Gemini reads the text and writes a structured summary (overview, takeaways, tools).
4. **Deliver:** You receive the formatted summary in chat.

## Setup Guide

### What You Need

- **n8n instance** (to run the workflow)
- **Telegram Bot Token** (free via @BotFather)
- **Decodo Scraper API Key** (for YouTube data - Get it here)
- **Google Gemini API Key** (for the AI - Get it here)

### Quick Installation

1. **Import:** Load the JSON file into your n8n instance.
2. **Credentials:** Add your API keys for Telegram, Decodo, and Google Gemini in the n8n credentials section.
3. **Configure:** In the "Alert Admin" node, set the chatId to your Telegram User ID (find it via @userinfobot). (Optional) Change the languageCode in the Config node if you want non-English transcripts.
4. **Test:** Send a YouTube link to your bot. You should see a "Processing..." message followed by your summary!

## Troubleshooting & Tips

- **"Not a YouTube URL":** Make sure you are sending a standard youtube.com or youtu.be link (see the sketch below).
- **No Transcript:** The video must have captions (auto-generated or manual) for this to work.
- **Customization:** You can edit the AI prompt in the "Generate TLDR" node to change how the summary looks (e.g., "Make it funny" or "Focus on technical details").

Created by: Khaisa Studio
Category: AI-Powered Automation
Tags: YouTube, AI, Telegram, Summarization, Decodo, Gemini

Need custom workflows? Contact us
Connect with the creator: Portfolio • Workflows • LinkedIn • Medium • Threads
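As a quick technical aside on the "Not a YouTube URL" check and the final delivery step: both reduce to a link-format test and one Telegram Bot API call. Below is a minimal TypeScript sketch of that logic outside n8n; the bot token and chat ID are placeholders you would normally keep in n8n credentials.

```typescript
// Minimal sketch: validate a YouTube link and deliver a summary via the Telegram Bot API.
// BOT_TOKEN and CHAT_ID are placeholders -- in the workflow these live in n8n credentials.
const BOT_TOKEN = "123456:ABC-your-bot-token";
const CHAT_ID = "123456789";

// Accepts standard youtube.com/watch?v=... and youtu.be/... links; returns the 11-character video ID or null.
function extractVideoId(url: string): string | null {
  const match = url.match(
    /(?:youtube\.com\/watch\?(?:.*&)?v=|youtu\.be\/)([A-Za-z0-9_-]{11})/,
  );
  return match ? match[1] : null;
}

async function sendSummary(videoUrl: string, summary: string): Promise<void> {
  const videoId = extractVideoId(videoUrl);
  if (!videoId) throw new Error("Not a YouTube URL"); // mirrors the workflow's error branch

  // Telegram Bot API sendMessage delivers the formatted summary to the chat.
  await fetch(`https://api.telegram.org/bot${BOT_TOKEN}/sendMessage`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ chat_id: CHAT_ID, text: summary, parse_mode: "Markdown" }),
  });
}

sendSummary("https://youtu.be/dQw4w9WgXcQ", "TL;DW: key takeaways...").catch(console.error);
```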
by Yves Tkaczyk
Automated image processing for an e-commerce product catalog

## Use cases

Monitor a Google Drive folder, process each image based on the prompt defined in Workflow Configuration, and save the new image to the specified output Google Drive folder. Maintain a processing log in Google Sheets.

👍 This use case can be extended to any scenario requiring batch image processing, for example unifying the look and feel of team photos on a company website.

## How it works

Trigger: watches a Google Drive folder for new or updated files. The workflow downloads the image, processes it using Google Gemini (Nano Banana), and uploads the new image to the specified output folder.

## How to use

1. Google Drive and Google Sheets nodes: create Google credentials with access to Google Drive and Google Sheets. Read more about Google Credentials. Update all Google Drive and Google Sheets nodes (6 nodes total) to use these credentials.
2. Gemini AI node: create Google Gemini (PaLM) API credentials. Read more about Google Gemini (PaLM) credentials. Update the Edit Image node to use the Gemini API credentials (a rough sketch of the underlying API call follows this section).
3. Create a Google Sheets spreadsheet following the steps in Google Sheets Configuration (see right ➡️). Ensure the spreadsheet can be accessed as Editor by the account used for the Google credentials.
4. Create input and output directories in Google Drive. Ensure these directories are accessible by the account used for the credentials.
5. Update the File Created, File Updated and Workflow Configuration nodes following the steps in the green Notes (see right ➡️).

## Requirements

- Google account with Google API access
- Google AI Studio account with the ability to create a Google Gemini API key
- Basic n8n knowledge: understanding of triggers, expressions, and credential management

## Who’s it for

Anyone wanting to batch process images for a product catalog. Other use cases are applicable. Please reach out if you need help customizing this workflow.

## 🔒 Security

All credentials are stored securely using n8n's credential system. The only potentially sensitive information stored in the workflow is the Google Drive folder and Sheet IDs. These should be secured according to your organization’s needs.

Need Help? Reach out on LinkedIn or Ask in the Forum!
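For orientation, here is a rough sketch of what the Edit Image step does under the hood: a single Gemini generateContent request carrying the edit prompt plus the image as inline base64 data. The API key is a placeholder and the model name is an assumption (Gemini image-capable model names change often); in the workflow itself the built-in Gemini node and its credentials handle this call.

```typescript
// Sketch: send one image plus an edit prompt to the Gemini generateContent endpoint.
// GEMINI_API_KEY is a placeholder; MODEL is an assumption -- check the current image-capable model name.
const GEMINI_API_KEY = "your-gemini-api-key";
const MODEL = "gemini-2.5-flash-image"; // assumed; adjust to the model configured in the node

async function editImage(imageBase64: string, prompt: string): Promise<unknown> {
  const url = `https://generativelanguage.googleapis.com/v1beta/models/${MODEL}:generateContent`;
  const response = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json", "x-goog-api-key": GEMINI_API_KEY },
    body: JSON.stringify({
      contents: [{
        parts: [
          { text: prompt },                                               // the Workflow Configuration prompt
          { inline_data: { mime_type: "image/png", data: imageBase64 } }, // the file downloaded from Drive
        ],
      }],
    }),
  });
  // The edited image comes back base64-encoded inside the response candidates.
  return response.json();
}
```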
by octik5
🤖 This n8n workflow automatically posts new articles from an RSS feed to your Telegram channel. It enhances article text using AI, adds a branded watermark to the article image, and keeps your channel updated with fresh and consistent content.

## Use Cases

- Automatically publish RSS feed updates to Telegram.
- Use AI to rewrite, summarize, or stylize text.
- Add watermarked visuals to keep your content on-brand.
- Perfect for news aggregators, media channels, and content creators.

## How It Works

1. RSS Trigger: Monitors an RSS feed for new articles.
2. Check Google Sheet: Compares links to avoid reposting.
3. Fetch Article: Retrieves the full article content for new links.
4. AI Enhancement: Uses an AI agent to improve readability and engagement.
5. Image Watermarking: Fetches the main image and adds a watermark.
6. Telegram Publishing: Sends the final AI-enhanced post to your Telegram channel (see the sketch below).

## Setup Steps

- **Google Sheet:** Create and share a sheet to store processed links.
- **RSS Source:** Add your feed URL to the RSS Trigger node.
- **AI Agent:** Configure a prompt and model (e.g., OpenRouter or Gemini).
- **Telegram Bot:** Add your bot token and chat ID for message delivery.
- **Test & Automate:** Run once manually, then let it update automatically.

## Tips

- You can tweak the AI prompt to match your tone (e.g., newsy, casual, concise).
- Adjust watermark placement, font, and color for brand consistency.
- AI models may have usage costs or regional restrictions.

✅ Key Advantage: Fully automated — from RSS feed detection to AI-enhanced publishing with branded visuals.
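The publishing step referenced above is a single Telegram Bot API call. A minimal sketch, assuming the watermarked image is reachable by URL (the workflow can also upload the binary directly); bot token and channel ID are placeholders kept in n8n credentials:

```typescript
// Sketch: publish an AI-enhanced post with its watermarked image to a Telegram channel.
// BOT_TOKEN and CHANNEL_ID are placeholders supplied via n8n credentials in the workflow.
const BOT_TOKEN = "123456:ABC-your-bot-token";
const CHANNEL_ID = "@your_channel"; // or a numeric chat ID

async function publishPost(imageUrl: string, caption: string): Promise<void> {
  // sendPhoto accepts a public image URL plus a caption (HTML formatting enabled here).
  const res = await fetch(`https://api.telegram.org/bot${BOT_TOKEN}/sendPhoto`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      chat_id: CHANNEL_ID,
      photo: imageUrl,
      caption,
      parse_mode: "HTML",
    }),
  });
  if (!res.ok) throw new Error(`Telegram error: ${res.status}`);
}

publishPost("https://example.com/article-watermarked.jpg", "<b>Fresh article</b>: AI summary...").catch(console.error);
```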
by Vasyl Pavlyuchok
## What this template does

This workflow turns the arXiv AI feed into a daily research assistant. Every morning it fetches the latest Artificial Intelligence papers from arXiv.org, deduplicates them, stores one page per paper in Notion (with metadata + PDF link), generates a Deep Research Summary of each PDF using Gemini, and finally posts a short update to a Telegram channel with links to both the paper and the Notion summary.

## Who is this for & what problem it solves

This template is designed for founders, researchers, builders and curious professionals who want to stay up-to-date with AI research without reading every paper in full. It solves the “information overload” problem: instead of manually checking arXiv and skimming PDFs, you get a curated daily feed with human-style explanations stored in your own Notion database.

## Use Cases

- Daily AI research digest for solo founders or small teams.
- Private “AI research hub” in Notion for your company or lab.
- Telegram channel that shares the latest AI papers plus plain-English summaries.
- Personal learning pipeline: track, tag and revisit important papers over time.

## How it works (workflow overview)

1. Scheduled trigger runs every day at 08:00.
2. HTTP Request pulls the latest AI papers from arXiv’s API (see the sketch below).
3. Results are converted from XML to JSON and cleaned.
4. A time-window filter keeps only recent papers and removes duplicates.
5. For each paper, a Notion page is created (metadata + PDF URL).
6. Gemini reads the PDF and returns a structured, multi-chunk summary.
7. Each chunk is appended to the same Notion page as rich-text blocks.
8. A Telegram message is sent with title, short abstract, PDF link and Notion link.

## Setup (step-by-step)

1. Create a Notion database and connect your Notion integration.
2. Map the properties in the Register to Notion Database node (title, arxiv_id, abstract, authors, categories/tags, published date, pdf URL).
3. Add your Gemini API key and model in the Analyze doc (Prompt Ultra-Pro) node.
4. Add your Telegram bot token and chat_id in the Send a text message node.
5. (Optional) Adjust the arXiv query in HTTP Request to focus on your preferred AI categories or keywords.
6. Enable the Scheduled Daily Trigger when you’re ready to run it in production.

## Customization options

- Change the arXiv search query (keywords, categories, max_results).
- Modify the time window logic (e.g. 24h, 48h, or no filter).
- Adapt the Notion properties to your own schema (status, tags, priority, etc.).
- Switch the messaging channel (Telegram, Discord, Slack) using similar nodes.
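The arXiv fetch mentioned above is a plain GET against arXiv's public Atom export API. A minimal sketch of the query; the category and result count are illustrative, and the regex is only a crude stand-in for the workflow's dedicated XML-to-JSON node:

```typescript
// Sketch: pull the latest cs.AI submissions from arXiv's public export API (returns Atom XML).
async function fetchLatestAiPapers(maxResults = 25): Promise<string[]> {
  const params = new URLSearchParams({
    search_query: "cat:cs.AI",        // adjust to your preferred categories/keywords
    sortBy: "submittedDate",
    sortOrder: "descending",
    max_results: String(maxResults),
  });
  const res = await fetch(`http://export.arxiv.org/api/query?${params}`);
  const xml = await res.text();

  // Crude stand-in for the XML-to-JSON step: extract entry titles only.
  const titles = [...xml.matchAll(/<entry>[\s\S]*?<title>([\s\S]*?)<\/title>/g)]
    .map((m) => m[1].trim());
  return titles;
}

fetchLatestAiPapers().then((titles) => console.log(titles)).catch(console.error);
```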
by Rapiwa
## Who is this for?

This workflow is for Shopify store owners, customer success, and marketing teams who want to automatically check customers’ WhatsApp numbers and send personalized messages with discount codes for canceled orders. It helps recover lost sales by reaching out with special offers.

## What This Workflow Does

- Automatically checks for canceled orders on a schedule
- Fetches canceled orders from Shopify (see the sketch at the end of this section)
- Creates personalized recovery messages based on customer data
- Verifies customers’ WhatsApp numbers via Rapiwa
- Logs results in Google Sheets: “Verified & Sent” for successful messages, “Unverified & Not Sent” for unverified numbers

## Requirements

- Shopify store with API access enabled
- Shopify API credentials with access to orders and customer data
- Rapiwa account and a valid Bearer token
- Google account with Sheets access and OAuth2 credentials

## Setup plan

1. Add your credentials
   - Rapiwa: Create an HTTP Bearer credential in n8n and paste your token (example name: Rapiwa Bearer Auth).
   - Google Sheets: Add an OAuth2 credential (example name: Google Sheets).
2. Set up Shopify
   - Replace your_shopify_domain with your real Shopify domain.
   - Replace your_shop_access-token with your actual Shopify API token.
3. Set up Google Sheets
   - Update the example spreadsheet ID and sheet gid with your own.
   - Make sure your sheet’s column headers match the mapping keys exactly—same spelling, case, and no extra spaces.
4. Configure the Schedule Trigger
   - Choose how often you want the workflow to check for canceled orders (daily, weekly, etc.).
5. Check the HTTP Request nodes
   - Verify endpoint: should call Rapiwa’s verifyWhatsAppNumber.
   - Send endpoint: should use Rapiwa’s send-message API with your template (includes customer name, reorder link, discount code).

## Google Sheet Column Structure

The Google Sheets nodes in the flow append rows with these columns. Make sure the sheet headers match exactly. A Google Sheet formatted like this ➤ sample

| Name | Number | Item Name | Coupon | Item Link | Validity | Status |
| --- | --- | --- | --- | --- | --- | --- |
| Abdul Mannan | 8801322827799 | Samsung Galaxy S24 Ultra 5G 256GB-512GB-1TB | REORDER5 | Re-order Link | verified | sent |
| Abdul Mannan | 8801322827790 | Samsung Galaxy S24 Ultra 5G 256GB-512GB-1TB | REORDER5 | Re-order Link | unverified | not sent |

## Important Notes

- Do not hard-code API keys or tokens; always use n8n credentials.
- Google Sheets column header names must match the mapping keys used in the nodes. Trailing spaces are common accidental problems — trim them in the spreadsheet or adjust the mapping.
- The message templates reference customer data; update the templates if you need to reference different data.
- The workflow processes cancelled orders in batches to avoid rate limits. Adjust the batch size if needed.

## Useful Links

- **Install Rapiwa**: How to install Rapiwa
- **Dashboard:** https://app.rapiwa.com
- **Official Website:** https://rapiwa.com
- **Documentation:** https://docs.rapiwa.com
- **Shopify API Documentation:** https://shopify.dev/docs/admin-api

## Support & Help

- **WhatsApp**: Chat on WhatsApp
- **Discord**: SpaGreen Community
- **Facebook Group**: SpaGreen Support
- **Website**: https://spagreen.net
- **Developer Portfolio**: Codecanyon SpaGreen
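For reference, fetching cancelled orders comes down to one call to the Shopify Admin REST API. A hedged sketch; the shop domain, access token and API version are placeholders, and the workflow's HTTP Request node does the same thing with stored credentials:

```typescript
// Sketch: list cancelled orders from the Shopify Admin REST API.
// SHOP_DOMAIN and ACCESS_TOKEN are placeholders; the API version may need updating.
const SHOP_DOMAIN = "your_shopify_domain.myshopify.com";
const ACCESS_TOKEN = "your_shop_access_token";

async function fetchCancelledOrders(): Promise<unknown[]> {
  const url = `https://${SHOP_DOMAIN}/admin/api/2024-01/orders.json?status=cancelled&limit=50`;
  const res = await fetch(url, {
    headers: { "X-Shopify-Access-Token": ACCESS_TOKEN },
  });
  if (!res.ok) throw new Error(`Shopify error: ${res.status}`);
  const { orders } = (await res.json()) as { orders: unknown[] };
  return orders; // each order carries the customer phone, line items, etc. used in the recovery message
}

fetchCancelledOrders().then((orders) => console.log(orders.length)).catch(console.error);
```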
by Patrick Jennings
Sleeper NFL Team Chatbot Starter

A Telegram chatbot built to look up your fantasy football team in the Sleeper app and return your roster details, player names, positions, and team info. This starter workflow is perfect for users who want a simple, conversational way to view their Sleeper team in-season or pre-draft.

## What It Does

When a user types their Sleeper username into Telegram, this workflow:

1. Extracts the username from Telegram
2. Pulls their Sleeper User ID
3. Retrieves their Leagues and selects the first one (by default)
4. Pulls the full league Rosters
5. Finds the matching roster owned by that user
6. Uses player_ids to look up full player info from a connected database (e.g. Airtable or Google Sheets)
7. Returns a clean list of player names, positions, and teams via Telegram

(A sketch of the underlying Sleeper API calls is included at the end of this section.)

## Requirements

To get this running, you’ll need:

- A Telegram bot (set up through BotFather)
- A Sleeper Fantasy Football account
- A synced player database that matches player_id to full player details (we recommend using the companion template: Sleeper NFL Players Daily Sync)

## Setup Instructions

1. Import the workflow into your n8n instance
2. Add the required credentials:
   - Telegram (API Key from BotFather)
   - Airtable (or replace with another database method like Google Sheets or an HTTP request to a hosted JSON file)
3. Trigger the workflow by sending your exact Sleeper username to the bot
4. Your full team roster will return as a formatted message

> If the user is in multiple Sleeper leagues, the current logic returns the first league found.

## Example Output

You have 19 players on your roster: Cam Akers (RB - NO), Jared Goff (QB - DET), ...

## Customization Notes

- Replace the Telegram Trigger with any other input method (webhook, form input, etc.)
- Replace the Airtable node with Google Sheets, a SQL DB, or even a local file if preferred
- You can hardcode a Sleeper username if you're using this for a single user

## Related Templates

Sleeper NFL Players Daily Sync (syncs player_id to player name, position, team). Create the Player Sync first, then either integrate it into this template or create a subworkflow from it and use the most recent data set.

## Difficulty Rating & Comment (from the author)

3 out of 10 if this ain't your first rodeo, respectfully. Just a little bit more work on adding the Players Sync as your data table and knowing how to GET from Sleeper. If you use Sleeper for fantasy football, let's go win some games!
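Here is the promised sketch of the Sleeper lookups: all of them are unauthenticated GET calls to Sleeper's public read-only API, chained username → user ID → first league → roster. The season year is an assumption for illustration:

```typescript
// Sketch: resolve a Sleeper username to that user's roster in their first NFL league.
// Sleeper's read-only API needs no authentication; the season below is illustrative.
const SEASON = "2024";

async function getRosterPlayerIds(username: string): Promise<string[]> {
  const user = await (await fetch(`https://api.sleeper.app/v1/user/${username}`)).json();
  const leagues = await (
    await fetch(`https://api.sleeper.app/v1/user/${user.user_id}/leagues/nfl/${SEASON}`)
  ).json();
  const league = leagues[0]; // default behavior: take the first league found
  const rosters = await (
    await fetch(`https://api.sleeper.app/v1/league/${league.league_id}/rosters`)
  ).json();
  const mine = rosters.find((r: { owner_id: string }) => r.owner_id === user.user_id);
  return mine?.players ?? []; // player_ids, resolved to names via the synced player database
}

getRosterPlayerIds("your_sleeper_username").then(console.log).catch(console.error);
```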
by Curso Bot com IA
## Template Overview

This workflow demonstrates how to build a simple Telegram bot that can schedule events, check service prices, and query company documents using AI integrated with MCP and RAG. It’s designed to show how n8n can connect conversational interfaces with internal tools in a clear and scalable way.

Key Concepts (explained simply):

- MCP (Model Context Protocol): A protocol that lets the AI agent connect to external services (like Google Calendar, Docs, Sheets) through MCP Clients and Servers. Think of it as the “bridge” between the bot and your tools.
- RAG (Retrieval-Augmented Generation): A method where the AI retrieves information from documents before generating a response. This ensures answers are accurate and based on your actual data, not just the AI’s memory.

## ⚙️ Setup Instructions (step-by-step)

1. Create a Telegram Bot: Use BotFather to generate a bot and get the API token.
2. Configure Google Services: Make sure you have access to Google Calendar, Docs, and Sheets. Connect them via MCP Server so the agent can call these tools.
3. Set up Redis Memory: Create an Upstash Redis account. Configure it in the workflow to store conversation history.
4. Import the Template into n8n: Load the workflow and update credentials (Telegram, Google, Redis).
5. Test the Bot: Send a message like “Schedule laptop maintenance tomorrow” and check if it creates an event in Google Calendar.

## 🛠 Troubleshooting Section

- Bot not responding? Verify your Telegram API token is correct.
- Google services not working? Check that your MCP Server is running and properly connected to Calendar, Docs, and Sheets.
- Conversation context lost? Ensure Redis memory is configured and accessible.
- Wrong date/time? Confirm that relative dates (“tomorrow”, “next week”) are being converted into ISO format correctly (see the sketch below).

## Customization Examples

This template is flexible and can be adapted to different scenarios. Here are some ideas:

- Change the communication channel: Replace the Telegram Trigger with WhatsApp, Slack, or a Webhook to fit your preferred platform.
- Expand document sources: Connect additional Google Docs or Sheets, or integrate with other storage (e.g., Notion, Confluence, or internal databases) to broaden the bot’s knowledge base.
- Add new services: Extend the workflow to handle more requests, such as booking meeting rooms, checking inventory, or creating support tickets.
- Personalize responses: Customize the AI Agent’s tone and style to match your company’s branding (formal, friendly, or technical).
- Segment agents by role: Create specialized agents (e.g., one for scheduling, one for pricing, one for troubleshooting) to keep the workflow modular and scalable.
- Integrate external APIs: Connect to services like Google Calendar, CRM systems, or helpdesk platforms to automate more complex tasks.
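On the relative-date troubleshooting point above: the agent (or a Code node) ultimately needs to turn phrases like "tomorrow" into ISO dates before calling Google Calendar. A minimal sketch of that conversion; the phrase mapping here is illustrative, since real workflows usually delegate this to the AI model or a date library:

```typescript
// Sketch: convert a few relative date phrases into ISO 8601 dates for calendar calls.
// The phrase list is illustrative; production workflows typically let the AI agent or a date library handle this.
function relativeToIsoDate(phrase: string, now: Date = new Date()): string {
  const d = new Date(now);
  switch (phrase.toLowerCase()) {
    case "today":
      break;
    case "tomorrow":
      d.setDate(d.getDate() + 1);
      break;
    case "next week":
      d.setDate(d.getDate() + 7);
      break;
    default:
      throw new Error(`Unrecognized phrase: ${phrase}`);
  }
  return d.toISOString().slice(0, 10); // e.g. "2025-01-31"
}

console.log(relativeToIsoDate("tomorrow")); // the day after the current date
```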
by Harry Gunadi Permana
Get Forex Factory news releases to Telegram and Google Sheets. Record news data and the live price from MyFxBook for the affected currency pairs.

This n8n template demonstrates how to capture Actual data releases as quickly as possible for trading decisions.

## Use cases

- Get notified if the actual data release is positive or negative for the relevant currency.
- Use the Telegram chat message about the news release as a trigger to open a trading position in MetaTrader 4.
- Record news data and live price to Google Sheets for analysis.

## Currency Pairs

EURUSD, GBPUSD, AUDUSD, NZDUSD, USDJPY, USDCHF, USDCAD, XAUUSD

## How it works

1. A news release event acts as the trigger.
2. Only news with a numerical Forecast value will be processed. Events that cannot be measured numerically (e.g., speeches) are ignored.
3. Extract news details: currency, impact level (high/medium), release date, and news link.
4. Wait 10 seconds to ensure the Actual value is available on the news page.
5. Scrape the Actual value from the news link using Airtop. If the Actual value is not available, wait another 5 seconds and retry scraping.
6. Extract both Actual and Forecast values from the scraped content.
7. Remove non-numeric characters (%, K, M, B, T) and convert the values to numbers (see the sketch below).
8. Determine the effect: if the Actual value is lower than the Forecast value (and lower is better), send it to the True branch. Otherwise, send it to the False branch.
9. Record news data and live price from MyFxBook to Google Sheets.

## How to use

1. Enter all required credentials.
2. Create or download a Google Sheets file like this one in your Google Drive: https://docs.google.com/spreadsheets/d/1OhrbUQEc_lGegk5pRWWKz5nrnMbTZGT0lxK9aJqqId4/edit?usp=drive_link
3. Run the workflow.

## Requirements

- Google Calendar credentials
- Airtop API key
- Telegram Chat ID
- Telegram Bot API token
- Google Drive API enabled in Google Cloud Console
- Google Sheets credentials

Need Help? Join the Discord or ask in the Forum! Thank you!

Update Sept 26, 2025: Added a new edit node.
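The value-cleaning and branching logic described above is small enough to show directly. A sketch of that normalization: it strips the unit suffixes rather than scaling them, which matches the workflow's description (Actual and Forecast share the same unit, so the comparison is unaffected). The lowerIsBetter flag is an assumed name for the per-event direction setting:

```typescript
// Sketch: clean Forex Factory "Actual" / "Forecast" strings and decide the news effect.
// Suffixes are stripped rather than scaled, mirroring the workflow's described behavior.
function toNumber(raw: string): number {
  const cleaned = raw.replace(/[%KMBT,]/gi, "").trim(); // e.g. "2.5%" -> "2.5", "250K" -> "250"
  const value = Number(cleaned);
  if (Number.isNaN(value)) throw new Error(`Cannot parse value: ${raw}`);
  return value;
}

// lowerIsBetter mirrors the True/False branch: e.g. lower unemployment is positive for the currency.
function newsEffect(actualRaw: string, forecastRaw: string, lowerIsBetter: boolean): "positive" | "negative" {
  const actual = toNumber(actualRaw);
  const forecast = toNumber(forecastRaw);
  const better = lowerIsBetter ? actual < forecast : actual > forecast;
  return better ? "positive" : "negative";
}

console.log(newsEffect("3.2%", "3.4%", true)); // "positive"
```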
by Vitorio Magalhães
## Who’s it for

This template is designed for individuals who want to gain full control over their personal finances without the hassle of manual tracking. Ideal for freelancers, small business owners, or anyone who wants a simple, automated way to monitor income and expenses.

## How it works / What it does

Using n8n, Telegram, and Google Sheets, this workflow allows you to log, edit, and query your financial transactions through simple Telegram messages. The AI interprets your input—whether text or audio—and automatically categorizes your income and expenses. Responses are delivered fully formatted in Telegram HTML, giving you clean, readable summaries and insights.

Features include:

- Add, edit, and delete transactions automatically
- Query totals and category-specific expenses, e.g., “How much did I spend on food this month?”
- Generate financial summaries and monthly reports
- Automatic ID assignment and date handling

## How to set up

1. Deploy this workflow on your self-hosted n8n instance.
2. Connect your Telegram Bot and Google Sheets account.
3. Configure the Google Gemini AI node for message interpretation.
4. Update sheet headers and categories if needed.
5. Start sending messages to your Telegram bot to track expenses instantly.

## How to Set Up the Google Sheet

To use this workflow, you’ll need a Google Sheet with the following structure:

| Column Name | Description |
| --- | --- |
| id | Unique sequential identifier (auto-incremented) |
| type | "income" or "expense" |
| value | Monetary value (format: 1234.56) |
| category | Classification of the transaction |
| payment_method | Payment method used (e.g., card, cash, PIX) |
| description | Details about the transaction |
| date | Transaction date (format: yyyy-MM-dd) |

Make sure the column headers match exactly as shown above, and leave the rows empty for the bot to fill automatically. (A sketch of how a category query maps onto rows like these follows this section.)

## Requirements

- n8n (self-hosted or cloud instance)
- Telegram Bot API
- Google Sheets
- Google Gemini AI or equivalent AI node
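A query like "How much did I spend on food this month?" reduces to filtering the sheet rows by type, category and month, then summing. A minimal sketch over rows shaped like the table above (the sample rows are made up for illustration):

```typescript
// Sketch: sum expenses for one category in a given month, over rows shaped like the sheet above.
interface Transaction {
  id: number;
  type: "income" | "expense";
  value: number;          // e.g. 1234.56
  category: string;
  payment_method: string;
  description: string;
  date: string;           // "yyyy-MM-dd"
}

function monthlyCategoryTotal(rows: Transaction[], category: string, yearMonth: string): number {
  return rows
    .filter((r) => r.type === "expense")
    .filter((r) => r.category.toLowerCase() === category.toLowerCase())
    .filter((r) => r.date.startsWith(yearMonth)) // "2025-01" matches "2025-01-15"
    .reduce((sum, r) => sum + r.value, 0);
}

// Illustrative data only.
const rows: Transaction[] = [
  { id: 1, type: "expense", value: 45.9, category: "food", payment_method: "card", description: "Lunch", date: "2025-01-10" },
  { id: 2, type: "income", value: 3000, category: "salary", payment_method: "PIX", description: "Paycheck", date: "2025-01-05" },
];
console.log(monthlyCategoryTotal(rows, "food", "2025-01")); // 45.9
```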
by Atta
## What it does

The job search process is filled with manual, frustrating tasks—reading endless job descriptions only to find the seniority is wrong, the role requires a language you don't speak, or a "hybrid" job has an impossible commute. This workflow acts as a personal AI assistant that automates the entire top of your job search funnel. It doesn't just find jobs; it reads the full description, checks the commute time from your home, filters by your specific criteria, and even compares the job requirements against your CV to calculate a match score. It's a personalized, decision-making engine that only alerts you to the opportunities that are a perfect fit.

## How it works

The workflow is designed to be fully customized from a single Config node and runs in a multi-layered sequence to find and qualify job opportunities.

1. **Scrape Jobs:** The workflow triggers and uses Apify to find new job postings on LinkedIn based on a list of keywords you define (e.g., "AI Workflow Engineer," "Automation Specialist").
2. **AI Triage & Smart Filtering:** For each job found, a Google Gemini AI performs an initial triage, extracting key data like the job's language, work model (Remote, Hybrid, On-site), and seniority level. The workflow then applies a series of smart filters based on your personal preferences:
   - Language & Seniority: It discards any jobs that don't match your target language and experience level.
   - Commute Check: For hybrid or on-site roles, it uses the Google Maps API to calculate the commute time from your home address and filters out any that exceed your maximum desired travel time (see the sketch at the end of this section).
3. **AI Deep Analysis vs. CV:** For the handful of jobs that pass the initial filters, a second, more advanced Google Gemini agent performs a deep analysis. It compares the job description against your personal CV (which you paste into the config) to generate a summary, a list of key required skills, and a final match score (e.g., 8/10).
4. **Log & Alert:** The final step is action. The full analysis of every qualified job is logged in a Supabase database for your records. However, only jobs with a match score above your set threshold will trigger an immediate, detailed alert in Telegram, ensuring you only focus on the best opportunities.

## Setup Instructions

This workflow is designed for easy setup, with most personal preferences controlled from a single node.

### Required Credentials

- Apify: You will need an Apify API Token.
- Google Cloud: You will need credentials for a Google Cloud project with the Google AI (Gemini) and Google Maps APIs enabled.
- Supabase: You will need your Supabase Project URL and Service Role Key.
- Telegram: You will need a Telegram Bot Token and the Chat ID for the channel where you want to receive alerts.

### Step-by-Step Configuration

Almost all customization is done in the Config node. Open it and set the following parameters to match your personal job search criteria:

- MyCV: Paste the full text of your CV/resume here. This is used by the AI to compare your skills against the job requirements.
- JobKeywords: Search keywords for jobs (e.g., "engineer", "product manager").
- JobsToScrape: The maximum number of relevant job postings to scrape in each run.
- HomeLocation: Your home city and country (e.g., "Breda, Netherlands"). This is used as the starting point for calculating commute times for hybrid or onsite jobs.
- MaxCommuteMinutes: Your personal maximum one-way commute time in minutes. The workflow will filter out any jobs that require a longer travel time.
- TargetLanguage: Your preferred language for job postings. The workflow will filter out any jobs not written in this language. You can list multiple languages, separated by a comma.
- ExperienceLevel: The seniority level you are looking for. The AI will validate this against the job description. The value can be:
  - "" → (Any)
  - "internship" → (Internship)
  - "entry" → (Entry Level)
  - "associate" → (Associate)
  - "mid_senior" → (Mid-Senior Level)
  - "director" → (Director)
  - "executive" → (Executive)
- Under10Applicants: Set to true if you only want to see jobs with fewer than 10 applicants. Set to false to see all jobs.

After setting up the Config node, configure the Supabase and Telegram nodes with your specific credentials and table/chat details.

## How to Adapt the Template

This workflow is a powerful framework for any search and qualification process.

- **Change Job Source:** Swap the **Apify** node to scrape different job boards, or use an RSS Feed Reader node to get jobs from sites that provide feeds.
- **Refine AI Logic:** The prompts in the two **Google Gemini** nodes are the core of the engine. You can edit them to extract different data points, change the scoring criteria, or even ask the AI to evaluate a company's culture based on the tone of the job description.
- **Change the Database:** Replace the **Supabase** node with **Airtable**, **Google Sheets**, or a traditional database node like Postgres to log your results.
- **Modify Alerts:** Change the **Telegram** node to send alerts via **Slack**, **Discord**, or **Email**. You could also add a step to automatically create a draft application or add the job to a personal CRM.
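As promised above, here is a sketch of the commute filter: one Google Maps Distance Matrix request per hybrid/on-site job, compared against MaxCommuteMinutes. The API key and addresses are placeholders, and treat the response parsing as illustrative of the documented response shape rather than the exact expression used in the workflow:

```typescript
// Sketch: reject a job if the one-way commute exceeds MaxCommuteMinutes.
// MAPS_API_KEY is a placeholder; the workflow reads HomeLocation and the limit from the Config node.
const MAPS_API_KEY = "your-google-maps-api-key";

async function commuteMinutes(home: string, jobLocation: string): Promise<number> {
  const params = new URLSearchParams({
    origins: home,              // e.g. "Breda, Netherlands"
    destinations: jobLocation,  // e.g. "Amsterdam, Netherlands"
    key: MAPS_API_KEY,
  });
  const res = await fetch(`https://maps.googleapis.com/maps/api/distancematrix/json?${params}`);
  const data = await res.json();
  const seconds = data.rows[0].elements[0].duration.value; // travel duration in seconds
  return seconds / 60;
}

async function passesCommuteFilter(home: string, jobLocation: string, maxMinutes: number): Promise<boolean> {
  return (await commuteMinutes(home, jobLocation)) <= maxMinutes;
}

passesCommuteFilter("Breda, Netherlands", "Amsterdam, Netherlands", 60).then(console.log).catch(console.error);
```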
by Aliz
This workflow automates the daily backfill of Google Analytics 4 (GA4) data into BigQuery. It fetches 13 essential pre-processed reports (including User Acquisition, Traffic, and E-commerce), uploads them to automatically created tables in BigQuery, and then sends an alert in Telegram.

## How it works

- **Configuration:** You define your Project ID, Dataset, and Date Range in a central "Config" node.
- **Parallel Fetching:** The workflow runs 13 parallel API calls to GA4 to retrieve key reports (e.g., ga4_traffic_sources, ga4_ecommerce_items). A sketch of one such call follows this section.
- **Dynamic Tables:** It automatically checks if the target BigQuery table exists and creates it with the correct schema if it's missing.
- **Telegram Alerts:** After execution, it sends a summary message to Telegram indicating success or failure for the day's run.

## Set up steps

1. Google Credentials (OAuth): This workflow uses n8n's built-in "Google OAuth2 API" credential. You do not need a Service Account key. Connect your Google account and ensure you grant scopes for the Google Analytics API and BigQuery API.
2. Config Node: Open the "Backfill Config" node and fill in:
   - GA4 Property ID
   - Google Cloud Project ID
   - BigQuery Dataset ID
3. Telegram Setup (Optional): If you want alerts, configure the Telegram node with your Bot Token and Chat ID. If not, you can disable/remove this node.
4. Schedule: By default, this is set to run daily. It is recommended to use a date expression (e.g., Today - 2 Days) to allow GA4 time to process data fully before fetching.
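Each of the 13 parallel fetches is a POST to the GA4 Data API's runReport method. A hedged sketch of one such report; the property ID and bearer token are placeholders, and the dimension/metric names shown are illustrative of a traffic-sources style report, not the exact schema the workflow requests:

```typescript
// Sketch: one GA4 Data API runReport call, e.g. for a traffic-sources style report.
// PROPERTY_ID and ACCESS_TOKEN (an OAuth2 bearer token) are placeholders; the dimension/metric
// names are illustrative, not the exact schema of the workflow's 13 reports.
const PROPERTY_ID = "123456789";
const ACCESS_TOKEN = "ya29.your-oauth2-access-token";

async function runTrafficSourcesReport(startDate: string, endDate: string): Promise<unknown> {
  const res = await fetch(
    `https://analyticsdata.googleapis.com/v1beta/properties/${PROPERTY_ID}:runReport`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${ACCESS_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        dateRanges: [{ startDate, endDate }], // e.g. "2025-01-01" .. "2025-01-01" for a one-day backfill slice
        dimensions: [{ name: "sessionSource" }, { name: "sessionMedium" }],
        metrics: [{ name: "sessions" }, { name: "totalUsers" }],
      }),
    },
  );
  return res.json(); // rows are then mapped to the matching BigQuery table schema
}

runTrafficSourcesReport("2025-01-01", "2025-01-01").then(console.log).catch(console.error);
```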
by koichi nagino
## Description

Ever wanted to share the beauty of space with a unique, artistic touch? This workflow automatically generates a stunning "space postcard" and shares it in your Slack channel. It fetches a random image from NASA's Astronomy Picture of the Day (APOD) archive, uses an AI to write a short, poetic message inspired by the image, and overlays this text directly onto the picture before posting.

## Who’s it for

- Space lovers who want a daily dose of cosmic beauty in their Slack channels.
- Community managers looking for engaging, automated content to keep their workspace active.
- Creative teams who appreciate the fusion of technology, art, and science.
- Anyone looking for a fun, impressive demonstration of AI-powered image manipulation.

## What it does / How it works

1. Fetches a Random Image: The workflow triggers and fetches a random image from the last 10 years of NASA's 'Astronomy Picture of the Day' (APOD) collection. It checks to ensure the content is an image, not a video (see the sketch below).
2. Engages AI for Creativity: It sends the image's title and official explanation to an AI model. The AI is prompted to generate a short, poetic message (under 50 characters) and to calculate the precise coordinates to place this text in the bottom-left corner of the image.
3. Creates the Postcard: The workflow then takes the poetic text and coordinates from the AI and dynamically writes the message onto the NASA image, creating a unique digital postcard.
4. Shares to Slack: Finally, it uploads the newly created postcard image to your specified Slack channel and posts a follow-up message.

## Requirements

- An n8n instance.
- A NASA API Key.
- An OpenAI API Key (or credentials for another compatible AI model).
- A Slack workspace and the permissions to connect an app.

## How to set up

1. NASA Credentials: Add your NASA API key credentials to the Get NASA APOD node.
2. OpenAI Credentials: Add your OpenAI API key credentials to the OpenAI Chat Model node.
3. Slack Configuration: Add your Slack credentials to both the Upload a file and Send a message nodes. IMPORTANT: In both Slack nodes, you must select the channel where you want the postcard to be posted.

## How to customize the workflow

- Automate It: Replace the Manual Trigger with a Schedule Trigger to have a new postcard sent automatically every day.
- Change the AI's Personality: Edit the prompt in the AI Agent node. You can ask it to generate a funny caption, a haiku, or place the text in a different corner—the possibilities are endless!
- Use a Different Platform: Swap the Slack nodes for Discord, Telegram, or Email nodes to share the postcard on your platform of choice.
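The random-image fetch and the image-vs-video check above map onto a single APOD API request. A minimal sketch; the API key is a placeholder (DEMO_KEY works for light testing), and note that count=1 picks from the whole archive, whereas the workflow narrows the pick to the last 10 years via its own date logic:

```typescript
// Sketch: fetch one random Astronomy Picture of the Day entry and keep only images.
// NASA_API_KEY is a placeholder ("DEMO_KEY" works for light testing).
const NASA_API_KEY = "DEMO_KEY";

interface ApodEntry {
  title: string;
  explanation: string;
  media_type: string; // "image" or "video"
  url: string;
}

async function randomApodImage(): Promise<ApodEntry> {
  // count=1 asks the APOD API for one randomly chosen entry from the archive.
  const res = await fetch(`https://api.nasa.gov/planetary/apod?api_key=${NASA_API_KEY}&count=1`);
  const [entry] = (await res.json()) as ApodEntry[];
  if (entry.media_type !== "image") {
    throw new Error("Picked a video entry; retry for an image, as the workflow does");
  }
  return entry; // title + explanation feed the AI prompt; url is the image to annotate
}

randomApodImage().then((e) => console.log(e.title, e.url)).catch(console.error);
```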