by ankitkansaldev
## TikTok Influencer Scraper (URL Input) via Bright Data + n8n & Sheets

A comprehensive n8n automation that scrapes TikTok influencer profiles using Bright Data's TikTok dataset and automatically saves detailed profile information to Google Sheets.

### Overview
This workflow provides an automated TikTok influencer data collection solution that scrapes comprehensive profile information and saves it to Google Sheets. Perfect for influencer marketing research, competitor analysis, social media monitoring, and marketing campaign planning.

### Key Features
- **Form-Based Input**: simple web form to submit TikTok profile URLs
- **Bright Data Integration**: uses Bright Data's TikTok dataset for reliable scraping
- **Status Monitoring**: intelligent polling system to check scraping progress
- **Retry Logic**: automatic retry mechanism with 30-second intervals
- **Data Extraction**: comprehensive profile data including engagement metrics
- **Google Sheets Storage**: automatic data storage and organization
- **Error Handling**: built-in error handling and status reporting
- **Custom Fields**: configurable output fields for specific data needs

### What This Workflow Does

**Input**
- **Profile URLs**: TikTok profile URLs submitted through the web form
- **Custom Fields**: configurable data fields for extraction
- **Country Settings**: geo-targeting for accurate data collection

**Processing**
1. **Form Submission**: user submits a TikTok profile URL through the web form
2. **API Trigger**: sends profile data to Bright Data for scraping (a request sketch appears at the end of this template description)
3. **Status Polling**: continuously checks scraping progress
4. **Wait & Retry**: implements 30-second delays between status checks
5. **Data Retrieval**: fetches complete profile data when ready
6. **Sheet Update**: saves extracted data to Google Sheets
7. **Status Reporting**: provides completion status and messages

**Output Data Points**

| Field | Description | Example |
|-------|-------------|---------|
| Account ID | Unique TikTok account identifier | @username123 |
| Nickname | Display name on profile | "John Doe" |
| Biography | Profile bio/description | "Content creator & influencer" |
| Followers | Number of followers | 1,250,000 |
| Following | Number of accounts followed | 500 |
| Likes | Total likes across all videos | 50,000,000 |
| Videos Count | Total number of videos posted | 1,200 |
| Profile URL | Direct link to TikTok profile | https://www.tiktok.com/@username |
| Profile Picture | Profile image URL | https://p16-sign-sg.tiktokcdn.com/... |
| Profile Picture HD | High-definition profile image | https://p16-sign-sg.tiktokcdn.com/... |
| Is Verified | Verification status | true/false |
| Bio Link | External link in bio | https://linktr.ee/username |
| Like Engagement Rate | Engagement rate based on likes | 5.2% |
| Comment Engagement Rate | Engagement rate based on comments | 2.1% |
| Top Videos | List of top-performing videos | [video_objects] |
| Region | Geographic region | "US" |
| Is Under Age 18 | Age status indicator | true/false |

### Setup Instructions

**Prerequisites**
- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with TikTok dataset access
- Valid TikTok profile URLs for testing
- 10-15 minutes for setup

**Step 1: Import the Workflow**
1. Copy the JSON workflow code from the provided file
2. In n8n: Workflows → + Add workflow → Import from JSON
3. Paste the JSON and click Import

**Step 2: Configure Bright Data**
1. Set up Bright Data credentials:
   - In n8n: Credentials → + Add credential → HTTP Request Generic Credential
   - Name: "Bright Data API"
   - Authentication: Bearer Token
   - Token: your Bright Data API key
   - Test the connection
2. Configure the dataset:
   - Ensure you have access to the TikTok dataset (gd_l1villgoiiidt09ci)
   - Verify dataset permissions in the Bright Data dashboard
   - Check dataset limits and pricing

**Step 3: Configure Google Sheets Integration**
1. Create a Google Sheet:
   - Go to Google Sheets and create a new spreadsheet named "TikTok Influencer Data"
   - Create a sheet tab named "TikTok profile by url"
   - Copy the Sheet ID from the URL: https://docs.google.com/spreadsheets/d/SHEET_ID_HERE/edit
2. Set up Google Sheets credentials:
   - In n8n: Credentials → + Add credential → Google Sheets OAuth2 API
   - Complete the OAuth setup and test the connection
3. Prepare your data sheet with columns:
   - Column A: Account ID
   - Column B: Nickname
   - Column C: Biography
   - Column D: Followers
   - Column E: Following
   - Column F: Likes
   - Column G: Videos Count
   - Column H: Profile URL
   - Column I: Is Verified
   - Column J: Bio Link
   - Column K: Like Engagement Rate
   - Column L: Comment Engagement Rate
   - Column M: Region
   - Column N: Status
   - Column O: Message

**Step 4: Update Workflow Settings**
1. Update API credentials:
   - Open the "Sends profile URLs to Bright Data to trigger scraping" node
   - Replace BRIGHT_DATA_API_KEY with your actual API key
   - Update the dataset ID if different
2. Update the Google Sheets node:
   - Open the "Google Sheets" node
   - Replace the document ID: 1OeqtCFm4Wek9DI5YFOWQXTpQJS-SJxC10iAPKEKkmiY
   - Select your Google Sheets credential and choose the correct sheet/tab name
3. Configure form settings:
   - Open the "Search by Profile URL" node
   - Customize the form title and field labels as needed
   - Note the webhook URL for form access

**Step 5: Test & Activate**
1. Add test profiles:
   - Access the form using the webhook URL
   - Submit 1-2 TikTok profile URLs for testing
   - Use full URLs (e.g., https://www.tiktok.com/@username)
2. Test the workflow:
   - Submit a test profile through the form
   - Monitor the execution in n8n
   - Verify data appears in the Google Sheet
   - Check for any error messages

### Usage Guide

**Submitting TikTok Profiles**
1. Navigate to your form URL (found in the Form Trigger node)
2. Enter a TikTok profile URL in the format https://www.tiktok.com/@username
3. Click Submit to start the scraping process
4. Wait for processing (typically 1-3 minutes)

**Understanding the Process**
The workflow follows this sequence:
1. Form Submission → profile URL captured
2. API Trigger → scraping job submitted to Bright Data
3. Status Polling → checks every 30 seconds whether data is ready
4. Data Retrieval → fetches complete profile information
5. Sheet Update → saves data to Google Sheets

**Monitoring Progress**
- Check n8n execution logs for real-time status
- The Bright Data dashboard shows scraping progress
- Google Sheets populates when the data is ready
- The Status column shows "ready" when complete

**Reading the Results**
Your Google Sheet will show:
- Complete TikTok profile information
- Engagement metrics and statistics
- Profile verification status
- Bio links and external connections
- Timestamp of data collection

### Customization Options

**Adding More Data Points**
Edit the JSON body in the "Sends profile URLs to Bright Data" node to include additional fields:

```
"custom_output_fields": [
  "account_id", "nickname", "biography", "followers", "following",
  "likes", "videos_count", "language", "creation_time", "last_post_time",
  "avg_video_duration", "hashtags_used", "music_used"
]
```

**Modifying Input Parameters**
- **Country targeting**: change the "country" field in the input
- **Search limits**: adjust the "limit_per_input" value
- **Discovery method**: modify the "discover_by" parameter
- **Error handling**: toggle the "include_errors" setting

**Batch Processing Multiple Profiles**
To process multiple profiles simultaneously:
- Modify the input array in the API call
- Add multiple profile URLs in a single request
- Implement loop logic for processing results
- Add rate limiting between requests

**Custom Form Fields**
Enhance the form with additional inputs. Open the "Search by Profile URL" node and add form fields for:
- Country selection
- Number of videos to analyze
- Specific date ranges
- Custom tags or categories

### Troubleshooting

**Common Issues & Solutions**
- **"Bright Data connection failed"**: Cause: invalid API credentials or dataset access. Solution: verify the API key in the Bright Data dashboard and check dataset permissions.
- **"Profile not found or private"**: Cause: invalid TikTok URL or private profile. Solution: verify the profile URL format and ensure the profile is public.
- **"Google Sheets permission denied"**: Cause: incorrect credentials or sheet permissions. Solution: re-authenticate Google Sheets and check sheet sharing settings.
- **"Scraping timeout"**: Cause: profile data taking too long to process. Solution: increase the wait time or use longer polling intervals.
- **"Invalid dataset ID"**: Cause: incorrect or expired dataset configuration. Solution: check the Bright Data dashboard for the correct dataset ID.
- **"Form submission failed"**: Cause: webhook configuration issues. Solution: verify the webhook URL and form trigger settings.

**Advanced Troubleshooting**
- **Check execution logs** in n8n for detailed error messages
- **Test individual nodes** by running them separately
- **Verify data formats**: ensure URLs are properly formatted
- **Monitor API limits**: check Bright Data usage quotas
- **Add error handling**: implement try-catch logic for robust operation

### Use Cases & Examples

**1. Influencer Marketing Research**
Goal: identify and analyze potential influencers for campaigns
- Research influencers in specific niches
- Analyze engagement rates and audience size
- Compare multiple influencers for campaign selection
- Track influencer growth over time

**2. Competitive Analysis**
Goal: monitor competitors' TikTok presence and performance
- Track competitor follower growth
- Analyze content strategies and engagement
- Monitor posting frequency and timing
- Identify trending content themes

**3. Social Media Monitoring**
Goal: track brand mentions and user-generated content
- Monitor branded hashtag usage
- Track brand advocates and micro-influencers
- Analyze sentiment and engagement patterns
- Identify trending topics in your industry

**4. Market Research Pipeline**
Goal: gather social media intelligence for business decisions
- Analyze target audience behavior
- Study content preferences and trends
- Generate reports for stakeholders
- Support marketing strategy development

### Advanced Configuration

**Rate Limiting and Performance**: to optimize for large-scale scraping:
- Adjust wait times between status checks
- Implement exponential backoff for retries
- Add batch processing for multiple profiles
- Monitor API usage to avoid limits

**Data Validation and Cleaning**: enhance data quality with validation:
- Add data type validation for numeric fields
- Implement URL format checking
- Clean and standardize text fields
- Add data completeness checks

**Integration with Business Tools**: connect the workflow to your existing systems:
- **CRM Integration**: update customer records with influencer data
- **Slack Notifications**: send alerts when new data is available
- **Database Storage**: store data in PostgreSQL/MySQL for analysis
- **BI Tools**: connect to Tableau/Power BI for visualization

**Webhook Integration**: for real-time updates:
- Add webhook triggers for immediate profile checks
- Integrate with external systems via webhooks
- Create API endpoints for programmatic access
- Implement authentication for secure access

### Performance & Limits

**Expected Performance**
- **Single Profile**: 30-60 seconds average processing time
- **Concurrent Requests**: 5-10 simultaneous (depends on your Bright Data plan)
- **Data Accuracy**: 95%+ for public TikTok profiles
- **Success Rate**: 90%+ for accessible profiles
- **Daily Capacity**: 100-1,000 profiles (depends on rate limits)

**Resource Usage**
- **Memory**: ~50 MB per execution
- **Storage**: minimal (data stored in Google Sheets)
- **API Calls**: 3-5 Bright Data calls per profile (including status checks)
- **Bandwidth**: ~1-2 MB per profile scraped
- **Execution Time**: 1-2 minutes per profile

**Scaling Considerations**
- **Rate Limiting**: add delays for high-volume scraping
- **Error Handling**: implement retry logic for failed requests
- **Data Validation**: add checks for malformed profile data
- **Monitoring**: track success/failure rates over time
- **Cost Optimization**: monitor API usage to control costs

### Support & Community

**Getting Help**
- **n8n Community Forum**: community.n8n.io
- **Documentation**: docs.n8n.io
- **Bright Data Support**: contact through your dashboard
- **GitHub Issues**: report bugs and feature requests

**Contributing**
- Share improvements with the community
- Report issues and suggest enhancements
- Create variations for specific use cases
- Document best practices and lessons learned

### Quick Setup Checklist

**Before You Start**
- n8n instance running (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with TikTok dataset access
- Valid TikTok profile URLs for testing
- 15 minutes for setup

**Setup Steps**
- Import Workflow: copy the JSON and import it into n8n
- Configure Bright Data: set up API credentials and test
- Create Google Sheet: new sheet with the proper column structure
- Set up Google Sheets credentials: OAuth setup and test
- Update workflow settings: replace the sheet ID and API keys
- Test with sample profiles: submit 1-2 URLs and verify results
- Activate workflow: enable the form trigger for production use

**Ready to Use!** Your form URL: https://your-n8n-instance.com/form/[webhook-id]

Happy TikTok Scraping! This workflow provides a solid foundation for automated TikTok influencer data collection. Customize it to fit your specific needs and use cases for influencer marketing, competitive analysis, and social media research.
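For reference, here is a minimal sketch of the trigger call made by the "Sends profile URLs to Bright Data" node, written as plain JavaScript. The endpoint path, query parameters, and payload shape follow Bright Data's dataset-trigger API as commonly documented; treat them, along with the placeholder API key, as assumptions to verify against your own account and the imported workflow.

```javascript
// Minimal sketch of the "trigger scraping" call, assuming Bright Data's
// dataset-trigger endpoint and the TikTok dataset ID from Step 2.
const BRIGHT_DATA_API_KEY = 'YOUR_BRIGHT_DATA_API_KEY'; // placeholder
const DATASET_ID = 'gd_l1villgoiiidt09ci';              // TikTok dataset from Step 2

async function triggerTikTokScrape(profileUrl, country = 'US') {
  const res = await fetch(
    `https://api.brightdata.com/datasets/v3/trigger?dataset_id=${DATASET_ID}&include_errors=true`,
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${BRIGHT_DATA_API_KEY}`,
        'Content-Type': 'application/json',
      },
      // One object per profile URL; add more objects to batch multiple profiles.
      body: JSON.stringify([{ url: profileUrl, country }]),
    }
  );
  const job = await res.json();
  // The returned job/snapshot id is what the status-polling nodes check every 30 seconds.
  return job;
}

// Usage: triggerTikTokScrape('https://www.tiktok.com/@username').then(console.log);
```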
by Vadym Nahornyi
This workflow automatically transcribes audio files, translates the content between languages, and generates natural-sounding speech from the translated text, all in one seamless process.

### Who's it for
Content creators, educators, and businesses needing to make their audio content accessible across language barriers. Perfect for translating podcasts, voice messages, lectures, or any audio content while preserving the spoken format.

### How it works
The workflow receives an audio file through a webhook, transcribes it using OpenAI's Whisper, translates and structures the text with GPT-4, generates new audio in the target language, and stores it in S3 for easy access. The entire process takes seconds and returns both the transcribed/translated text and a URL to the translated audio file.

### How to set up
1. **Configure OpenAI credentials**: add your OpenAI API key for Whisper transcription and GPT-4 translation
2. **Set up AWS S3**: create a bucket with public read permissions for audio storage
3. **Update configuration**: replace 'YOUR-BUCKET-NAME' with your actual S3 bucket name
4. **Activate webhook**: deploy and copy your webhook URL for receiving audio files

Send a POST request with:
- Binary audio file (as 'audiofile')
- Languages parameter (e.g., "English, Spanish")

A request sketch is shown at the end of this description.

### Requirements
- OpenAI API account with access to Whisper and GPT-4
- AWS account with an S3 bucket configured
- Basic understanding of webhooks and API requests

### How to customize
- **Add language detection**: automatically detect the source language if not specified
- **Customize voice settings**: adjust speech speed, pitch, or select different voices
- **Add file validation**: implement size limits and format checks
- **Enhance security**: add webhook authentication and rate limiting
- **Extend functionality**: add subtitle generation or multiple output formats
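A minimal sketch of calling the workflow's webhook, assuming a Node 18+ environment and a placeholder webhook path; the field names ('audiofile', 'languages') come from the description above.

```javascript
// Send an audio file plus a languages parameter to the n8n webhook.
import fs from 'node:fs';

const WEBHOOK_URL = 'https://your-n8n-instance.com/webhook/translate-audio'; // placeholder path

const form = new FormData();
form.append(
  'audiofile',
  new Blob([fs.readFileSync('message.mp3')], { type: 'audio/mpeg' }),
  'message.mp3'
);
form.append('languages', 'English, Spanish'); // source and target languages

const res = await fetch(WEBHOOK_URL, { method: 'POST', body: form });
// Expected response: the transcribed/translated text and the S3 URL of the generated audio.
console.log(await res.json());
```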
by Vincent Belmehel
### Purpose
This workflow automatically creates a subscriber in a given Beehiiv publication when a new opt-in is registered in a given Systeme.io sales funnel.

Good to know: the integration with Systeme.io is done at the sales funnel level, not at the account level. If you have several sales funnels, you can use the same workflow several times.

### Quick Setup
1. Configure your sales funnel in Systeme.io to create and trigger a webhook after an opt-in
2. Open the "On New Systeme.io Optin" node to find the webhook URL needed to configure your sales funnel on Systeme.io
3. Configure the "Configure Workflow" node:
   - Add your Beehiiv publication ID
   - If you know the subscriber's first and last name and want to send it to Beehiiv, configure the custom field names for first and last name
   - Add one or more email addresses to which to send alert notifications in the event of a problem (separated by commas)
4. If you have not already done so:
   - Connect your Beehiiv account in the "Create New Beehiiv Subscriber" node
   - Connect your Gmail account in the "Send Email Alert (Beehiiv API error)" node

### How It Works
- As soon as a new opt-in is registered on your sales funnel, Systeme.io triggers the workflow (via a webhook)
- Only requests actually coming from Systeme.io are considered (their IP addresses are whitelisted for security reasons; see the sketch at the end of this description)
- A new subscriber is added to your Beehiiv publication (via an API call)
- If available in Systeme.io, UTM tags (utm_source, utm_medium, and utm_campaign) are transferred to Beehiiv to correctly track where your subscribers are coming from
- If an error occurs during the Beehiiv API call, an alert notification is sent to you (via email)

### Requirements
- A Systeme.io account
- A Beehiiv account with an active publication
- A Gmail account

### Benefits
- Automate and scale your email marketing efforts seamlessly
- No more manual tasks to keep your subscriber list always up to date
- Focus on creating a newsletter that stands out, not on the technical side

### Check Out My Other Templates
https://n8n.io/creators/belmehel/
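For reference, a minimal sketch of the IP-whitelisting step mentioned under "How It Works", written as an n8n Code node; the IP addresses below are placeholders, not Systeme.io's actual ranges.

```javascript
// Reject webhook calls that do not come from the expected sender.
const ALLOWED_IPS = ['203.0.113.10', '203.0.113.11']; // hypothetical placeholder values

// The Webhook node exposes the request headers on the incoming item.
const headers = $input.first().json.headers ?? {};
const sourceIp = (headers['x-forwarded-for'] || headers['x-real-ip'] || '')
  .split(',')[0]
  .trim();

if (!ALLOWED_IPS.includes(sourceIp)) {
  // Stop the workflow for requests not coming from Systeme.io.
  throw new Error(`Rejected webhook call from unexpected IP: ${sourceIp}`);
}

return $input.all(); // pass the opt-in payload through to the Beehiiv step
```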
by Dvir Sharon
## Publish Latest News on X and Other Social Media Platforms Using Keyword

A comprehensive n8n automation that fetches the latest news based on keywords, generates AI-powered social media content, and automatically publishes to X (Twitter) with complete tracking and notification systems.

### Overview
This workflow provides a professional news publishing solution that automatically discovers breaking news, creates engaging social media content using AI, and publishes to X (Twitter) with comprehensive tracking. Perfect for news organizations, content creators, social media managers, and businesses wanting to stay current with automated news sharing. The system uses BrightData's Google News dataset, OpenAI's GPT-4o for content generation, and multi-platform integration for complete automation.

### Key Features
- **Form-Based Input**: clean web form for keyword and country submission
- **Real-Time News Fetching**: BrightData Google News integration for the latest articles
- **AI Content Generation**: GPT-4o powered tweet creation with hashtags
- **Auto X Publishing**: direct posting to X (Twitter) with URL tracking
- **Complete Tracking**: Google Sheets logging of all published content
- **Email Notifications**: success alerts with tweet links
- **Multi-Country Support**: localized news for the US, India, the UK, and Australia
- **Status Monitoring**: real-time progress tracking with retry logic
- **Error Handling**: robust error management and validation
- **Loop Management**: intelligent waiting for news processing completion

### What This Workflow Does

**Input**
- **News Name**: keyword or topic for the news search (required)
- **Country**: target country for localized news (dropdown: US/IN/GB/AU)

**Processing**
1. **Form Submission**: captures the news keyword and target country
2. **News Triggering**: initiates the BrightData Google News scraping job
3. **Status Monitoring**: checks scraping progress with an intelligent retry loop
4. **Data Retrieval**: fetches the latest news articles when ready
5. **AI Content Creation**: generates engaging tweet content using GPT-4o
6. **Social Publishing**: posts the content to X (Twitter) automatically
7. **URL Generation**: creates direct tweet links for tracking
8. **Data Logging**: saves content and URLs to Google Sheets
9. **Email Notification**: sends a success confirmation with the tweet link
10. **Completion**: the workflow ends with a full audit trail

### Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| TweetMessage | AI-generated social media content | "Breaking: AI revolution transforming healthcare with 40% efficiency gains. New study shows promising results in patient care automation. #AI #Healthcare #Innovation #TechNews #US" |
| TweetURL | Direct link to the published tweet | https://twitter.com/i/web/status/1234567890123456789 |

### Setup Instructions

**Prerequisites**
- n8n instance (self-hosted or cloud)
- X (Twitter) account with API v2 access
- OpenAI account with GPT-4o access
- Gmail account for notifications
- Google account with Sheets access
- BrightData account with Google News dataset access
- Basic understanding of social media automation

**Step 1: Import the Workflow**
1. Copy the JSON workflow code from the provided file.
2. In n8n, click "+ Add workflow".
3. Select "Import from JSON".
4. Paste the workflow code and click "Import".
5. The workflow will appear with all nodes properly connected.

**Step 2: Configure API Credentials**

X (Twitter) API setup:
1. Create an X Developer Account at developer.twitter.com.
2. Create a new app and generate API keys.
3. In n8n: Credentials → + Add credential → Twitter OAuth2 API.
4. Add your Twitter API credentials: API Key, API Secret Key, Bearer Token, Access Token, Access Token Secret.
5. Test the connection with a sample tweet.

OpenAI API configuration:
1. Get an API key from platform.openai.com.
2. Ensure GPT-4o model access is available.
3. In n8n: Credentials → + Add credential → OpenAI API.
4. Add your OpenAI API key.
5. Verify model access in the "OpenAI Chat Model" node.

Gmail integration:
1. Create a "Gmail OAuth2" credential.
2. Follow the OAuth setup process.
3. Grant email sending permissions.
4. Test with a sample email.

BrightData News API:
1. The workflow uses the pre-configured token: 5662edde-6735-4c5d-a6c6-693043a5a9a5.
2. Dataset ID: gd_lnsxoxzi1omrwnka5r (Google News).
3. Verify access to the Google News dataset.
4. Test the API connection.

Google Sheets integration:
1. Create a "Google Sheets OAuth2 API" credential.
2. Complete OAuth authentication.
3. Grant read/write permissions.
4. Test the connection.

**Step 3: Configure Google Sheets Integration**

Create the Google Sheets structure:
- Sheet name: "Publish Latest News on Social Media Platforms Using Keyword"
- Tab: "Data" (default)
- Columns:
  - Tweet Message: AI-generated content posted to X
  - Tweet URL: direct link to the published tweet

Sheet configuration:
1. Create a new Google Sheet or use an existing one.
2. Add the required column headers.
3. Copy the Sheet ID from the URL: https://docs.google.com/spreadsheets/d/SHEET_ID_HERE/edit.
4. Currently configured Sheet ID: 1koxNrwdeuaSBdREuKc7JQh3d9blEk0sQDJ8VgVLjPOo.

Update workflow settings:
1. Open the "Google Sheets" node.
2. Replace the Document ID with your Sheet ID.
3. Select your Google Sheets credential.
4. Choose the "Data" sheet/tab.
5. Verify the column mapping is correct.

**Step 4: Configure Form Interface**

Form settings:
1. Open the "On form submission" node.
2. Form configuration:
   - Title: "News Publisher"
   - Description: "publish latest news to direct social media"
   - Fields: News Name (text, required), Country (dropdown: US, IN, GB, AU, required)
3. Copy the webhook URL from the form trigger node. Current webhook ID: 8d320705-688c-4150-a393-cf899d2bbb52.
4. Test form accessibility and submission.

**Step 5: Configure Email Notifications**

Gmail setup:
1. Open the "Gmail" node.
2. Update the recipient email: raushan@iwantonlinemarketing.com.
3. The email template includes: success confirmation, direct tweet link, professional formatting.
4. Test email delivery.

**Step 6: Test the Workflow**

Sample test data:

| News Name | Country | Expected Results |
|-----------|---------|------------------|
| artificial intelligence | US | Latest AI news with US-specific hashtags |
| cricket world cup | IN | Sports news with India-focused content |
| brexit update | GB | UK political news with British hashtags |
| bushfire news | AU | Australian environmental news |

Testing process:
1. Activate the workflow (toggle switch).
2. Navigate to the webhook form URL.
3. Submit test data.
4. Monitor execution progress: news fetching (30-60 seconds), AI content generation (10-15 seconds), X publishing (5-10 seconds), sheet update and email (5 seconds).
5. Verify results on all platforms.

### Usage Guide

**Using the Form Interface**
1. Navigate to the webhook URL provided by the form trigger.
2. Enter a news keyword or topic (e.g., "climate change", "stock market", "technology").
3. Select the target country from the dropdown.
4. Click submit and wait for processing.
5. Check email for the success notification with the tweet link.

**Example Inputs to Test**

| News Name | Country | Expected |
|-----------|---------|----------|
| "artificial intelligence breakthrough" | "US" | Latest AI developments with tech hashtags |
| "football premier league" | "GB" | UK football news with sports hashtags |
| "stock market updates" | "IN" | Indian market news with finance hashtags |
| "hollywood movies" | "AU" | Entertainment news with Australian perspective |

**Country-Specific Considerations**
- **United States (US)**: focus on national news and global impact. Hashtags: #USA, #American, #Breaking, #News. Time zone considerations for optimal posting.
- **India (IN)**: emphasis on regional relevance. Hashtags: #India, #Indian, #News, #Breaking. Cultural context in content generation.
- **United Kingdom (GB)**: British perspective and terminology. Hashtags: #UK, #British, #News, #Breaking. Focus on European context.
- **Australia (AU)**: Australian angle and regional focus. Hashtags: #Australia, #Australian, #News, #Breaking. Pacific region context.

### Reading the Results

**Google Sheets data**: the output sheet contains:
- Complete tweet content with hashtags and formatting
- Direct tweet URLs for easy access and sharing
- A chronological record of all published content
- An audit trail for content management

**Email notifications**: success emails include:
- Confirmation that the content was published
- A direct link to view the tweet
- Professional formatting for easy reference

**X (Twitter) posts**: published content features:
- AI-optimized messaging within the 260-character limit
- Relevant hashtags based on topic and country
- An engaging format designed for social media
- A professional tone suitable for news sharing

### Customization Options

**Expanding Social Media Platforms**

Add more platforms to the publishing workflow:

```
// Add LinkedIn publishing
{
  "node": "LinkedIn",
  "type": "n8n-nodes-base.linkedin",
  "parameters": {
    "text": "={{ $json.output }}",
    "additionalFields": {}
  }
}

// Add Facebook posting
{
  "node": "Facebook",
  "type": "n8n-nodes-base.facebook",
  "parameters": {
    "pageId": "YOUR_PAGE_ID",
    "message": "={{ $json.output }}"
  }
}
```
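Beyond adding platforms, a small post-processing step can enforce the 260-character budget and country hashtags described under "Reading the Results". Below is a hedged sketch written as an n8n Code node; the field names it reads ("output", "Country") are assumptions based on the node examples above.

```javascript
// Trim the AI-generated tweet to the character budget and append country hashtags.
const COUNTRY_TAGS = {
  US: '#USA #Breaking #News',
  IN: '#India #News #Breaking',
  GB: '#UK #British #News',
  AU: '#Australia #News #Breaking',
};

return $input.all().map((item) => {
  const country = item.json.Country || 'US';
  const tags = COUNTRY_TAGS[country] ?? '#News';
  const budget = 260 - (tags.length + 1); // leave room for a space plus hashtags

  let text = (item.json.output || '').trim();
  if (text.length > budget) {
    text = text.slice(0, budget - 1).trimEnd() + '…';
  }

  item.json.TweetMessage = `${text} ${tags}`;
  return item;
});
```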
by Ajith joseph
## Create a Telegram Bot with Mistral AI and Conversation Memory

A sophisticated Telegram bot that provides AI-powered responses with conversation memory. This template demonstrates how to integrate any AI API service with Telegram, making it easy to swap between different AI providers like OpenAI, Anthropic, Google AI, or any other API-based AI model.

### How it works
The workflow creates an intelligent Telegram bot that:
- Maintains conversation history for each user
- Provides contextual AI responses using any AI API service
- Handles different message types and commands
- Manages chat sessions with clear functionality
- Is easily adaptable to any AI provider (OpenAI, Anthropic, Google AI, etc.)

### Set up steps

**Prerequisites**
- Telegram Bot Token (from @BotFather)
- AI API key (from any AI service provider)
- n8n instance with webhook capability

**Configuration Steps**
1. **Create a Telegram bot**
   - Message @BotFather on Telegram
   - Create a new bot with the /newbot command
   - Save the bot token for credentials setup
2. **Choose your AI provider**
   - OpenAI: get an API key from the OpenAI platform
   - Anthropic: sign up for Claude API access
   - Google AI: get a Gemini API key
   - NVIDIA: access LLaMA models
   - Hugging Face: use the inference API
   - Any other AI API service
3. **Set up credentials in n8n**
   - Add Telegram API credentials with your bot token
   - Add Bearer Auth/API Key credentials for your chosen AI service
   - Test both connections
4. **Deploy the workflow**
   - Import the workflow JSON
   - Customize the AI API call (see the customization section)
   - Activate the workflow
   - Set the webhook URL in the Telegram bot settings

### Features

**Core Functionality**
- **Smart Message Routing**: automatically categorizes incoming messages (commands, text, non-text)
- **Conversation Memory**: maintains chat history for each user (last 10 messages)
- **AI-Powered Responses**: integrates with any AI API service for intelligent replies
- **Command Support**: built-in /start and /clear commands

**Message Types Handled**
- **Text messages**: processed through the AI model with context
- **Commands**: special handling for bot commands
- **Non-text messages**: polite error message for unsupported content

**Memory Management**
- User-specific chat history storage
- Automatic history trimming (keeps the last 10 messages)
- Global state management across workflow executions

**Bot Commands**
- /start - welcome message with bot introduction
- /clear - clears conversation history for a fresh start
- Regular text - processed by the AI with conversation context

### Technical Details

**Workflow Structure**
1. Telegram Trigger - receives all incoming messages
2. Message Filtering - routes messages based on type/content
3. History Management - maintains conversation context
4. AI Processing - generates intelligent responses
5. Response Delivery - sends formatted replies back to the user

**AI API Integration (Customizable)**

Current example (NVIDIA):
- Model: mistralai/mistral-nemotron
- Temperature: 0.6 (balanced creativity)
- Max tokens: 4096
- Response limit: under 200 words

Easy to replace with any AI service.

OpenAI example:
```
{
  "model": "gpt-4",
  "messages": [...],
  "temperature": 0.7,
  "max_tokens": 1000
}
```

Anthropic Claude example:
```
{
  "model": "claude-3-sonnet-20240229",
  "messages": [...],
  "max_tokens": 1000
}
```

Google Gemini example:
```
{
  "contents": [...],
  "generationConfig": {
    "temperature": 0.7,
    "maxOutputTokens": 1000
  }
}
```

**Error Handling**
- Non-text message detection and appropriate responses
- API failure handling
- Invalid command processing

### Customization Options

**AI Provider Switching**
To use a different AI service, modify the "NVIDIA LLaMA Chat Model" node:
1. Change the URL in the HTTP Request node
2. Update the request body format in the "Prepare API Request" node
3. Update the authentication method if needed
4. Adjust response parsing in the "Save AI Response to History" node

**AI Behavior**
- Modify the system prompt in the "Prepare API Request" node
- Adjust temperature and response parameters
- Change response length limits
- Customize model-specific parameters

**Memory Settings**
- Adjust the history length (currently 10 messages); a sketch of this logic is shown at the end of this description
- Modify the user identification logic
- Customize the data persistence approach

**Bot Personality**
- Update the welcome message content
- Customize error messages and responses
- Add new command handlers

### Use Cases
- **Customer Support**: automated first-line support with context awareness
- **Educational Assistant**: homework help and learning support
- **Personal AI Companion**: general conversation and assistance
- **Business Assistant**: FAQ handling and information retrieval
- **AI API Testing**: perfect template for testing different AI services
- **Prototype Development**: quick AI chatbot prototyping

### Notes
- Requires an active n8n instance for webhook handling
- AI API usage may have rate limits and costs (varies by provider)
- Bot memory persists across workflow restarts
- Supports multiple concurrent users with separate histories
- The template is provider-agnostic: easily switch between AI services
- Perfect starting point for any AI-powered Telegram bot project

### Popular AI Services You Can Use

| Provider | Model Examples | API Endpoint Style |
|----------|----------------|--------------------|
| OpenAI | GPT-4, GPT-3.5 | https://api.openai.com/v1/chat/completions |
| Anthropic | Claude 3 Opus, Sonnet | https://api.anthropic.com/v1/messages |
| Google | Gemini Pro, Gemini Flash | https://generativelanguage.googleapis.com/v1beta/models/ |
| NVIDIA | LLaMA, Mistral | https://integrate.api.nvidia.com/v1/chat/completions |
| Hugging Face | Various OSS models | https://api-inference.huggingface.co/models/ |
| Cohere | Command, Generate | https://api.cohere.ai/v1/generate |

Simply replace the HTTP Request node configuration to switch providers!
by David Olusola
## AI Image Editor with Form Upload + Telegram Delivery

### Who's it for
This workflow is built for content creators, social media managers, designers, and agencies who need fast, AI-powered image editing without the hassle. Whether you're batch-editing for clients or spicing up personal projects, this tool gets it done effortlessly.

### What it does
A seamless pipeline that:
- Accepts uploads and prompts via a clean form
- Saves images to Google Drive automatically
- Edits images with OpenAI's image API
- Converts results to downloadable PNGs
- Delivers the final image instantly via Telegram

Perfect for AI-enhanced workflows that need speed, structure, and simplicity.

### How it works
1. **User uploads**: fill a form with an image and an editing prompt
2. **Cloud save**: auto-upload to your Google Drive folder
3. **AI editing**: OpenAI processes the image with your prompt
4. **Convert & format**: the image is saved as a PNG
5. **Telegram delivery**: the final result is sent straight to your chat

### You'll need
- OpenAI API key
- Google Drive OAuth2 setup
- Telegram bot token and chat ID
- n8n instance (self-hosted or cloud)

### Setup in 4 easy steps
1. **Connect APIs**
   - Add OpenAI, Google Drive, and Telegram credentials to n8n
   - Store keys securely (avoid hardcoding!)
2. **Configure settings**
   - Set the Google Drive folder ID
   - Add the Telegram chat ID
   - Tweak the image size (default: 1024x1024)
3. **Deploy the form**
   - Add a Webhook Trigger node
   - Test with a sample image
   - Share the form link with users
4. **Fine-tune variables**
   - In the Set node, customize: image size, folder path, delivery options, timeout duration (a sketch of these values follows below)

### Want to customize more?

**Image settings**
- Change the size (e.g., 512x512 or 2048x2048)
- Update the model (when new versions drop)

**Storage**
- Auto-organize files by date/category
- Add dynamic file names using n8n expressions

**Delivery**
- Swap Telegram for Slack, email, or Discord
- Add multiple delivery channels
- Include the image prompt or metadata in messages

**Form upgrades**
- Add fields for advanced editing
- Validate file types (e.g., PNG/JPEG only)
- Show a progress bar for long edits

**Advanced features**
- Add error handling or retry flows
- Support batch editing
- Include approvals or watermarking before delivery

### Notes & best practices
- Check your OpenAI credit balance
- Test with different image sizes/types
- Adjust timeout settings for larger files
- Always secure your API keys
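As a reference for step 4, here is a minimal sketch of the tunable values, written as a Code-node equivalent of the Set node; every value is a placeholder to replace with your own.

```javascript
// Central place for the workflow's configurable values.
const config = {
  imageSize: '1024x1024',                 // default edit size; e.g. 512x512 or 2048x2048
  driveFolderId: 'YOUR_DRIVE_FOLDER_ID',  // Google Drive folder for uploaded originals
  telegramChatId: 'YOUR_CHAT_ID',         // where the finished PNG is delivered
  timeoutMs: 120000,                      // give larger files more time to process
};

// Merge the configuration onto each incoming item for downstream nodes.
return $input.all().map((item) => ({ json: { ...item.json, ...config } }));
```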
by Don Jayamaha Jr
A medium-term trend analyzer for the Binance Spot Market that leverages core technical indicators across 4-hour candle data to provide human-readable swing-trade signals via AI.

### What It Does
- Accepts a Binance trading pair (e.g., AVAXUSDT)
- Sends the symbol to an internal webhook for technical indicator calculation
- Computes 4h RSI, MACD, Bollinger Bands, SMA, EMA, and ADX
- Returns structured, GPT-analyzed signals ready for Telegram delivery

### AI Agent Details
- **Model**: GPT-4.1-mini (OpenAI Chat)
- **Agent Role**: translates raw indicator values into sentiment-labeled signals
- **Memory**: tracks session and symbol context for cleaner multi-turn logic

### Required Backend Workflow
To calculate indicators, this tool depends on:

POST https://treasurium.app.n8n.cloud/webhook/4h-indicators
{ "symbol": "AVAXUSDT" }

It returns a JSON object with the latest 40x4h candle-based calculations (a call sketch is shown at the end of this description).

### Input Format
{ "message": "AVAXUSDT", "sessionId": "telegram_chat_id" }

### Sample Output
4h Technical Signals - AVAXUSDT
- RSI: 64 - Slightly Bullish
- MACD: Bullish cross above baseline
- BB: upper band touch - volatility expanding
- EMA > SMA - confirmed upside momentum
- ADX: 31 - strengthening trend

### Use Case Scenarios

| Use Case | Result |
|----------|--------|
| Swing trend confirmation | Uses 4h indicators to validate or reject setups |
| Breakout signal confluence | Helps assess whether momentum is real or noise |
| Inputs to Quant AI or Analyst | Supports higher-frame trade recommendation synthesis |

### Setup Instructions
1. Import the JSON template into your n8n workspace.
2. Set your OpenAI API credentials for the GPT node.
3. Ensure the /webhook/4h-indicators backend tool is live and accessible.
4. Connect this to your Binance Financial Analyst Tool or master Quant AI orchestrator.

### Parent Workflows That Use This Tool
- Binance SM Financial Analyst Tool
- Binance Spot Market Quant AI Agent

### Sticky Notes & Annotations
This workflow includes internal sticky notes describing:
- Node roles (GPT, webhook, memory)
- System behavior (reasoning agent logic)
- Telegram formatting guidance

### Licensing & Attribution
© 2025 Treasurium Capital Limited Company. All architecture, prompt logic, and signal formatting are proprietary. Redistribution or rebranding is prohibited.

Connect with the creator: Don Jayamaha - LinkedIn
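A minimal sketch of calling the backend indicator webhook listed above; the endpoint and request body come from this description, while the error handling and use of the response are illustrative.

```javascript
// Fetch the latest 4h indicator set for a Binance symbol.
async function get4hIndicators(symbol) {
  const res = await fetch('https://treasurium.app.n8n.cloud/webhook/4h-indicators', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ symbol }),
  });
  if (!res.ok) throw new Error(`Indicator webhook failed: ${res.status}`);
  return res.json(); // latest 40x4h candle-based RSI/MACD/BB/SMA/EMA/ADX values
}

// Usage: feed the result into the GPT agent that labels each indicator.
get4hIndicators('AVAXUSDT').then((indicators) => console.log(indicators));
```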
by Agent Studio
### Overview
This workflow allows you to trigger custom logic in n8n directly from Retell's Voice Agent using Custom Functions. It captures a POST webhook from Retell every time a Voice Agent reaches a Custom Function node. You can plug in any logic: call an external API, book a meeting, update a CRM, or even return a dynamic response back to the agent.

### Who is it for
For builders using Retell who want to extend Voice Agent functionality with real-time custom workflows or AI-generated responses.

### Prerequisites
- A Retell AI account
- A Retell agent with a Custom Function node in its conversation flow (see the template below)
- Your n8n webhook URL set in the Custom Function configuration (see "How to use it" below)
- (Optional) Familiarity with Retell's Custom Function docs
- A conversation started with the agent (text or voice)

### Retell Agent Example
To get you started, we've prepared a Retell agent ready to be imported that includes the call to this template.
- Import the agent to your Retell workspace (top-right button on your agent's page)
- You will need to modify the function URL in order to call your own instance
- This template is a simple hotel agent that calls the custom function to confirm a booking, passing basic formatted data

### How it works
- Retell sends a webhook to n8n whenever a Custom Function is triggered during a call (or test chat)
- The webhook includes the full call context (transcript, call ID, etc.) and the parameters defined in the Retell function node
- You can process this data and return a response string back to the Voice Agent in real time (see the sketch at the end of this description)

### How to use it
1. Copy the webhook URL (e.g., https://your-instance.app.n8n.cloud/webhook/hotel-retell-template)
2. Modify the Retell Custom Function webhook URL (see the template description for screenshots):
   - Edit the function
   - Modify the URL
3. Modify the logic in the Set node or replace it with your own custom flow
4. Deploy and test: Retell will hit your n8n workflow during the conversation

### Extension Ideas
- Call a third-party API to fetch data (e.g., hotel availability, CRM records)
- Use an LLM node to generate dynamic responses
- Trigger a parallel automation (Slack message, calendar invite, etc.)

Reach out to us if you're interested in analyzing your Retell Agent conversations.
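To illustrate the "return a response string" step, here is a hedged sketch of an n8n Code node that reads the incoming Retell payload and builds a confirmation string for a Respond to Webhook node. The parameter names (guest_name, room_type) are hypothetical, and the exact request/response schema should be taken from Retell's Custom Function docs.

```javascript
// Read the call context and function arguments sent by Retell.
const body = $input.first().json.body ?? {};
const args = body.args ?? {};          // parameters defined in the Retell function node
const callId = body.call?.call_id;     // part of the call context Retell sends

const confirmation =
  `Booking confirmed for ${args.guest_name ?? 'the guest'} ` +
  `in a ${args.room_type ?? 'standard'} room.`;

// Return the string so a "Respond to Webhook" node can send it back to the Voice Agent.
return [{ json: { callId, response: confirmation } }];
```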
by Raz Hadas
This n8n template demonstrates how to automate stock market technical analysis to detect key trading signals and send real-time alerts to Discord. It's built to monitor for the Golden Cross (a bullish signal) and the Death Cross (a bearish signal) using simple moving averages.

Use cases are many: automate your personal trading strategy, monitor a portfolio for significant trend changes, or provide automated analysis highlights for a trading community or client group.

### Good to know
- This template relies on the Alpha Vantage API, which has a free tier with usage limits (e.g., API calls per minute and per day). Be mindful of these limits, especially if monitoring many tickers.
- The data provided by free APIs may have a slight delay and is intended for informational and analysis purposes.
- **Disclaimer**: this workflow is an informational tool and does not constitute financial advice. Always do your own research before making any investment decisions.

### How it works
- The workflow triggers automatically every weekday at 5 PM, after the typical market close.
- It fetches a list of user-defined stock tickers from the Set node.
- For each stock, it gets the latest daily price data from Alpha Vantage via an HTTP Request and stores the new data in a PostgreSQL database to maintain a history.
- The workflow then queries the database for the last 121 days of data for each stock.
- A Code node calculates two Simple Moving Averages (SMAs): a short-term (60-day) and a long-term (120-day) average, for both today and the previous day (see the sketch at the end of this description).
- Using If nodes, it compares the SMAs to see whether a Golden Cross (short-term crosses above long-term) or a Death Cross (short-term crosses below long-term) has just occurred.
- Finally, a formatted alert message is sent to a specified Discord channel via a webhook.

### How to use
- Configure your credentials for PostgreSQL and select them in the two database nodes.
- Get a free Alpha Vantage API key and add it to the "Fetch Daily History" node. For best practice, create a Header Auth credential for it.
- Paste your Discord webhook URL into the final "HTTP Request" node.
- Update the list of stock symbols in the "Set - Ticker List" node to monitor the assets you care about.
- The workflow runs on a schedule, but you can press "Test workflow" to trigger it manually at any time.

### Requirements
- An Alpha Vantage account for an API key
- A PostgreSQL database to store historical price data
- A Discord account and a server where you can create a webhook

### Customising this workflow
- Easily change the moving average periods (e.g., from 60/120 to 50/200) by adjusting the SMA_SHORT and SMA_LONG variables in the "Compute 60/120 SMAs" Code node.
- Modify the alert messages in the "Set - Golden Cross Msg" and "Set - Death Cross Msg" nodes.
- Swap out Discord for another notification service like Slack or Telegram by replacing the final HTTP Request node.
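A minimal sketch of the SMA computation and cross detection performed by the Code and If nodes, under the assumption that the incoming items are daily closes sorted oldest to newest; field names may differ from the actual "Compute 60/120 SMAs" node.

```javascript
// Compute short/long SMAs for today and yesterday, then flag a fresh cross.
const SMA_SHORT = 60;
const SMA_LONG = 120;

const closes = $input.all().map((item) => Number(item.json.close));

// Average of the last `period` values, optionally shifted back by `offsetFromEnd` days.
const sma = (values, period, offsetFromEnd = 0) => {
  const end = values.length - offsetFromEnd;
  const window = values.slice(end - period, end);
  return window.reduce((sum, v) => sum + v, 0) / period;
};

const shortToday = sma(closes, SMA_SHORT);
const longToday = sma(closes, SMA_LONG);
const shortYesterday = sma(closes, SMA_SHORT, 1);
const longYesterday = sma(closes, SMA_LONG, 1);

return [{
  json: {
    shortToday, longToday, shortYesterday, longYesterday,
    goldenCross: shortYesterday <= longYesterday && shortToday > longToday,
    deathCross: shortYesterday >= longYesterday && shortToday < longToday,
  },
}];
```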
by L Hùng
### Pre-Conditions
- A Facebook Developer account with an active app
- Basic understanding of n8n workflows
- Access to a database (optional, for storing tokens)

### Setup
1. **Webhook activation**: configure the webhook to receive user requests and process input data. Ensure the webhook URL is correctly set in your Facebook app settings.
2. **Short-lived token retrieval**: use Facebook OAuth to fetch a short-lived token from the authorization code.
3. **Long-lived token conversion**: convert the short-lived token into a long-lived token (valid for ~60 days); a sketch of this call is shown at the end of this description.
4. **Page token retrieval**: follow the provided instructions to retrieve Page Tokens for posting on managed Facebook Pages.
5. **Customizable scopes**: edit the correctScopes array to include or exclude permissions as needed.
6. **Optional database storage**: extend the workflow to save tokens to a database instead of displaying them on-screen.
7. **Step-by-step instructions**: detailed guidance is provided via sticky notes for activating the app, configuring the webhook, and editing parameters like fb_redirect_uri, app_id, and app_secret.

### Who the Template is For
- **Developers**: integrating Facebook APIs into their applications
- **Social media managers**: automating posting and engagement on Facebook Pages
- **n8n users**: looking for a ready-to-use workflow for Facebook token management

### Primary Use
- Automates Facebook token retrieval and management
- Supports posting to Facebook Pages via Page Tokens
- Easily customizable and extendable for specific requirements
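For step 3, a minimal sketch of the long-lived token exchange, assuming the standard fb_exchange_token grant on the Graph API; the API version and credential values are placeholders that should come from the workflow's app_id and app_secret parameters.

```javascript
// Exchange a short-lived user token for a long-lived one (~60 days).
async function getLongLivedToken(shortLivedToken, appId, appSecret) {
  const url = new URL('https://graph.facebook.com/v19.0/oauth/access_token');
  url.searchParams.set('grant_type', 'fb_exchange_token');
  url.searchParams.set('client_id', appId);
  url.searchParams.set('client_secret', appSecret);
  url.searchParams.set('fb_exchange_token', shortLivedToken);

  const res = await fetch(url);
  const data = await res.json();
  // data.access_token is the long-lived token referenced in the setup steps.
  return data.access_token;
}
```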
by Ferenc Erb
### Use Case
Automate chat interactions in Bitrix24 with a customizable bot that can handle various events and respond to user messages.

### What This Workflow Does
- Processes incoming webhook requests from Bitrix24
- Handles authentication and token validation
- Routes different event types (messages, joins, installations)
- Provides automated responses and bot registration
- Manages secure communication between Bitrix24 and external services

### Setup Instructions
1. Configure Bitrix24 webhook endpoints
2. Set up authentication credentials
3. Customize bot responses and behavior
4. Deploy and test the workflow
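A minimal sketch of the event routing described above, written as an n8n Code node; the Bitrix24 event names used here are assumptions based on Bitrix24's IM-bot events and should be matched to whatever your app actually subscribes to.

```javascript
// Route incoming Bitrix24 events to the right branch of the workflow.
const event = ($input.first().json.body?.event || '').toUpperCase();

let route;
switch (event) {
  case 'ONIMBOTMESSAGEADD': // a user sent the bot a message
    route = 'message';
    break;
  case 'ONIMBOTJOINCHAT':   // the bot was added to a chat
    route = 'join';
    break;
  case 'ONAPPINSTALL':      // the app/bot was installed, so register the bot
    route = 'install';
    break;
  default:
    route = 'ignore';
}

return [{ json: { route, event } }];
```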
by Joseph
This workflow automates invoice generation from form submissions, ensuring unique order IDs, creating PDF invoices, storing files, emailing customers, and logging invoice data, all seamlessly integrated.

### Workflow Overview
1. **Trigger (Webhook)**: starts when an order form is submitted, capturing customer and order details.
2. **Generate Random Order ID**: a Function node creates a unique alphanumeric invoice ID (e.g., INV-X92B7D).
3. **Check for Duplicate Order ID**: Google Sheets looks up the generated order ID in your invoice log sheet to prevent duplicates.
4. **Conditional Check (IF node)**: if the ID already exists, a new ID is regenerated (loops back); if unique, the workflow proceeds to invoice creation.
5. **Prepare Invoice Data**: a Set node formats customer info, date, order items, and the unique order ID to fit your invoice template.
6. **Convert HTML to PDF**: an HTTP Request node sends your invoice HTML to the RapidAPI HTML-to-PDF service and receives the PDF file.
7. **Upload PDF to Cloud Storage**: save the PDF in Google Drive or Dropbox with a clear file name like Invoice-INV-X92B7D.pdf.
8. **Send Invoice Email to Customer**: an Email node attaches the PDF and includes the order ID in the email subject/body.
9. **Log Invoice Details**: append invoice data (customer info, order ID, total, PDF link) to your Google Sheet for tracking.

### Node Details & Setup
1. **Webhook Trigger**: configure it to receive form submissions (order details like name, email, items, total).
2. **Function: Generate Random Order ID**: sample JS code generates unique IDs prefixed with INV- (see the sketch at the end of this description).
3. **Google Sheets: Lookup Row**: set up the connection to your invoice log sheet and search for the existing order ID to avoid duplicates.
4. **IF Node: Check Order ID Existence**: if the order ID is found, loop back to regenerate; otherwise continue the workflow.
5. **Set Node: Prepare Invoice HTML**: define variables like customer name, date, items, and order ID. This data populates your HTML invoice template.
6. **HTTP Request: Convert HTML to PDF**: use the API URL to get your key, send the invoice HTML in the request body, and receive the PDF file blob or download URL.
7. **Google Drive (or Dropbox) Upload**: upload the PDF file using the file name format Invoice-{{$json["order_id"]}}.pdf.
8. **Email Node**: recipient: the customer email from the form data. Attach the generated PDF and include the order ID in the email subject or body for reference.
9. **Google Sheets: Append Row**: log invoice metadata to keep records updated.

### Google Sheets Template
You can make a copy of the invoice log template here. This sheet includes columns for order_id, customer name, email, total, and invoice PDF link. Customize it as needed.

### Additional Notes
- Customize the invoice HTML template inside the Set node to match your branding.
- Ensure API credentials for RapidAPI, Google Drive/Dropbox, and email are properly set up in your n8n credentials.
- You can expand this workflow by adding payment processing or SMS notifications.

Need help or want a custom workflow? Reach out via email at joseph@uppfy.com.
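A minimal sketch of the "Generate Random Order ID" Function node described in step 2; the six-character suffix length is an assumption based on the INV-X92B7D example.

```javascript
// Build an alphanumeric invoice ID prefixed with INV-, e.g. INV-X92B7D.
const CHARS = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789';
const ID_LENGTH = 6;

let suffix = '';
for (let i = 0; i < ID_LENGTH; i++) {
  suffix += CHARS[Math.floor(Math.random() * CHARS.length)];
}

// Pass the order id downstream; the duplicate check against the invoice
// log sheet happens in the next node.
return [{ json: { ...$input.first().json, order_id: `INV-${suffix}` } }];
```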