by Lucas Walter
Who's it for
Content creators, social media managers, and marketing teams who want to automatically extract the most engaging clips from long-form YouTube videos and identify content with high viral potential.

What it does
This workflow analyzes any YouTube video using Vizard AI's clipping technology and automatically generates up to 8 short clips with viral score ratings. It then filters for the highest-scoring clips (9/10 or above) and posts them to a designated Slack channel for team review and distribution.

How it works
Video submission: Enter a YouTube URL through a user-friendly form
AI analysis: Submits the video to Vizard AI for automated clipping and viral score analysis
Smart polling: Waits for processing completion and retrieves results
Quality filtering: Only surfaces clips with viral scores of 9/10 or higher
Team notification: Posts results to Slack with clip titles, scores, and download links

Requirements
Vizard AI API credentials (sign up at vizard.ai)
Slack workspace with OAuth app configured

How to set up
Configure Vizard AI credentials: Add your Vizard AI API key to the HTTP Request nodes
Set up Slack integration: Configure the Slack OAuth2 credentials and select your target channel
Customize filtering: Adjust the viral score threshold in the filter node (currently set to 9/10)
Test the workflow: Submit a test YouTube URL to ensure everything works properly

How to customize the workflow
**Adjust clip quantity**: Modify the maxClipNumber parameter (currently 8) in the initial API request
**Change viral score threshold**: Update the filter condition to match your quality standards
**Extend with automation**: Connect to social media posting tools or caption generation workflows for full automation
**Add scheduling**: Integrate with webhook triggers, scheduled triggers, or RSS feeds for batch processing videos
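To make the filtering step concrete, here is a minimal sketch of what the quality filter could look like as an n8n Code node. The response field names (`videos`, `viralScore`, `videoUrl`, `title`) are assumptions about Vizard's payload shape, not a confirmed API schema; check the output of your HTTP Request node and adjust accordingly.

```javascript
// Minimal sketch of the quality-filtering step (n8n Code node, JavaScript).
// `videos`, `viralScore`, `title` and `videoUrl` are assumed field names --
// verify them against the actual Vizard response in your execution data.
const MIN_VIRAL_SCORE = 9; // same threshold the filter node uses (9/10)

const clips = $input.first().json.videos ?? [];

// Keep only clips at or above the threshold and shape them for the Slack message
return clips
  .filter((clip) => Number(clip.viralScore) >= MIN_VIRAL_SCORE)
  .map((clip) => ({
    json: {
      title: clip.title,
      score: clip.viralScore,
      downloadUrl: clip.videoUrl,
    },
  }));
```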
by M Sayed
Get a fun and clear weather report sent to your phone automatically! 📰 This little bot wakes up, checks the weather for you, and builds a super simple summary of your day.

What it does:
🌡️ Grabs the current temperature and what it actually feels like.
📉 Figures out the high and low for the whole day.
📅 Gives you a 5-day forecast so you can plan your week.
✈️ Sends it all to you in a clean Telegram message!

Setup is easy: Just plug in your info for Telegram, add your location, and you're good to go! ✨
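If you want to tweak how the summary reads, the Telegram message can be assembled in a small Code node before the send step. This is only an illustrative sketch; the input field names below are placeholders for whatever your weather node actually returns.

```javascript
// Sketch of composing the Telegram summary (n8n Code node).
// Field names (temp, feelsLike, tempMin, tempMax, forecast) are placeholders --
// map them to the real output of your weather node.
const w = $input.first().json;

const lines = [
  `🌡️ Now: ${w.temp}°C (feels like ${w.feelsLike}°C)`,
  `📉 Today: high ${w.tempMax}°C / low ${w.tempMin}°C`,
  '📅 Next 5 days:',
  ...(w.forecast ?? []).map((d) => `  ${d.date}: ${d.min}°C to ${d.max}°C, ${d.summary}`),
];

// The Telegram node can then send {{ $json.message }} as the text
return [{ json: { message: lines.join('\n') } }];
```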
by LukaszB
Crypto Price Alert – n8n Workflow
A simple and effective crypto alert system for anyone who wants to stay up to date with coin price changes — without refreshing charts all day. This workflow checks the current price of your chosen cryptocurrency (via CoinGecko) and sends you an alert on Discord if it goes above or below your target range. It's lightweight, easy to set up, and runs on autopilot.

What the Workflow Does
Checks the live price of a selected coin using the CoinGecko API.
Compares it to the max/min prices you define manually.
Decides if the price is too high or too low.
Sends an alert message to Discord depending on the result.

How It Works
The flow is triggered manually or on a schedule (your choice).
It pulls the current price of the coin you set.
Compares that price with your min and max values.
Sends a "high" or "low" message to your Discord webhook.

Setup Steps
Enter your coin ID and price thresholds in the "Set Low and High" node.
Paste your Discord webhook URLs in the "Message High" and "Message Low" nodes.
Optional: Adjust the schedule trigger to run every X minutes/hours.
Run once manually to test — takes under a minute.

Full instructions and config tips are in sticky notes inside the workflow.
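For clarity, the decision logic boils down to a simple comparison against your two thresholds. The sketch below assumes the previous node called CoinGecko's public simple-price endpoint and that the coin ID and thresholds mirror the "Set Low and High" node; adapt the names to your setup.

```javascript
// Minimal sketch of the comparison logic used before the Discord webhooks.
// Assumes the previous HTTP Request node called CoinGecko's simple-price endpoint,
// e.g. https://api.coingecko.com/api/v3/simple/price?ids=bitcoin&vs_currencies=usd
const COIN_ID = 'bitcoin';   // coin id set in the "Set Low and High" node
const MIN_PRICE = 55000;     // alert if price drops below this
const MAX_PRICE = 70000;     // alert if price rises above this

const price = $input.first().json[COIN_ID].usd;

let alert = 'none';
if (price > MAX_PRICE) alert = 'high'; // routed to the "Message High" Discord webhook
if (price < MIN_PRICE) alert = 'low';  // routed to the "Message Low" Discord webhook

return [{ json: { price, alert } }];
```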
by Olek
How it works
This workflow will activate and deactivate a selected other workflow on a schedule.

> ⚠️ Warning!
> This approach won't work for trial users, as it requires the n8n API, which is not available to trial users.
> See https://docs.n8n.io/api/ for details.

Set up steps
Adjust the activation/deactivation schedule to your needs. A custom (cron) interval is the recommended approach.
Set the targeted Workflow ID. You will find it in the URL of the workflow you want to manage.
Set n8n API credentials:
Create an API key: how to
Create n8n credentials using the API key: how to
This workflow uses the n8n node.

#DevOps #workflow-management

Other useful stuff
Need a universal Error workflow to catch both execution and trigger errors? Here you go: Error handling: Send email via Gmail on execution or trigger-level errors
More stuff by Olek. And do not forget to back up your workflows often by automating the backups.
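Under the hood, activation and deactivation go through the n8n public API. The sketch below shows the rough shape of that call; the base URL and workflow ID are placeholders, and you should verify the exact endpoint paths against the n8n API docs linked above.

```javascript
// Rough sketch of the API call this automation relies on
// (n8n public API; verify paths against https://docs.n8n.io/api/).
const N8N_BASE_URL = 'https://your-n8n-instance.example.com'; // placeholder
const WORKFLOW_ID = 'abc123';                                  // from the target workflow's URL
const API_KEY = process.env.N8N_API_KEY;                       // the API key you created

// POST .../activate turns the workflow on; .../deactivate turns it off
const action = 'activate';

const res = await fetch(`${N8N_BASE_URL}/api/v1/workflows/${WORKFLOW_ID}/${action}`, {
  method: 'POST',
  headers: { 'X-N8N-API-KEY': API_KEY },
});
console.log(res.status); // 200 on success
```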
by Taiki
Workflow Setup Guide
This workflow collects the most-viewed videos from specified YouTube channels and saves the data to a Google Sheet. Follow these steps to set it up:

1. Credentials Setup
**Google Sheets:** You need to have a Google Sheets credential configured in your n8n instance. If you don't have one, go to the 'Credentials' section in n8n and add a new credential for Google Sheets.
**YouTube API Key:** You need a YouTube Data API v3 key.
Go to the Google Cloud Console.
Create a new project or select an existing one.
Go to 'APIs & Services' > 'Library' and enable the YouTube Data API v3.
Go to 'APIs & Services' > 'Credentials', click 'Create Credentials', and choose 'API key'.
Copy the generated API key.

2. Google Sheet Setup
You will need one Google Sheet with two separate sheets (tabs) inside it.

Input Sheet (Use Template)
This sheet provides the list of YouTube channels to process.
**Required Columns:** Create a sheet with the following two columns:
ChannelID: The ID of the YouTube channel (a 24-character string that typically starts with "UC").
video_num_to_get: The number of top videos to retrieve for that channel (e.g., 5).

Output Sheet
This sheet is where the results will be saved.
**Required Columns:** The workflow will automatically append data to the following columns. You can create them beforehand or let the workflow do it.
channelName
title
videoId
videoLink

3. Node Configuration
**Read Channel Info from Sheet:** Select your Google Sheets credential. Enter your Spreadsheet ID. Enter the name of your Input Sheet.
**Fetch Most-Viewed Videos via YouTube API:** Replace YOUR_YOUTUBE_API_KEY with the API key you generated in Step 1.
**Append Video Details to Sheet:** Select your Google Sheets credential. Enter your Spreadsheet ID (the same one as before). Enter the name of your Output Sheet.
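For reference, the "Fetch Most-Viewed Videos via YouTube API" node issues a YouTube Data API v3 search request ordered by view count, roughly like the sketch below (the channel ID and result count stand in for the values read from your input sheet):

```javascript
// Sketch of the request behind "Fetch Most-Viewed Videos via YouTube API"
// (YouTube Data API v3 search endpoint, ordered by view count).
const API_KEY = 'YOUR_YOUTUBE_API_KEY';
const channelId = 'UCxxxxxxxxxxxxxxxxxxxxxx'; // ChannelID column from the input sheet
const maxResults = 5;                          // video_num_to_get column

const url =
  'https://www.googleapis.com/youtube/v3/search' +
  '?part=snippet&type=video&order=viewCount' +
  `&channelId=${channelId}&maxResults=${maxResults}&key=${API_KEY}`;

const data = await (await fetch(url)).json();

// Map API items onto the output sheet columns
const rows = data.items.map((item) => ({
  channelName: item.snippet.channelTitle,
  title: item.snippet.title,
  videoId: item.id.videoId,
  videoLink: `https://www.youtube.com/watch?v=${item.id.videoId}`,
}));
console.log(rows);
```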
by Juan Carlos Cavero Gracia
Description
This automation template is designed for Instagram marketers, influencers, and businesses looking to supercharge their Instagram engagement strategy. It automatically monitors Instagram post comments and sends personalized direct messages (DMs) to new commenters, while maintaining a smart tracking system to prevent duplicate messages. The workflow runs continuously, checking for new comments every 15 minutes and responding instantly to maintain high engagement rates.

Note: This workflow uses the upload-post.com API for Instagram interactions and Google Sheets for contact tracking. The workflow is configured to monitor a specific Instagram post.

Who Is This For?
**Instagram Marketers & Influencers:** Automatically engage with every commenter by sending personalized DMs with valuable content, links, or offers.
**E-commerce Businesses:** Convert Instagram comments into sales opportunities by instantly sending product links, discount codes, or catalog information via DM.
**Content Creators & Coaches:** Build deeper relationships with your audience by automatically reaching out to commenters with additional resources, course links, or exclusive content.
**Social Media Managers:** Scale client engagement without manual monitoring, ensuring no potential lead or follower interaction goes unnoticed.

What Problem Does This Workflow Solve?
Manually monitoring Instagram comments and sending follow-up DMs is time-consuming and often leads to missed opportunities. This workflow addresses these challenges by:
**Automated Comment Monitoring:** Continuously checks for new comments on your specified Instagram post every 15 minutes.
**Smart Duplicate Prevention:** Uses Google Sheets to track already contacted users, preventing spam and maintaining professional communication.
**Instant Response System:** Sends personalized DMs immediately when new comments are detected, maximizing engagement while the interaction is fresh.
**Scalable Engagement:** Handles multiple commenters simultaneously without manual intervention, perfect for viral posts or high-engagement content.
**Comprehensive Tracking:** Maintains detailed logs of all interactions including timestamps, usernames, and message content for analytics and follow-up.

How It Works
Post Configuration: Set your Instagram post URL, reply message, and profile username in the configuration node.
Comment Monitoring: The workflow fetches all comments from your specified Instagram post using the upload-post.com API.
Smart Filtering: Compares new comments against your Google Sheets database to identify users who haven't been contacted yet.
Automated DM Sending: Sends personalized direct messages to new commenters with your configured message.
Contact Tracking: Records each successful interaction in Google Sheets with comment ID, username, message sent, timestamp, and post URL.
Continuous Monitoring: Automatically repeats the process every 15 minutes using the built-in scheduler.

Setup
Upload-Post API Credentials: Create an account at upload-post.com, connect your Instagram account, and add your API credentials to the HTTP request nodes.
Google Sheets Setup:
Create a Google Sheet with columns: comment_id, username, message_sent, timestamp, post_url
Connect your Google account to the Google Sheets nodes
Update the document ID in the "Read Contacted Users" and "Record Contacted User" nodes

Instagram Post Configuration:
In the "Configure Post & Message" node, update:
postUrl: Your Instagram post URL to monitor
replyMessage: The DM message to send to commenters
profileUsername: Your Upload-post profile username

Monitoring Schedule:
The workflow is set to run every 15 minutes. You can adjust this in the "Schedule Trigger" node based on your needs.

Requirements
**Accounts:** n8n, upload-post.com, Google (for Sheets access), Instagram business account.
**API Keys & Credentials:** Upload-post.com API token, Google Sheets OAuth2 credentials.
**Instagram Setup:** Business/Creator account with API access through upload-post.com.

Features
**Duplicate Prevention:** Advanced comment ID tracking prevents sending multiple DMs to the same user
**Error Handling:** Robust error handling for API failures and edge cases
**Detailed Logging:** Comprehensive console logging for debugging and monitoring
**Flexible Configuration:** Easy to modify for different posts, messages, and monitoring intervals
**Success Tracking:** Monitors both successful and failed DM attempts for analytics

Use this template to transform your Instagram engagement strategy, automatically converting every comment into a potential lead or deeper connection while maintaining professional communication standards.
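The duplicate-prevention step described above essentially boils down to a set lookup. Here is a minimal sketch of that logic as an n8n Code node; the property names (`comment_id`, `username`) and the node name "Read Contacted Users" follow this template's description, but the exact shape of the upload-post.com comment data is an assumption you should verify against a real execution.

```javascript
// Minimal sketch of the "Smart Filtering" / duplicate-prevention step.
// Assumes: incoming items are comments from the upload-post.com call, and the
// "Read Contacted Users" node returns the rows of the tracking Google Sheet.
const comments = $input.all().map((i) => i.json);
const contacted = new Set(
  $('Read Contacted Users').all().map((i) => String(i.json.comment_id))
);

// Only commenters we have not messaged yet continue to the DM-sending node
const newCommenters = comments.filter((c) => !contacted.has(String(c.comment_id)));

return newCommenters.map((c) => ({
  json: {
    comment_id: c.comment_id,
    username: c.username,
  },
}));
```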
by explorium
HubSpot Contact Enrichment with Explorium

Template
Download the following JSON file and import it to a new n8n workflow: hubspot_flow.json

Overview
This n8n workflow monitors your HubSpot instance for newly created contacts and automatically enriches them with additional contact information. When a contact is created, the workflow:
Detects the new contact via HubSpot webhook trigger
Retrieves recent contact details from HubSpot
Matches the contact against Explorium's database using name, company, and email
Enriches the contact with professional emails and phone numbers
Updates the HubSpot contact record with discovered information
This automation ensures your sales and marketing teams have complete contact information, improving outreach success rates and data quality.

Key Features
**Real-time Webhook Trigger**: Instantly processes new contacts as they're created
**Intelligent Matching**: Uses multiple data points (name, company, email) for accurate matching
**Comprehensive Enrichment**: Adds both professional and work emails, plus phone numbers
**Batch Processing**: Efficiently handles multiple contacts to optimize API usage
**Smart Data Mapping**: Intelligently maps multiple emails and phone numbers
**Profile Enrichment**: Optional additional enrichment for deeper contact insights
**Error Resilience**: Continues processing other contacts if some fail to match

Prerequisites
Before setting up this workflow, ensure you have:
n8n instance (self-hosted or cloud)
HubSpot account with: Developer API access (for webhooks), Private App or OAuth2 app created, Contact object permissions (read/write)
Explorium API credentials (Bearer token) - Get explorium api key
Understanding of HubSpot contact properties

HubSpot Requirements

Required Contact Properties
The workflow uses these HubSpot contact properties:
firstname - Contact's first name
lastname - Contact's last name
company - Associated company name
email - Primary email (read and updated)
work_email - Work email (updated by workflow)
phone - Phone number (updated by workflow)

API Access Setup
Create a Private App in HubSpot: Navigate to Settings → Integrations → Private Apps, create a new app with Contact read/write scopes, and copy the Access Token.
Set up Webhooks (for Developer API): Create an app in the HubSpot Developers portal, configure a webhook for contact.creation events, and note the App ID and Developer API Key.

Custom Properties (Optional)
Consider creating custom properties for:
Multiple email addresses
Mobile vs. office phone numbers
Data enrichment timestamps
Match confidence scores

Installation & Setup

Step 1: Import the Workflow
Copy the workflow JSON from the template
In n8n: Navigate to Workflows → Add Workflow → Import from File
Paste the JSON and click Import

Step 2: Configure HubSpot Developer API (Webhook)
Click on the HubSpot Trigger node
Under Credentials, click Create New
Enter your HubSpot Developer credentials: App ID (from your HubSpot app), Developer API Key (from your developer account), Client Secret (from your app settings)
Save as "HubSpot Developer account"

Step 3: Configure HubSpot App Token
Click on the HubSpot Recently Created node
Under Credentials, click Create New (App Token)
Enter your Private App access token
Save as "HubSpot App Token account"
Apply the same credentials to the Update HubSpot node

Step 4: Configure Explorium API Credentials
Click on the Explorium Match Prospects node
Under Credentials, click Create New (HTTP Header Auth)
Configure the authentication: Name: Authorization, Value: Bearer YOUR_EXPLORIUM_API_TOKEN
Save as "Header Auth Connection"
Apply to all Explorium nodes: Explorium Enrich Contacts Information, Explorium Enrich Profiles

Step 5: Configure Webhook Subscription
In the HubSpot Developers portal: Go to your app's webhook settings, add a subscription for contact.creation events, set the target URL from the HubSpot Trigger node, and activate the subscription.

Step 6: Activate the Workflow
Save the workflow
Toggle the Active switch to ON
The webhook is now listening for new contacts

Node Descriptions
HubSpot Trigger: Webhook that fires when new contacts are created
HubSpot Recently Created: Fetches details of recently created contacts
Loop Over Items: Processes contacts in batches of 6
Explorium Match Prospects: Finds matching person in Explorium database
Filter: Validates successful matches
Extract Prospect IDs: Collects matched prospect identifiers
Enrich Contacts Information: Fetches emails and phone numbers
Enrich Profiles: Gets additional profile data (optional)
Merge: Combines all enrichment results
Split Out: Separates individual enriched records
Update HubSpot: Updates contact with new information

Data Mapping Logic
The workflow maps Explorium data to HubSpot properties:

| Explorium Data | HubSpot Property | Notes |
| ------------------------------ | ------------------ | ----------------------------- |
| professions_email | email | Primary professional email |
| emails[].address | work_email | All email addresses joined |
| phone_numbers[].phone_number | phone | All phones joined with commas |
| mobile_phone | phone (fallback) | Used if no other phones found |

Data Processing
The workflow handles complex data scenarios:
**Multiple emails**: Joins all discovered emails with commas
**Phone numbers**: Combines all phone numbers into a single field
**Missing data**: Uses "null" as placeholder for empty fields
**Name parsing**: Cleans sample data and special characters

Usage & Operation

Automatic Processing
Once activated:
Every new contact triggers the webhook immediately
Contact is enriched within seconds
HubSpot record is updated automatically
Process repeats for each new contact

Manual Testing
To test the workflow:
Use the pinned test data in the HubSpot Trigger node, or create a test contact in HubSpot
Monitor the execution in n8n
Verify the contact was updated in HubSpot

Monitoring Performance
Track workflow health:
Go to Executions in n8n
Filter by this workflow
Monitor success rates
Review any failed executions
Check webhook delivery in HubSpot

Troubleshooting
Common Issues

Webhook not triggering
Verify webhook subscription is active in HubSpot
Check the webhook URL is correct and accessible
Ensure workflow is activated in n8n
Test webhook delivery in HubSpot developers portal

Contacts not matching
Verify contact has firstname, lastname, and company
Check for typos or abbreviations in company names
Some individuals may not be in Explorium's database
Email matching improves accuracy significantly

Updates failing in HubSpot
Check API token has contact write permissions
Verify property names exist in HubSpot
Ensure rate limits haven't been exceeded
Check for validation rules on properties

Missing enrichment data
Not all prospects have all data types
Phone numbers may be less available than emails
Profile enrichment is optional and may not always return data

Error Handling
Built-in error resilience:
Failed matches don't block other contacts
Each batch processes independently
Partial enrichment is possible
All errors are logged for review

Debugging Tips
Check webhook logs: HubSpot shows delivery attempts
Review executions: n8n logs show detailed error messages
Test with pinned data: Use the sample data for isolated testing
Verify API responses: Check Explorium API returns expected data

Best Practices

Data Quality
Complete contact records: Ensure name and company are populated
Standardize company names: Use official names, not abbreviations
Include existing emails: Improves match accuracy
Regular data hygiene: Clean up test and invalid contacts

Performance Optimization
Batch size: 6 is optimal for rate limits
Webhook reliability: Monitor delivery success
API quotas: Track usage in both platforms
Execution history: Regularly clean old executions

Compliance & Privacy
GDPR compliance: Ensure lawful basis for enrichment
Data minimization: Only enrich necessary fields
Access controls: Limit who can modify enriched data
Audit trail: Document enrichment for compliance

Customization Options

Additional Enrichment
Extend with more Explorium data:
Job titles and departments
Social media profiles
Professional experience
Skills and interests
Company information

Enhanced Processing
Add workflow logic for:
Lead scoring based on enrichment
Routing based on data quality
Notifications for high-value matches
Custom field mapping

Integration Extensions
Connect to other systems:
Sync enriched data to CRM
Trigger marketing automation
Update data warehouse
Send notifications to Slack

API Considerations

HubSpot Limits
**API calls**: Monitor daily limits
**Webhook payload**: Max 200 contacts per trigger
**Rate limits**: 100 requests per 10 seconds
**Property limits**: Max 1000 custom properties

Explorium Limits
**Match API**: Batched for efficiency
**Enrichment calls**: Two parallel enrichments
**Rate limits**: Based on your plan
**Data freshness**: Real-time matching

Architecture Considerations
This workflow integrates with:
HubSpot workflows and automation
Marketing campaigns and sequences
Sales engagement tools
Reporting and analytics
Other enrichment services

Security Best Practices
**Webhook validation**: Verify requests are from HubSpot
**Token security**: Rotate API tokens regularly
**Access control**: Limit workflow modifications
**Data encryption**: All API calls use HTTPS
**Audit logging**: Track all enrichments

Advanced Configuration

Custom Field Mapping
Modify the Update HubSpot node to map to custom properties:

```
// Example custom mapping
{
  "custom_mobile": "{{ $json.data.mobile_phone }}",
  "custom_linkedin": "{{ $json.data.linkedin_url }}",
  "enrichment_date": "{{ $now.toISO() }}"
}
```
Conditional Processing
Add logic to process only certain contacts (a minimal sketch of such a filter appears at the end of this section):
Filter by contact source
Check for specific properties
Validate email domains
Exclude test contacts

Support Resources
For assistance:
**n8n issues**: Check n8n documentation and forums
**HubSpot API**: Reference HubSpot developers documentation
**Explorium API**: Contact Explorium support
**Webhook issues**: Use HubSpot webhook testing tools
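As referenced under Conditional Processing, a small pre-filter can keep test or low-quality contacts out of the enrichment path. This is an illustrative sketch only; the free-mail domain list and the test-contact heuristic are assumptions you should adapt to your own rules.

```javascript
// Hypothetical pre-filter (Code node placed before the Explorium match step).
// Property names follow the HubSpot contact properties listed earlier;
// the domain list and "test" heuristic are illustrative, not part of the template.
const FREE_MAIL = ['gmail.com', 'yahoo.com', 'hotmail.com', 'outlook.com'];

return $input.all().filter((item) => {
  const c = item.json.properties ?? item.json;
  const email = (c.email ?? '').toLowerCase();
  const domain = email.split('@')[1] ?? '';

  const isTestContact = /test/i.test(c.firstname ?? '') || /test/i.test(c.lastname ?? '');
  const isFreeMail = FREE_MAIL.includes(domain);

  return !isTestContact && !isFreeMail; // only business-looking contacts continue to enrichment
});
```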
by Airtop
Automating Person Data Enrichment and CRM Update

Use Case
This automation enriches a person's professional profile using their name and work email, scores them against an ICP (Ideal Customer Profile), and updates their record in HubSpot. It's ideal for sales, marketing, and recruitment teams needing reliable contact insights.

What This Automation Does
This automation performs the following using the input parameters:
**Person name**: The full name of the individual.
**Work email**: The professional email address of the contact.
**Airtop Profile (connected to LinkedIn)**: An authenticated Airtop Profile used for LinkedIn-based enrichment.
**HubSpot object id**: The internal HubSpot ID for the contact to be updated.

How It Works
Initiates the workflow using a form or external trigger.
Uses the name and email to extract and enrich the person's data, including: LinkedIn profile and company page; About section, job title, location; ICP score, seniority level, AI interest, technical depth, connection and follower counts.
Formats and maps the enriched data.
Pushes the updated data to HubSpot using the object ID.

Setup Requirements
Airtop API Key
Airtop Profile logged in to LinkedIn
HubSpot access with an object ID field for each contact to update

Next Steps
**Combine with Lead Generation**: Use as part of an end-to-end workflow that sources leads and enriches them in real time.
**Trigger from CRM**: Initiate this workflow when a new contact is added in HubSpot or another CRM.
**Customize Scoring Logic**: Tailor the ICP calculation to your team's specific criteria.

Read more about person data enrichment
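If you customize the scoring logic, it can be as simple as a weighted checklist computed in a Code node before the HubSpot update. The weights and criteria below are purely illustrative placeholders, not Airtop's actual ICP model.

```javascript
// Illustrative ICP scoring sketch -- weights and criteria are placeholders only;
// replace them with your own Ideal Customer Profile definition.
function scoreIcp(person) {
  let score = 0;
  if (/head|vp|director|chief|founder/i.test(person.jobTitle ?? '')) score += 40;    // seniority
  if ((person.connectionCount ?? 0) > 500) score += 20;                               // network size
  if (/\bai\b|machine learning/i.test(person.about ?? '')) score += 20;               // AI interest
  if (/united states|europe/i.test(person.location ?? '')) score += 20;               // target geography
  return Math.min(score, 100);
}

// Attach the score before the HubSpot update step
const person = $input.first().json;
return [{ json: { ...person, icp_score: scoreIcp(person) } }];
```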
by Louis Chan
How it works
Transform medical documents into structured data using Google Gemini AI with enterprise-grade accuracy.
Classifies document types (receipts, prescriptions, lab reports, clinical notes)
Extracts text with 95%+ accuracy using advanced OCR
Structures data according to medical taxonomy standards
Supports multiple languages (English, Chinese, auto-detect)
Tracks processing costs and quality metrics automatically

Set up steps

Prerequisites
Google Gemini API key (get from Google AI Studio)

Quick setup
Import this workflow template
Configure Google Gemini API credentials in n8n
Test with a sample medical document URL
Deploy your webhook endpoint

Usage
Send POST request to your webhook:

```json
{
  "image_url": "https://example.com/medical-receipt.jpg",
  "expected_type": "financial",
  "language_hint": "auto"
}
```

Get structured response:

```json
{
  "success": true,
  "result": {
    "documentType": "financial",
    "metadata": {
      "providerName": "Dr. Smith Clinic",
      "createdDate": "2025-01-06",
      "currency": "USD"
    },
    "content": {
      "amount": 150.00,
      "services": [...]
    },
    "quality_metrics": {
      "overall_confidence": 0.95
    }
  }
}
```

Use cases

Healthcare Organizations
Medical billing automation - Process receipts and invoices automatically
Insurance claim processing - Extract data from claim documents
Clinical documentation - Digitize patient records and notes
Data standardization - Consistent structured output format

System Integrators
EMR integration - Connect with existing healthcare systems
Workflow automation - Reduce manual data entry by 90%
Multi-language support - Handle international medical documents
Quality assurance - Built-in confidence scoring and validation

Supported Document Types
Financial: Medical receipts, bills, insurance claims, invoices
Clinical: Medical charts, progress notes, consultation reports
Prescription: Prescriptions, medication lists, pharmacy records
Administrative: Referrals, authorizations, patient registration
Diagnostic: Lab reports, test results, screening reports
Legal: Medical certificates, documentation forms
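As a usage example, a client can call the deployed webhook like this (the webhook URL is a placeholder for the endpoint n8n assigns when the workflow is active; the request and response fields match the examples above):

```javascript
// Example client call to the deployed webhook (URL is a placeholder).
const res = await fetch('https://your-n8n-instance.example.com/webhook/medical-docs', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    image_url: 'https://example.com/medical-receipt.jpg',
    expected_type: 'financial',
    language_hint: 'auto',
  }),
});

const { success, result } = await res.json();
if (success) {
  // e.g. "financial" and 0.95, per the sample response above
  console.log(result.documentType, result.quality_metrics.overall_confidence);
}
```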
by Jimleuk
This n8n template demonstrates how to build a simple but effective vintage image restoration service using an AI model with image editing capabilities. With Gemini now capable of multimodal output, it's a great time to explore this capability for image or graphics automation. Let's see how well it does for a task such as image restoration.

Good to know
At the time of writing, each image generated costs $0.039 USD. See Gemini Pricing for updated info.
The model used in this workflow is geo-restricted! If it says the model was not found, it may not be available in your country or region.

How it works
Images are imported into our workflow via the HTTP node and converted to base64 strings using the Extract from File node.
The image data is then pipelined to Gemini's Image Generation model. A prompt is provided to instruct Gemini to "restore" the image to near-new condition - of course, feel free to experiment with this prompt to improve the results!
Gemini responds with the image as a base64 string, and hence a Convert to File node is used to transform the data to binary.
With the restored image as a binary, we can then use our Google Drive node to upload it to our desired folder.

How to use
This demonstration uses 3 random images sourced from the internet, but any typical image file will work.
Use a Webhook node to allow integration from other applications.
Use a Telegram trigger for instant mobile service!

Requirements
Google Gemini for LLM/image generation
Google Drive for upload storage

Customising this workflow
AI image editing can be applied to many use cases, not just image restoration. Try using it to add watermarks or branding, or to modify an existing image for marketing purposes.
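For orientation, the request to Gemini looks roughly like the sketch below. The model name and the `responseModalities` option are assumptions that may differ by region and API version, so treat this as a sketch and confirm the details against the current Gemini image-generation documentation.

```javascript
// Rough sketch of the Gemini call the workflow makes.
// Model id and responseModalities are assumptions -- check the Gemini docs.
const API_KEY = process.env.GEMINI_API_KEY;
const MODEL = 'gemini-2.0-flash-preview-image-generation'; // placeholder model id
const imageBase64 = '...'; // produced by the Extract from File node

const body = {
  contents: [{
    parts: [
      { text: 'Restore this vintage photo to near-new condition: remove scratches, repair fading, and correct the colours.' },
      { inline_data: { mime_type: 'image/jpeg', data: imageBase64 } },
    ],
  }],
  generationConfig: { responseModalities: ['IMAGE', 'TEXT'] },
};

const res = await fetch(
  `https://generativelanguage.googleapis.com/v1beta/models/${MODEL}:generateContent?key=${API_KEY}`,
  { method: 'POST', headers: { 'Content-Type': 'application/json' }, body: JSON.stringify(body) }
);
// The restored image comes back as a base64 string inside the response parts,
// which the Convert to File node turns back into binary for the Drive upload.
console.log((await res.json()));
```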
by Incrementors
🐦 Twitter Profile Scraper via Bright Data API with Google Sheets Output

A comprehensive n8n automation that scrapes Twitter profile data using Bright Data's Twitter dataset and stores comprehensive tweet analytics, user metrics, and engagement data directly into Google Sheets.

📋 Overview
This workflow provides an automated Twitter data collection solution that extracts profile information and tweet data from specified Twitter accounts within custom date ranges. Perfect for social media analytics, competitor research, brand monitoring, and content strategy analysis.

✨ Key Features
🔗 Form-Based Input: Easy-to-use form for Twitter URL and date range selection
🐦 Twitter Integration: Uses Bright Data's Twitter dataset for accurate data extraction
📊 Comprehensive Data: Captures tweets, engagement metrics, and profile information
📈 Google Sheets Storage: Automatically stores all data in organized spreadsheet format
🔄 Progress Monitoring: Real-time status tracking with automatic retry mechanisms
⚡ Fast & Reliable: Professional scraping with built-in error handling
📅 Date Range Control: Flexible time period selection for targeted data collection
🎯 Customizable Fields: Advanced data field selection and mapping

🎯 What This Workflow Does

Input
**Twitter Profile URL**: Target Twitter account for data scraping
**Date Range**: Start and end dates for tweet collection period
**Custom Fields**: Configurable data points to extract

Processing
Form Trigger: Collects Twitter URL and date range from user input
API Request: Sends scraping request to Bright Data with specified parameters
Progress Monitoring: Continuously checks scraping job status until completion
Data Retrieval: Downloads complete dataset when scraping is finished
Data Processing: Formats and structures extracted information
Sheet Integration: Automatically populates Google Sheets with organized data

Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| user_posted | Username who posted the tweet | @elonmusk |
| name | Display name of the user | Elon Musk |
| description | Tweet content/text | "Exciting updates coming soon..." |
| date_posted | When the tweet was posted | 2025-01-15T10:30:00Z |
| likes | Number of likes on the tweet | 1,234 |
| reposts | Number of retweets | 567 |
| replies | Number of replies | 89 |
| views | Total view count | 12,345 |
| followers | User's follower count | 50M |
| following | Users they follow | 123 |
| is_verified | Verification status | true/false |
| hashtags | Hashtags used in tweet | #AI #Technology |
| photos | Image URLs in tweet | image1.jpg, image2.jpg |
| videos | Video content URLs | video1.mp4 |
| user_id | Unique user identifier | 12345678 |
| timestamp | Data extraction timestamp | 2025-01-15T11:00:00Z |

🚀 Setup Instructions

Prerequisites
n8n instance (self-hosted or cloud)
Bright Data account with Twitter dataset access
Google account with Sheets access
Valid Twitter profile URLs to scrape
10-15 minutes for setup

Step 1: Import the Workflow
Copy the JSON workflow code from the provided file
In n8n: Workflows → + Add workflow → Import from JSON
Paste JSON and click Import

Step 2: Configure Bright Data
Set up Bright Data credentials: In n8n: Credentials → + Add credential → HTTP Header Auth. Enter your Bright Data API credentials and test the connection.
Configure dataset: Ensure you have access to the Twitter dataset (gd_lwxkxvnf1cynvib9co) and verify dataset permissions in the Bright Data dashboard.

Step 3: Configure Google Sheets Integration
Create a Google Sheet: Go to Google Sheets, create a new spreadsheet named "Twitter Data" or similar, and copy the Sheet ID from the URL: https://docs.google.com/spreadsheets/d/SHEET_ID_HERE/edit
Set up Google Sheets credentials: In n8n: Credentials → + Add credential → Google Sheets OAuth2 API. Complete the OAuth setup and test the connection.
Prepare your data sheet with columns: Use the column headers from the data points table above. The workflow will automatically populate these fields.

Step 4: Update Workflow Settings
Update Bright Data nodes: Open the "🚀 Trigger Twitter Scraping" node, replace BRIGHT_DATA_API_KEY with your actual API token, and verify the dataset ID is correct.
Update Google Sheets node: Open the "📊 Store Twitter Data in Google Sheet" node, replace YOUR_GOOGLE_SHEET_ID with your Sheet ID, select your Google Sheets credential, and choose the correct sheet/tab name.

Step 5: Test & Activate
Add test data: Use the form trigger to input a Twitter profile URL and set a small date range for testing (e.g., last 7 days).
Test the workflow: Submit the form to trigger the workflow, monitor progress in the n8n execution logs, verify data appears in the Google Sheet, and check all expected columns are populated.

📖 Usage Guide

Running the Workflow
Access the workflow form trigger URL (available when the workflow is active)
Enter the Twitter profile URL you want to scrape
Set the start and end dates for tweet collection
Submit the form to initiate scraping
Monitor progress - the workflow will automatically check status every minute
Once complete, data will appear in your Google Sheet

Understanding the Data
Your Google Sheet will show:
**Real-time tweet data** for the specified date range
**User engagement metrics** (likes, replies, retweets, views)
**Profile information** (followers, following, verification status)
**Content details** (hashtags, media URLs, quoted tweets)
**Timestamps** for each tweet and data extraction

Customizing Date Ranges
**Recent data**: Use the last 7-30 days for current activity analysis
**Historical analysis**: Select specific months or quarters for trend analysis
**Event tracking**: Focus on specific date ranges around events or campaigns
**Comparative studies**: Use consistent time periods across different profiles
🔧 Customization Options

Modifying Data Fields
Edit the custom_output_fields array in the "🚀 Trigger Twitter Scraping" node to add or remove data points:

```json
"custom_output_fields": [
  "id",
  "user_posted",
  "name",
  "description",
  "date_posted",
  "likes",
  "reposts",
  "replies",
  "views",
  "hashtags",
  "followers",
  "is_verified"
]
```

Changing Google Sheet Structure
Modify the column mapping in the "📊 Store Twitter Data in Google Sheet" node to match your preferred sheet layout and add custom formulas or calculations.

Adding Multiple Recipients
To process multiple Twitter profiles:
Modify the form to accept multiple URLs
Add a loop node to process each URL separately
Implement delays between requests to respect rate limits

🚨 Troubleshooting

Common Issues & Solutions

"Bright Data connection failed"
Cause: Invalid API credentials or dataset access
Solution: Verify credentials in the Bright Data dashboard, check dataset permissions

"No data extracted"
Cause: Invalid Twitter URLs or private/protected accounts
Solution: Verify URLs are valid public Twitter profiles, test with different accounts

"Google Sheets permission denied"
Cause: Incorrect credentials or sheet permissions
Solution: Re-authenticate Google Sheets, check sheet sharing settings

"Workflow timeout"
Cause: Large date ranges or high-volume accounts
Solution: Use smaller date ranges, implement pagination for high-volume accounts

"Progress monitoring stuck"
Cause: Scraping job failed or API issues
Solution: Check the Bright Data dashboard for job status, restart the workflow if needed

Advanced Troubleshooting
Check execution logs in n8n for detailed error messages
Test individual nodes by running them separately
Verify data formats and ensure consistent field mapping
Monitor rate limits if scraping multiple profiles consecutively
Add error handling and implement retry logic for robust operation

📊 Use Cases & Examples

1. Social Media Analytics
Goal: Track engagement metrics and content performance
Monitor tweet engagement rates over time
Analyze hashtag effectiveness and reach
Track follower growth and audience interaction
Generate weekly/monthly performance reports

2. Competitor Research
Goal: Monitor competitor social media activity
Track competitor posting frequency and timing
Analyze competitor content themes and strategies
Monitor competitor engagement and audience response
Identify trending topics and hashtags in your industry

3. Brand Monitoring
Goal: Track brand mentions and sentiment analysis
Monitor specific Twitter accounts for brand mentions
Track hashtag campaigns and user-generated content
Analyze sentiment trends and audience feedback
Identify influencers and brand advocates

4. Content Strategy Development
Goal: Analyze successful content patterns
Identify high-performing tweet formats and topics
Track optimal posting times and frequencies
Analyze hashtag performance and reach
Study audience engagement patterns

5. Market Research
Goal: Collect social media data for market analysis
Gather consumer opinions and feedback
Track industry trends and discussions
Monitor product launches and market reactions
Support product development with social insights

⚙ Advanced Configuration

Batch Processing Multiple Profiles
To monitor multiple Twitter accounts efficiently:
Create a master sheet with profile URLs and date ranges
Add a loop node to process each profile separately
Implement delays between requests to respect rate limits
Use separate sheets or tabs for different profiles

Adding Data Analysis
Enhance the workflow with analytical capabilities:
Create additional sheets for processed data and insights
Add formulas to calculate engagement rates and trends
Implement data visualization with charts and graphs
Generate automated reports and summaries

Integration with Business Tools
Connect the workflow to your existing systems:
**CRM Integration**: Update customer records with social media data
**Slack Notifications**: Send alerts when data collection is complete
**Database Storage**: Store data in PostgreSQL/MySQL for advanced analysis
**BI Tools**: Connect to Tableau/Power BI for comprehensive visualization

📈 Performance & Limits

Expected Performance
**Single profile**: 30 seconds to 5 minutes (depending on date range)
**Data accuracy**: 95%+ for public Twitter profiles
**Success rate**: 90%+ for accessible accounts
**Daily capacity**: 10-50 profiles (depends on rate limits and data volume)

Resource Usage
**Memory**: ~200MB per execution
**Storage**: Minimal (data stored in Google Sheets)
**API calls**: 1 Bright Data call + multiple Google Sheets calls per profile
**Bandwidth**: ~5-10MB per profile scraped
**Execution time**: 2-10 minutes for typical date ranges

Scaling Considerations
**Rate limiting**: Add delays for high-volume scraping
**Error handling**: Implement retry logic for failed requests
**Data validation**: Add checks for malformed or missing data
**Monitoring**: Track success/failure rates over time
**Cost optimization**: Monitor API usage to control costs

🤝 Support & Community

Getting Help
**n8n Community Forum**: community.n8n.io
**Documentation**: docs.n8n.io
**Bright Data Support**: Contact through your dashboard
**GitHub Issues**: Report bugs and feature requests

Contributing
Share improvements with the community
Report issues and suggest enhancements
Create variations for specific use cases
Document best practices and lessons learned

📋 Quick Setup Checklist

Before You Start
☐ n8n instance running (self-hosted or cloud)
☐ Bright Data account with Twitter dataset access
☐ Google account with Sheets access
☐ Valid Twitter profile URLs ready for scraping
☐ 10-15 minutes available for setup

Setup Steps
☐ Import Workflow - Copy JSON and import to n8n
☐ Configure Bright Data - Set up API credentials and test
☐ Create Google Sheet - New sheet with proper column structure
☐ Set up Google Sheets credentials - OAuth setup and test
☐ Update workflow settings - Replace API keys and sheet IDs
☐ Test with sample data - Add 1 Twitter URL and small date range
☐ Verify data flow - Check data appears in Google Sheet correctly
☐ Activate workflow - Enable form trigger for production use

Ready to Use! 🎉
Your workflow URL: Access form trigger when workflow is active

🎯 Happy Twitter Scraping!
This workflow provides a solid foundation for automated Twitter data collection. Customize it to fit your specific social media analytics and research needs.
For any questions or support, please contact: info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
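For reference, the request issued by the "🚀 Trigger Twitter Scraping" node follows Bright Data's datasets API and looks approximately like the sketch below; the endpoint, query parameters, and input fields are assumptions to verify against your Bright Data dashboard and documentation.

```javascript
// Approximate shape of the Bright Data trigger request (verify against your dashboard).
const API_KEY = 'BRIGHT_DATA_API_KEY';
const DATASET_ID = 'gd_lwxkxvnf1cynvib9co'; // Twitter dataset referenced above

const res = await fetch(
  `https://api.brightdata.com/datasets/v3/trigger?dataset_id=${DATASET_ID}&include_errors=true`,
  {
    method: 'POST',
    headers: { Authorization: `Bearer ${API_KEY}`, 'Content-Type': 'application/json' },
    body: JSON.stringify([
      { url: 'https://x.com/elonmusk', start_date: '2025-01-01', end_date: '2025-01-15' },
    ]),
  }
);

// The response contains a snapshot id; the workflow polls the progress endpoint
// with it every minute until the job finishes, then downloads the snapshot data.
const { snapshot_id } = await res.json();
console.log(snapshot_id);
```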
by Don Jayamaha Jr
This advanced agent analyzes long-term price action in the Binance Spot Market using 1-day candles. It calculates key macro indicators like RSI, MACD, BBANDS, EMA, SMA, and ADX to identify high-confidence trend setups and market momentum. Used by the Quant AI system for directional bias and macro-level signal validation.

🎥 Watch Tutorial:

🎯 Purpose
Detect major trend reversals, consolidation zones, and macro bias
Support long-term swing trading decisions
Provide reliable 1-day signals for downstream agents

🧠 Core Features

| Feature | Description |
| --------------------------- | ------------------------------------------------------------ |
| 🔁 Trigger | Called by parent workflows via Execute Workflow |
| 📥 Input Format | { "message": "MATICUSDT", "sessionId": "telegram_id" } |
| 📡 Webhook Call | Sends request to internal 1d indicators webhook |
| 🧮 Technical Indicators | RSI, MACD, BBANDS, EMA, SMA, ADX (based on 40 daily candles) |
| 🧠 GPT (gpt-4.1-mini) Agent | Interprets numerical data into human-readable trend signals |
| 💬 Output | Summary suitable for Telegram or further agent consumption |

🔗 External Tools Called
https://treasurium.app.n8n.cloud/webhook/1d-indicators
Sends: { "symbol": "SOLUSDT" }

📊 Indicator Calculations

| Indicator | Purpose |
| -------------- | ------------------------------- |
| RSI (14) | Overbought / Oversold Signals |
| MACD (12,26,9) | Trend Reversals / Momentum |
| BBANDS (20, 2) | Volatility Expansion |
| EMA (20) | Short-Term Trend Confirmation |
| SMA (20) | Macro-Level Support/Resistance |
| ADX (14) | Trend Strength + Directional DI |

📦 Setup
Import the JSON into n8n.
Add your OpenAI API credentials.
Ensure the webhook /1d-indicators is connected and working.
Use this agent as a sub-workflow in: Binance SM Financial Analyst Tool, Binance Spot Market Quant AI Agent

📤 Output Example
📅 1D Overview – MATICUSDT
• RSI: 71 → Overbought
• MACD: Bearish Cross forming
• BBANDS: Widening Volatility
• EMA < SMA → Downtrend Momentum
• ADX: 33 → High Trend Strength

📌 Notes
Not user-facing — outputs are structured JSON or Telegram-style summaries.
Pairs well with shorter timeframe tools (15m–4h) for confidence stacking.

🧾 Licensing & Attribution
© 2025 Treasurium Capital Limited Company
Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding permitted.
🔗 Need help? Reach out on LinkedIn – Don Jayamaha
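For reference, calling the internal indicators webhook listed under External Tools Called looks like this minimal sketch; the payload matches the documented format, while the exact response fields depend on how the webhook workflow is configured on your instance.

```javascript
// Example call to the internal 1-day indicators webhook referenced above.
const res = await fetch('https://treasurium.app.n8n.cloud/webhook/1d-indicators', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ symbol: 'SOLUSDT' }), // documented request payload
});

// Expected to contain RSI, MACD, BBANDS, EMA, SMA and ADX values for 1-day candles;
// the precise shape depends on the webhook workflow's configuration.
const indicators = await res.json();
console.log(indicators);
```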