by Jamot
Send one WhatsApp message → get AI-optimized content across 7+ social platforms.

### Who It's For
Solopreneurs managing multiple platforms on the go (X/Twitter, Instagram, LinkedIn, Facebook, TikTok, Threads, YouTube Shorts).

### What It Solves
Eliminates 80% of manual content creation work while maintaining brand consistency across all channels.

### How It Works
**AI Content Generation**
- GPT-4/Gemini creates platform-specific posts with hashtags, CTAs, and emojis
- Auto-generates images via OpenAI/Pollinations.ai
- SERP API finds relevant trending content

**Approval Workflow**
- HTML email previews for human review
- Double-approval system via Gmail integration

**One-Click Publishing**
- Instagram/Facebook (Graph API)
- X/Twitter (Official API)
- LinkedIn (Sales Navigator)

### Setup Requirements
- API keys: OpenAI, Gemini, social platform tokens, ImgBB, SERP API
- Gmail and Telegram for notifications
- Replace "your-unique-id" placeholders in API nodes

### Customization
- **Content**: Edit AI prompts for brand voice
- **Approval**: Modify email templates and thresholds
- **Analytics**: Connect Google Sheets for tracking
- **Images**: Switch between AI image generators
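For editing the AI prompts per platform, a Code-node sketch like the one below illustrates the idea of keeping one source message and fanning it out into platform-specific prompts. The platform settings (character limits, tone strings) are illustrative assumptions, not values taken from this workflow:

```javascript
// Hypothetical per-platform prompt settings; tune tone strings to your brand voice.
const PLATFORMS = {
  twitter:   { maxChars: 280,  tone: "punchy, 1-2 hashtags" },
  linkedin:  { maxChars: 3000, tone: "professional, minimal emojis" },
  instagram: { maxChars: 2200, tone: "casual, emoji-friendly, 5+ hashtags" },
};

// Build one AI prompt per platform from a single source idea (e.g. the WhatsApp message).
function buildPrompts(idea) {
  return Object.entries(PLATFORMS).map(([platform, cfg]) => ({
    platform,
    prompt: `Write a ${platform} post (max ${cfg.maxChars} chars, ${cfg.tone}) about: ${idea}`,
  }));
}
```

Each returned prompt would then be sent to GPT-4/Gemini in its own branch, so every platform gets copy shaped to its limits.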
by ConnectSafely
### Who's it for
This workflow is built for sales professionals, recruiters, founders, and growth marketers who need to build targeted prospect lists from LinkedIn without risking their accounts. Perfect for anyone who wants to find decision-makers, build lead lists, or research target audiences at scale.

If you're running outbound campaigns, building ABM lists, sourcing candidates, or doing competitive research, this automation handles LinkedIn searches and exports results directly to your Google Sheet: no browser cookies, no session hijacking, no ban risk.

### How it works
The workflow automates LinkedIn people searches by leveraging ConnectSafely.ai's compliant API, then exports structured results to Google Sheets or JSON files.

The process flow:
1. Define your search parameters (keywords, location, job title, result limit)
2. Execute the search via the ConnectSafely.ai API
3. Process and normalize the response data
4. Export to Google Sheets for CRM import or further automation
5. Optionally save as a JSON file for data backup or processing

No LinkedIn cookies required. No browser automation. Platform-compliant searches that won't get your account restricted.
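The first two steps of the process flow amount to assembling one authenticated API request from the search parameters. A minimal sketch of that assembly is below; the endpoint URL and environment-variable name are assumptions for illustration, so check the ConnectSafely.ai dashboard for the real values:

```javascript
// Sketch of building the search request from the workflow's parameters.
// The endpoint path and CONNECTSAFELY_API_KEY variable are assumptions.
function buildSearchRequest(params) {
  const { keywords, location, title, limit = 25 } = params;
  if (!keywords) throw new Error("keywords is required");
  return {
    method: "POST",
    url: "https://api.connectsafely.ai/linkedin/search", // assumed endpoint
    headers: { Authorization: `Bearer ${process.env.CONNECTSAFELY_API_KEY}` },
    // Cap the limit defensively; the real maximum varies by plan.
    body: { keywords, location, title, limit: Math.min(limit, 100) },
  };
}
```

In the actual workflow this shape is produced by the Set Search Parameters node feeding an HTTP Request node with the Bearer Auth credential attached.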
Watch the complete step-by-step implementation guide: LinkedIn Search Export Automation Tutorial

### Setup steps

**Step 1: Get Your ConnectSafely.ai API Credentials**

Obtain an API key:
1. Log into the ConnectSafely.ai Dashboard
2. Navigate to Settings → API Keys
3. Generate a new API key
4. Copy your API key (you'll need it in the next step)

Add a Bearer Auth credential in n8n:
1. Go to Credentials in n8n
2. Click Add Credential → HTTP Bearer Auth
3. Paste your ConnectSafely.ai API key
4. Save the credential

**Step 2: Configure Search Parameters**

Open the Set Search Parameters node and customize your search:

| Parameter | Description | Example |
|-----------|-------------|---------|
| keywords | Search terms for profiles | CEO SaaS, Marketing Director |
| location | Geographic filter | United States, San Francisco Bay Area |
| title | Job title filter | Head of Growth, VP Sales |
| limit | Maximum results to return | 100 (max varies by plan) |

Pro tips:
- Use specific keywords for better targeting
- Combine title + keywords for precision (e.g., keywords: "B2B" + title: "VP Sales")
- Start with smaller limits (25-50) for testing

**Step 3: Configure Google Sheets Integration**

3.1 Connect your Google Sheets account:
1. Go to Credentials → Add Credential → Google Sheets OAuth2
2. Follow the OAuth flow to connect your Google account
3. Grant access to Google Sheets

3.2 Prepare your Google Sheet. Create a new sheet with the following columns (the workflow will auto-populate these):

| Column Name | Description |
|-------------|-------------|
| profileUrl | LinkedIn profile URL |
| fullName | Contact's full name |
| firstName | First name |
| lastName | Last name |
| headline | LinkedIn headline/tagline |
| currentPosition | Current job title |
| company | Company name (extracted from headline) |
| location | Geographic location |
| connectionDegree | 1st, 2nd, or 3rd degree connection |
| isPremium | LinkedIn Premium member (true/false) |
| isOpenToWork | Open to work badge (true/false) |
| profilePicture | Profile image URL |
| extractedAt | Timestamp of extraction |

3.3 Configure the export node:
1. Open the Export to Google Sheets node
2. Select your Google Sheets credential
3. Enter your Document ID (from the sheet URL)
4. Select the Sheet Name
5. The column mapping is pre-configured for auto-mapping

**Step 4: Test the Workflow**

1. Click the Manual Trigger node
2. Click Test Workflow
3. Verify that the search executes successfully, results appear in the Format Results output, data exports to your Google Sheet, and the JSON file is generated (optional)

### Customization

**Search parameter combinations**

Sales prospecting:
- keywords: "B2B SaaS"
- location: "United States"
- title: "VP of Sales"
- limit: 100

Recruiting:
- keywords: "Python Machine Learning"
- location: "San Francisco Bay Area"
- title: "Senior Engineer"
- limit: 50

Founder networking:
- keywords: "Seed Series A"
- location: "New York City"
- title: "Founder CEO"
- limit: 100

**Extending the workflow**
- **Add to CRM**: Connect the Format Results output to HubSpot, Salesforce, or Pipedrive nodes
- **Enrich Data**: Add a loop to fetch full profile details for each result using the /linkedin/profile endpoint
- **Chain with Outreach**: Connect to the LinkedIn Connection Request Workflow to automatically send personalized invites to your search results
- **Schedule Searches**: Replace the Manual Trigger with a Schedule Trigger to run daily/weekly searches

### Output Data Format

Each result includes:

```json
{
  "profileUrl": "https://www.linkedin.com/in/johndoe",
  "profileId": "johndoe",
  "profileUrn": "urn:li:member:123456789",
  "fullName": "John Doe",
  "firstName": "John",
  "lastName": "Doe",
  "headline": "VP of Sales at TechCorp | B2B SaaS",
  "currentPosition": "VP of Sales",
  "company": "TechCorp",
  "location": "San Francisco, California",
  "connectionDegree": "2nd",
  "isPremium": true,
  "isOpenToWork": false,
  "profilePicture": "https://media.licdn.com/...",
  "extractedAt": "2024-01-15T10:30:00.000Z"
}
```

### Use Cases
- **Sales Prospecting**: Build targeted lead lists of decision-makers at companies matching your ICP
- **Recruiting & Talent Sourcing**: Find passive candidates with specific skills and experience levels
- **Market Research**: Analyze competitor employee profiles and organizational structures
- **Event Planning**: Build invite lists for webinars, conferences, or virtual events
- **Partnership Development**: Identify potential partners and integration opportunities
- **Investor Research**: Find founders and executives at companies in specific stages/verticals

### Troubleshooting

Common issues and solutions:
- **"No results found" error**: Broaden your search parameters; try removing one filter at a time
- **Empty company field in results**: Company is extracted from the headline; some profiles may not include a company in their headline format
- **API authentication errors**: Verify your ConnectSafely.ai API key is valid and has proper permissions; check the Bearer Auth credential format
- **Google Sheets not updating**: Confirm OAuth credentials are valid; check that the sheet has write permissions
- **Fewer results than expected**: LinkedIn limits search results; try more specific parameters or upgrade your ConnectSafely.ai plan
- **Rate limit errors**: Add a delay between multiple searches; check your API plan limits

### Documentation & Resources

Official documentation:
- **ConnectSafely.ai Docs**: https://connectsafely.ai/docs
- **API Reference**: Available in the ConnectSafely.ai dashboard
- **n8n HTTP Request Node**: https://docs.n8n.io/nodes/n8n-nodes-base.httpRequest

Support channels:
- **Email Support**: support@connectsafely.ai
- **Documentation**: https://connectsafely.ai/docs
- **Custom Workflows**: Contact us for custom automation

### Connect With Us

Stay updated with the latest automation tips, LinkedIn strategies, and platform updates:
- **LinkedIn**: linkedin.com/company/connectsafelyai
- **YouTube**: youtube.com/@ConnectSafelyAI-v2x
- **Instagram**: instagram.com/connectsafely.ai
- **Facebook**: facebook.com/connectsafelyai
- **X (Twitter)**: x.com/AiConnectsafely
- **Bluesky**: connectsafelyai.bsky.social
- **Mastodon**: mastodon.social/@connectsafely

### Need Custom Workflows?

Looking to build sophisticated LinkedIn automation workflows tailored to your business needs? Contact our team for custom automation development, strategy consulting, and enterprise solutions. We specialize in:
- Multi-channel prospecting workflows
- AI-powered lead scoring and qualification
- CRM integration and data synchronization
- Custom search and enrichment pipelines
- Bulk outreach automation with personalization
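As a concrete example of the normalization step: the troubleshooting note explains that the `company` field is derived from the LinkedIn headline (e.g. "VP of Sales at TechCorp | B2B SaaS"), which is why it is sometimes empty. A minimal sketch of that parsing, under the assumption that the headline follows the common "Title at Company | tagline" pattern:

```javascript
// Derive `company` from a LinkedIn headline string.
// Headlines without an "at <Company>" segment yield an empty string,
// matching the "empty company field" behavior described in Troubleshooting.
function extractCompany(headline) {
  if (!headline) return "";
  const firstSegment = headline.split("|")[0];      // drop taglines after "|"
  const match = firstSegment.match(/\bat\s+(.+)/i); // capture text after the word "at"
  return match ? match[1].trim() : "";
}
```

This is why broadening your expectations for the `company` column is reasonable: profiles whose headlines skip the "at Company" convention simply have nothing to extract.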
by Oneclick AI Squad
This n8n workflow automates task creation and scheduled reminders for users via a Telegram bot, ensuring timely notifications across multiple channels like email and Slack. It streamlines task management by validating inputs, storing tasks securely, and delivering reminders while updating statuses for seamless follow-up.

### Key Features
- Enables users to create tasks directly in chat via webhook integration.
- Triggers periodic checks for due tasks and processes them individually for accurate reminders.
- Routes reminders to preferred channels (Telegram, email, or Slack) based on user settings.
- Validates inputs, handles errors gracefully, and logs task data for persistence and auditing.

### Workflow Process
1. The **Webhook Entry Point** node receives task creation requests from users via chat (e.g., Telegram bot), including details like user ID, task description, and channel preferences.
2. The **Input Validation** node checks for required fields (e.g., user ID, task description); if validation fails, it routes to the **Error Response** node.
3. The **Save to Database** node stores validated task data securely in a database (e.g., PostgreSQL, MongoDB, or MySQL) for persistence.
4. The **Success Response** node (part of Response Handlers) returns a confirmation message to the user in JSON format.
5. The **Schedule Trigger** node runs every 3 minutes to check for pending reminders (with a 5-minute buffer each hour to avoid duplicates).
6. The **Fetch Due Tasks** node queries the database for tasks due within the check window (e.g., reminders set for within 3 minutes).
7. The **Tasks Check** node verifies that fetched tasks exist and are eligible for processing.
8. The **Split Items** node processes each due task individually to handle them in parallel without conflicts.
9. The **Channel Router** node directs reminders to the appropriate channel based on task settings (e.g., email, Slack, or Telegram).
10. The **Email Sender** node sends HTML-formatted reminder emails with task details and setup instructions.
11. The **Slack Sender** node delivers Slack messages using webhooks, including task formatting and user mentions.
12. The **Telegram Sender** node sends Telegram messages via the bot API, including task ID, bot setup, and conversation starters.
13. The **Update Task Status** node marks the task as reminded in the database (e.g., updating status to "sent" with a timestamp).
14. The **Workflow Complete!** node finalizes the process, logging completion and preparing for the next cycle.

### Setup Instructions
1. Import the workflow into n8n and configure the Webhook Entry Point with your Telegram bot's webhook URL and authentication.
2. Set up database credentials in the Save to Database and Fetch Due Tasks nodes (e.g., connect to PostgreSQL or MongoDB).
3. Configure channel-specific credentials: a Telegram bot token for Telegram Sender, SMTP details for Email Sender, and a Slack webhook for Slack Sender.
4. Adjust the Schedule Trigger interval (e.g., every 3 minutes) and add any custom due-time logic in Fetch Due Tasks.
5. Test the workflow by sending a sample task creation request via the webhook and simulating due tasks to verify reminders and status updates.
6. Monitor executions in the n8n dashboard and tweak validation rules or response formats as needed for your use case.

### Prerequisites
- Telegram bot set up with webhook integration for task creation and messaging.
- Database service (e.g., PostgreSQL, MongoDB, or MySQL) for task storage and querying.
- Email service (e.g., SMTP provider) and Slack workspace for multi-channel reminders.
- n8n instance with webhooks and scheduling enabled.
- Basic API knowledge for bot configuration and channel routing.

### Modification Options
- Customize the Input Validation node to add fields like priority levels or recurring task flags.
- Extend the Channel Router to include additional channels (e.g., Microsoft Teams or SMS via Twilio).
- Modify the Schedule Trigger to use dynamic intervals based on task urgency or user preferences.
- Enhance the Update Task Status node to trigger follow-up actions, like archiving completed tasks.
- Adjust the Telegram Sender node for richer interactions, such as inline keyboards for task rescheduling.

Explore More AI Workflows: Get in touch with us for custom n8n automation!
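The Fetch Due Tasks window logic described above can be sketched as a small filter. The field names (`status`, `remind_at`) and the "pending" status value are assumptions about the database schema, so adapt them to your table:

```javascript
// Sketch of the due-task window check: with a 3-minute schedule interval,
// a task is due when its reminder time falls within the current window.
// `status` and `remind_at` are assumed schema field names.
function fetchDueTasks(tasks, now, windowMinutes = 3) {
  const windowEnd = now.getTime() + windowMinutes * 60 * 1000;
  return tasks.filter(
    (t) => t.status === "pending" && new Date(t.remind_at).getTime() <= windowEnd
  );
}
```

In the real workflow this selection happens in a database query; filtering only `pending` rows is what prevents duplicate reminders once Update Task Status has marked a task as "sent".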
by isaWOW
Paste any video URL into the form, choose your target languages, and this workflow handles everything else. It sends the video to WayinVideo AI, which automatically generates short vertical clips with translated subtitles in every language you selected. Each clip is downloaded and saved directly to your Google Drive folder — ready to post on TikTok, Instagram Reels, or YouTube Shorts.

Built for content agencies, social media managers, and global brands who want to repurpose videos across multiple languages at scale — without editing a single frame.

### What This Workflow Does
- **Multilingual clip generation** — Sends your video to WayinVideo once and generates a separate set of AI clips per language, all in one run
- **Translated subtitles embedded** — Every clip includes AI-generated captions translated into the target language — no manual subtitle work needed
- **Auto-polling with retry** — Waits 45 seconds, checks if clips are ready, and loops back automatically every 30 seconds until all results arrive
- **Batch clip processing** — Extracts every clip from the results — title, score, tags, description, and timestamps — and processes each one individually
- **Automatic file download** — Downloads each clip video file directly from the WayinVideo export link, no manual clicking required
- **Google Drive upload** — Saves every downloaded clip to your specified Google Drive folder, named using the clip's AI-generated title
- **Team-friendly form** — A simple web form lets anyone on your team submit jobs — no n8n knowledge needed

### Setup Requirements

Tools you'll need:
- Active n8n instance (self-hosted or n8n Cloud)
- WayinVideo account + API key
- Google account connected to n8n via OAuth2

Estimated setup time: 10–15 minutes

### Step-by-Step Setup

1. **Get your WayinVideo API key** — Log in at WayinVideo, go to your account settings or developer section, and copy your API key.
2. **Paste the API key into node "3. WayinVideo — Submit Clipping Task"** — Open this node, find the Authorization header, and replace YOUR_WAYINVIDEO_API_KEY with your actual key.
3. **Paste the API key into node "5. WayinVideo — Get Clips Result"** — Open this node, find the same Authorization header, and replace YOUR_WAYINVIDEO_API_KEY again.

> ⚠️ This key appears in 2 nodes — you must replace it in both "3. WayinVideo — Submit Clipping Task" and "5. WayinVideo — Get Clips Result" or the workflow will fail.

4. **Connect your Google account** — Open node "10. Google Drive — Upload Clip". Click the credential field and connect your Google account via OAuth2. Follow the on-screen prompts to authorise n8n.
5. **Set your Google Drive folder ID** — In node "10. Google Drive — Upload Clip", find the folderId field. Replace YOUR_GDRIVE_FOLDER_ID with your actual folder ID. To find it: open your target Google Drive folder in a browser — the folder ID is the string of letters and numbers at the end of the URL after /folders/.
6. **Activate the workflow** — Toggle the workflow to Active. Open the form URL generated by node "1. Form — Video URL + Languages" and submit a test video with one or two language codes to confirm everything works end to end.

### How It Works (Step by Step)

**Step 1 — Form Trigger (Web Form)**
The workflow starts when someone fills out the web form. They enter four things: the video URL, the brand or project name, the target languages as comma-separated codes (e.g. en,hi,es,fr), and the number of clips to generate per language. This form is hosted by n8n and can be shared with anyone on your team — no technical knowledge required.

**Step 2 — Split by Language (Code)**
The language codes entered in the form are split into individual items — one per language. So if you entered en,hi,es, the workflow creates three separate jobs. Each job carries the video URL, brand name, target language, and clip count. From this point, every language runs through the same steps independently.
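The Step 2 split can be sketched as a small Code-node function. The exact form field names (`languages`, `videoUrl`, `brand`, `clipCount`) are assumptions for illustration; the trimming/lowercasing matches the behaviour described in Troubleshooting:

```javascript
// Sketch of the "Split by Language" Code node: one n8n item per language code.
// Form field names are illustrative assumptions.
function splitByLanguage(form) {
  return form.languages
    .split(",")
    .map((code) => code.trim().toLowerCase()) // tolerate "en, hi" and "EN"
    .filter(Boolean)                          // drop empty entries from stray commas
    .map((lang) => ({
      json: {
        videoUrl: form.videoUrl,
        brand: form.brand,
        target_lang: lang,
        clipCount: form.clipCount,
      },
    }));
}
```

Each returned `{ json: … }` item becomes an independent job, which is what lets every language flow through Steps 3–10 in parallel.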
**Step 3 — Submit to WayinVideo AI**
For each language, the video URL and settings are sent to the WayinVideo API. The request includes clip duration (30–60 seconds), HD 720p resolution, 9:16 vertical ratio for social media, AI reframing enabled, and captions set to display translated text in the target language. WayinVideo processes the video and returns a task ID used to track that specific job.

**Step 4 — Wait 45 Seconds**
The workflow pauses for 45 seconds to give WayinVideo time to process the video before the first status check. This prevents unnecessary requests being sent too early, when results are not yet available.

**Step 5 — Poll for Results**
The workflow calls the WayinVideo results endpoint using the task ID from Step 3. It checks: "Are the clips ready?" and receives either a completed clips array or a status that indicates processing is still in progress.

**Step 6 — Check: Status SUCCEEDED? (YES / NO branch)**
- **YES** — If the status equals SUCCEEDED, the workflow moves forward to extract and process each clip.
- **NO** — If the job is still processing, the workflow routes to a 30-second retry wait, then loops back to Step 5 and polls again. This continues automatically until the clips are ready.

> ⚠️ Infinite Loop Risk: If WayinVideo never returns a SUCCEEDED status — due to an invalid video URL, a private video, or an API error — this loop will run forever. Consider adding a retry counter to stop the loop after a set number of attempts and send an error alert instead.

**Step 7 — Wait 30 Seconds (Retry)**
When clips are not ready yet, the workflow pauses for 30 seconds before polling again. This gap prevents hitting WayinVideo's API rate limits during the retry loop.

**Step 8 — Extract Each Clip (Code)**
Once the status is SUCCEEDED, a code step reads the clips data and splits it into individual items — one per clip. Each item includes the clip title, export download link, AI score, tags, description, and start/end timestamps in milliseconds.
**Step 9 — Download Each Clip File**
For each clip, the workflow fetches the video file from WayinVideo's export link and downloads the binary file into n8n's memory, ready to be saved.

**Step 10 — Upload to Google Drive**
Each downloaded clip file is uploaded to your Google Drive folder. The file is named using the AI-generated clip title from Step 8, so every clip arrives in Drive with a clean, descriptive, ready-to-use filename.

The final result is a Google Drive folder containing all short clips — organised by AI-generated titles — in every language you requested, ready to publish.

### Key Features
- ✅ **Multilingual in one run** — Enter multiple language codes and the workflow generates a full clip set per language automatically — no need to run it separately for each language
- ✅ **Translated captions baked in** — Subtitles are translated and embedded at the WayinVideo stage — clips arrive ready to post with no extra editing
- ✅ **9:16 vertical format** — Every clip is automatically reframed for TikTok, Instagram Reels, and YouTube Shorts — no manual cropping required
- ✅ **AI clip scoring** — WayinVideo ranks clips by engagement potential, so you always get the most shareable moments first
- ✅ **Auto-retry polling** — The workflow keeps checking until clips are ready — you don't need to monitor or manually re-run anything
- ✅ **Smart file naming** — Each clip in Drive is named using its AI-generated title, not a random ID — your folder stays organised and client-ready
- ✅ **Batch processing** — Multiple clips per language are all downloaded and uploaded in a single run
- ✅ **Team-friendly form input** — Anyone on your team can submit a video job through the web form — no access to n8n needed

### Customisation Options

**Generate shorter clips for TikTok**
In node "3. WayinVideo — Submit Clipping Task", change target_duration from DURATION_30_60 to DURATION_15_30 to generate 15–30 second clips better suited to TikTok's shorter format.
**Switch to square format for Twitter/LinkedIn**
In the same Submit node, change ratio from RATIO_9_16 to RATIO_1_1 to produce square clips optimised for Twitter and LinkedIn feeds.

**Add Slack or email notifications**
Insert a Slack or Gmail node after "10. Google Drive — Upload Clip" to automatically ping your team with a message and the Drive folder link every time a new batch of clips is saved.

**Log clip metadata to Google Sheets**
After the upload step, add a Google Sheets "Append Row" node to record each clip's title, score, tags, language, and Drive link in a spreadsheet — useful for content calendars and client reporting.

**Sort clips into language-specific subfolders**
Pass the target_lang value from Step 2 into the folder selection in "10. Google Drive — Upload Clip" to automatically save each language's clips into its own subfolder (e.g. /clips/hi/, /clips/es/).

**Add a retry limit to prevent infinite loops**
Add a Set node before the retry wait to track a counter. Add a second IF node to check whether the counter exceeds 8–10 attempts — if it does, route to a Gmail or Slack node to send an error alert and stop the loop.

### Troubleshooting

**API key not working:**
- Check that you replaced YOUR_WAYINVIDEO_API_KEY in both node "3. WayinVideo — Submit Clipping Task" and node "5. WayinVideo — Get Clips Result"
- Confirm your WayinVideo account is active and the key has not expired
- Make sure there are no extra spaces before or after the key when pasting

**Workflow stuck in the polling loop:**
- Check that the video URL is publicly accessible — private, age-restricted, or geo-blocked videos will not process in WayinVideo
- Open the output of node "5. WayinVideo — Get Clips Result" and inspect the raw response to see if WayinVideo returned an error message
- If clips never arrive, the workflow will loop indefinitely — fix the video URL and re-run

**Wrong language codes entered:**
- Use standard ISO 639-1 two-letter codes only: en, hi, es, fr, pt, ar, de, ja, zh, ko, ru, id
- Do not use full language names like "Hindi" or "Spanish" — these will not be recognised by the WayinVideo API
- Separate codes with commas and no spaces (e.g. en,hi,es) or with spaces after commas — the workflow trims them automatically

**Google Drive upload failing:**
- Make sure the Google OAuth2 credential in "10. Google Drive — Upload Clip" is connected and not expired — reconnect it if needed
- Confirm the folder ID is correct: it should be just the ID string (e.g. 1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs), not the full URL
- Check that your Google account has write permission for that specific folder

**Form not triggering the workflow:**
- Make sure the workflow is set to Active — it will not accept form submissions in inactive mode
- Copy the form URL directly from node "1. Form — Video URL + Languages" by clicking the node and finding the production URL
- If testing inside n8n, use the production URL, not the test URL

### Support
Need help setting this up, or want a custom version built for your team or agency?
📧 Email: info@isawow.com
🌐 Website: https://isawow.com
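The "add a retry limit" customisation described above can be sketched in one small function — carry an attempt counter through the polling loop and bail out once it exceeds the limit. The `MAX_ATTEMPTS` value and status strings other than SUCCEEDED are assumptions to adapt:

```javascript
// Sketch of a retry-limited polling decision for Steps 5-7.
// MAX_ATTEMPTS is an assumed cap (the text suggests 8-10).
const MAX_ATTEMPTS = 10;

function nextPollAction(status, attempt) {
  if (status === "SUCCEEDED") return { action: "extract_clips" };          // Step 8
  if (attempt >= MAX_ATTEMPTS) return { action: "alert_and_stop", attempt }; // Gmail/Slack alert
  return { action: "retry", attempt: attempt + 1 };                         // wait 30s, poll again
}
```

In n8n terms: a Set node stores `attempt`, the existing IF node checks the status, and a second IF node routes to an alert when `attempt >= MAX_ATTEMPTS`, preventing the infinite loop warned about in Step 6.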
by Shreya Bhingarkar
### What it does
This workflow finds local businesses from Google Maps and automatically enriches them with emails, social profiles, AI summaries, and personalized outreach messages — all saved to Google Sheets.

### How it works
1. Searches Google Maps using your custom queries
2. Scrapes each business website for emails and contact info
3. Finds missing social profiles (Instagram, Facebook, LinkedIn, Twitter, TikTok, YouTube)
4. Validates emails and removes invalid ones
5. Uses AI to generate a business summary and services list
6. Writes a personalized outreach message for each lead
7. Scores each lead from 0 to 10 based on digital presence
8. Exports everything clean and organized to Google Sheets

### Who it's for
- Lead generation agencies
- Freelancers offering outreach as a service
- B2B sales and marketing teams
- Automation consultants

### Requirements
- Serper API key
- Google Sheets OAuth
- Email validation API
- Ollama or any LLM endpoint

### How to set up
1. Connect your credentials in n8n
2. Add your Google Sheet ID to the Sheets node
3. Add your search queries, like "bakery in Ohio" or "gyms in Delhi"
4. Hit run and leads start filling your sheet automatically!
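The 0–10 digital-presence score from step 7 could be implemented with a simple weighted checklist. The exact signals and weights this workflow's AI uses aren't specified, so the ones below are illustrative assumptions:

```javascript
// Illustrative 0-10 lead score based on digital-presence signals.
// Field names and point weights are assumptions, not the workflow's actual logic.
function scoreLead(lead) {
  let score = 0;
  if (lead.email) score += 3;      // a validated email is the most valuable signal
  if (lead.website) score += 2;    // has a working website
  score += Math.min(lead.socialProfiles?.length ?? 0, 4); // 1 point per profile, max 4
  if (lead.summary) score += 1;    // enough public info for an AI summary
  return Math.min(score, 10);
}
```

A scheme like this keeps the score interpretable: a lead scoring 8+ has email, website, and several social profiles, so it's worth prioritising for outreach.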
by vinci-king-01
### Breaking News Aggregator with SendGrid and PostgreSQL

> ⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically scrapes multiple government and regulatory websites, extracts the latest policy or compliance-related news, stores the data in PostgreSQL, and instantly emails daily summaries to your team through SendGrid. It is ideal for compliance professionals and industry analysts who need near real-time awareness of regulatory changes impacting their sector.

### Prerequisites
- n8n instance (self-hosted or n8n.cloud)
- ScrapeGraphAI community node installed
- Operational SendGrid account
- PostgreSQL database accessible from n8n
- Basic knowledge of SQL for table creation

### Required Credentials
- **ScrapeGraphAI API Key** – Enables web scraping and parsing
- **SendGrid API Key** – Sends email notifications
- **PostgreSQL Credentials** – Host, port, database, user, and password

### Specific Setup Requirements

| Resource | Requirement | Example Value |
|----------|-------------|---------------|
| PostgreSQL | Table with columns: id, title, url, source, published_at | news_updates |
| Allowed Hosts | Outbound HTTPS access from n8n to target sites & SendGrid endpoint | https://*.gov, https://api.sendgrid.com |
| Keywords List | Comma-separated compliance terms to filter results | GDPR, AML, cybersecurity |

### How it works
Key steps:
1. **Schedule Trigger**: Runs once daily (or at any chosen interval).
2. **ScrapeGraphAI**: Crawls predefined regulatory URLs and returns structured article data.
3. **Code (JS)**: Filters results by keywords and formats them.
4. **SplitInBatches**: Processes articles in manageable chunks to avoid timeouts.
5. **If Node**: Checks whether each article already exists in the database.
6. **PostgreSQL**: Inserts only new articles into the news_updates table.
7. **Set Node**: Generates an email-friendly HTML summary.
8. **SendGrid**: Dispatches the compiled summary to compliance stakeholders.

### Set up steps

Setup time: 15–20 minutes

1. **Install the ScrapeGraphAI node**: From n8n, go to "Settings → Community Nodes → Install", search "ScrapeGraphAI", and install.
2. **Create the PostgreSQL table**:

```sql
CREATE TABLE news_updates (
  id SERIAL PRIMARY KEY,
  title TEXT,
  url TEXT UNIQUE,
  source TEXT,
  published_at TIMESTAMP
);
```

3. **Add credentials**: Navigate to "Credentials" and add ScrapeGraphAI, SendGrid, and PostgreSQL credentials.
4. **Import the workflow**: Copy the JSON workflow and paste it into "Import from Clipboard".
5. **Configure environment variables (optional)**: REG_NEWS_KEYWORDS, SEND_TO_EMAILS, DB_TABLE_NAME.
6. **Set the schedule**: Open the Schedule Trigger node and define your preferred cron expression.
7. **Activate the workflow**: Toggle "Active", then click "Execute Workflow" once to validate all connections.

### Node Descriptions

Core workflow nodes:
- **Schedule Trigger** – Initiates the workflow at the defined interval.
- **ScrapeGraphAI** – Scrapes and parses news listings into JSON.
- **Code** – Filters articles by keywords and normalizes timestamps.
- **SplitInBatches** – Prevents database overload by batching inserts.
- **If** – Determines whether an article is already stored.
- **PostgreSQL** – Executes parameterized INSERT statements.
- **Set** – Builds the HTML email body.
- **SendGrid** – Sends the daily digest email.
Data flow: Schedule Trigger → ScrapeGraphAI → Code → SplitInBatches → If → PostgreSQL → Set → SendGrid

### Customization Examples

**Change keyword filtering**

```javascript
// Code Node snippet
const keywords = ['GDPR', 'AML', 'SOX']; // Add or remove terms
item.filtered = keywords.some(k => item.title.includes(k));
return item;
```

**Switch to a weekly digest** (every Monday at 09:00):

```json
{
  "trigger": {
    "cronExpression": "0 9 * * 1"
  }
}
```

### Data Output Format

The workflow outputs structured JSON data:

```json
{
  "title": "Data Privacy Act Amendment Passed",
  "url": "https://regulator.gov/news/1234",
  "source": "regulator.gov",
  "published_at": "2024-06-12T14:30:00Z"
}
```

### Troubleshooting

Common issues:
- **ScrapeGraphAI node not found** – Install the community node and restart n8n.
- **Duplicate key error in PostgreSQL** – Ensure the url column is marked UNIQUE to prevent duplicates.
- **Emails not sending** – Verify the SendGrid API key and check the account's daily limit.

Performance tips:
- Limit initial scrape URLs to fewer than 20 to reduce run time.
- Increase the SplitInBatches size only if your database can handle larger inserts.

Pro tips:
- Use environment variables to manage sensitive credentials securely.
- Add an Error Trigger node to catch and log failures for auditing purposes.
- Combine with Slack or Microsoft Teams nodes to push instant alerts alongside email digests.
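Building on the keyword-filter snippet above, the Code node is also described as normalizing timestamps. A fuller sketch of both responsibilities together, returning n8n-style `{ json: … }` items (the case-insensitive matching is an assumption; the original snippet matches case-sensitively):

```javascript
// Sketch of the Code node: keyword filter (case-insensitive, an assumption)
// plus published_at normalization to ISO 8601.
const KEYWORDS = ["GDPR", "AML", "cybersecurity"];

function filterArticles(articles, keywords = KEYWORDS) {
  return articles
    .filter((a) =>
      keywords.some((k) => a.title.toLowerCase().includes(k.toLowerCase()))
    )
    .map((a) => ({
      json: { ...a, published_at: new Date(a.published_at).toISOString() },
    }));
}
```

Normalizing the timestamp before the PostgreSQL insert keeps the `published_at TIMESTAMP` column consistent regardless of how each regulator's site formats its dates.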
by vinci-king-01
Social Media Sentiment Analysis Dashboard with AI and Real-time Monitoring 🎯 Target Audience Social media managers and community managers Marketing teams monitoring brand reputation PR professionals tracking public sentiment Customer service teams identifying trending issues Business analysts measuring social media ROI Brand managers protecting brand reputation Product managers gathering user feedback 🚀 Problem Statement Manual social media monitoring is overwhelming and often misses critical sentiment shifts or trending topics. This template solves the challenge of automatically collecting, analyzing, and visualizing social media sentiment data across multiple platforms to provide actionable insights for brand management and customer engagement. 🔧 How it Works This workflow automatically monitors social media platforms using AI-powered sentiment analysis, processes mentions and conversations, and provides real-time insights through a comprehensive dashboard. Key Components Scheduled Trigger - Runs the workflow at specified intervals to maintain real-time monitoring AI-Powered Sentiment Analysis - Uses advanced NLP to analyze sentiment, emotions, and topics Multi-Platform Integration - Monitors Twitter, Reddit, and other social platforms Real-time Alerting - Sends notifications for critical sentiment changes or viral content Dashboard Integration - Stores all data in Google Sheets for comprehensive analysis and reporting 📊 Google Sheets Column Specifications The template creates the following columns in your Google Sheets: | Column | Data Type | Description | Example | |--------|-----------|-------------|---------| | timestamp | DateTime | When the mention was recorded | "2024-01-15T10:30:00Z" | | platform | String | Social media platform | "Twitter" | | username | String | User who posted the content | "@john_doe" | | content | String | Full text of the post/comment | "Love the new product features!" 
| | sentiment_score | Number | Sentiment score (-1 to 1) | 0.85 | | sentiment_label | String | Sentiment classification | "Positive" | | emotion | String | Primary emotion detected | "Joy" | | topics | Array | Key topics identified | ["product", "features"] | | engagement | Number | Likes, shares, comments | 1250 | | reach_estimate | Number | Estimated reach | 50000 | | influence_score | Number | User influence metric | 0.75 | | alert_priority | String | Alert priority level | "High" | 🛠️ Setup Instructions Estimated setup time: 20-25 minutes Prerequisites n8n instance with community nodes enabled ScrapeGraphAI API account and credentials Google Sheets account with API access Slack workspace for notifications (optional) Social media API access (Twitter, Reddit, etc.) Step-by-Step Configuration 1. Install Community Nodes Install required community nodes npm install n8n-nodes-scrapegraphai npm install n8n-nodes-slack 2. Configure ScrapeGraphAI Credentials Navigate to Credentials in your n8n instance Add new ScrapeGraphAI API credentials Enter your API key from ScrapeGraphAI dashboard Test the connection to ensure it's working 3. Set up Google Sheets Connection Add Google Sheets OAuth2 credentials Grant necessary permissions for spreadsheet access Create a new spreadsheet for sentiment analysis data Configure the sheet name (default: "Sentiment Analysis") 4. Configure Social Media Monitoring Update the websiteUrl parameters in ScrapeGraphAI nodes Add URLs for social media platforms you want to monitor Customize the user prompt to extract specific sentiment data Set up keywords, hashtags, and brand mentions to track 5. Set up Notification Channels Configure Slack webhook or API credentials Set up email service credentials for alerts Define sentiment thresholds for different alert levels Test notification delivery 6. Configure Schedule Trigger Set monitoring frequency (every 15 minutes, hourly, etc.) 
- Choose appropriate time zones for your business hours
- Consider social media platform rate limits

**7. Test and Validate**
- Run the workflow manually to verify all connections
- Check Google Sheets for proper data formatting
- Test sentiment analysis with sample content

## 🔄 Workflow Customization Options

**Modify Monitoring Targets**
- Add or remove social media platforms
- Change keywords, hashtags, or brand mentions
- Adjust monitoring frequency based on platform activity

**Extend Sentiment Analysis**
- Add more sophisticated emotion detection
- Implement topic clustering and trend analysis
- Include influencer identification and scoring

**Customize Alert System**
- Set different thresholds for different sentiment levels
- Create tiered alert systems (info, warning, critical)
- Add sentiment trend analysis and predictions

**Output Customization**
- Add data visualization and reporting features
- Implement sentiment trend charts and graphs
- Create executive dashboards with key metrics
- Add competitor sentiment comparison

## 📈 Use Cases
- **Brand Reputation Management**: Monitor and respond to brand mentions
- **Crisis Management**: Detect and respond to negative sentiment quickly
- **Customer Feedback Analysis**: Understand customer satisfaction and pain points
- **Product Launch Monitoring**: Track sentiment around new product releases
- **Competitor Analysis**: Monitor competitor sentiment and engagement
- **Influencer Identification**: Find and engage with influential users

## 🚨 Important Notes
- Respect social media platforms' terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your monitoring keywords and parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider privacy implications and data protection regulations

## 🔧 Troubleshooting

**Common Issues:**
- **ScrapeGraphAI connection errors**: Verify API key and account status
- **Google Sheets permission errors**: Check OAuth2 scope and permissions
- **Sentiment analysis errors**: Review the Code node's JavaScript logic
- **Rate limiting**: Adjust monitoring frequency and implement delays
- **Alert delivery failures**: Check notification service credentials

**Support Resources:**
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Social media platform API documentation
- Sentiment analysis best practices and guidelines
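For reference when debugging the Code node, here is a minimal sketch of what its sentiment-classification logic might look like: mapping a raw score (-1 to 1) to the `sentiment_label` and `alert_priority` columns described above. The thresholds and function name are assumptions for illustration, not the template's actual code.

```javascript
// Hypothetical classification helper: maps a sentiment score in [-1, 1]
// to the sentiment_label and alert_priority columns. Thresholds are
// assumptions; tune them to your alerting needs.
function classifySentiment(score) {
  const label = score > 0.2 ? 'Positive' : score < -0.2 ? 'Negative' : 'Neutral';
  const alertPriority = score <= -0.6 ? 'High' : score <= -0.2 ? 'Medium' : 'Low';
  return { sentiment_label: label, alert_priority: alertPriority };
}

// In an n8n Code node this would typically run over the incoming items, e.g.:
// return items.map(item => ({
//   json: { ...item.json, ...classifySentiment(item.json.sentiment_score) },
// }));
```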
by Daniel
Transform any website into a structured knowledge repository with this intelligent crawler. It extracts hyperlinks from the homepage, filters images from content pages, and aggregates full Markdown-formatted content—perfect for fueling AI agents or building comprehensive company dossiers without manual effort.

## 📋 What This Template Does
This workflow acts as a lightweight web crawler: it scrapes the homepage to discover all internal links (mimicking sitemap extraction), deduplicates and validates them, separates image assets from textual pages, then fetches non-image pages and converts their content to clean Markdown. Results are appended to Google Sheets for easy analysis, export, or integration into vector databases.

- Automatically discovers and processes subpage links from the homepage
- Filters out duplicates and non-HTTP links for efficient crawling
- Converts scraped content to Markdown for AI-ready formatting
- Categorizes and stores images, links, and full content in a single sheet row per site

## 🔧 Prerequisites
- Google account with Sheets access for data storage
- n8n instance (cloud or self-hosted)
- Basic understanding of URLs and web links

## 🔑 Required Credentials

**Google Sheets OAuth2 API Setup**
1. Go to console.cloud.google.com → APIs & Services → Credentials
2. Click "Create Credentials" → Select "OAuth client ID" → Choose "Web application"
3. Add the authorized redirect URI: https://your-n8n-instance.com/rest/oauth2-credential/callback (replace with your n8n URL)
4. Download the client ID and secret, then add them to n8n as the "Google Sheets OAuth2 API" credential type
5. During setup, grant access to Google Sheets scopes (e.g., spreadsheets) and test the connection by listing a sheet

## ⚙️ Configuration Steps
1. Import the workflow JSON into your n8n instance
2. In the "Set Website" node, update the website_url value to your target site (e.g., https://example.com)
3. Assign your Google Sheets credential to the three "Add ... to Sheet" nodes
4. Update the documentId and sheetName in those nodes to your target spreadsheet ID and sheet name/ID
5. Ensure your sheet has the columns: "Website", "Links", "Scraped Content", "Images"
6. Activate the workflow and trigger it manually to test scraping

## 🎯 Use Cases
- **Knowledge base creation**: Crawl a company's site to aggregate all content into Sheets, then export to Notion or a vector DB for internal wikis
- **AI agent training**: Extract structured Markdown from industry sites to fine-tune LLMs on domain-specific data like legal docs or tech blogs
- **Competitor intelligence**: Build dossiers by crawling rival websites, separating assets and text for SEO audits or market analysis
- **Content archiving**: Preserve dynamic sites (e.g., news portals) as static knowledge dumps for compliance or historical research

## ⚠️ Troubleshooting
- **No links extracted**: Verify the homepage has `<a>` tags; test with a simple site like example.com and check the HTTP response in executions
- **Sheet update fails**: Confirm column names match exactly (case-sensitive) and the credential has edit permissions; try a new blank sheet
- **Content truncated**: Google Sheets limits cells to ~50k chars—adjust the `.slice(0, 50000)` in "Add Scraped Content to Sheet" or split into multiple rows
- **Rate limiting errors**: Add a "Wait" node with a 1-2 s delay after "Scrape Links" if the site blocks rapid requests
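If you hit the truncation issue above, one option is to split the scraped Markdown across several rows instead of slicing it. This is a hedged sketch of a chunking helper you could drop into a Code node; the chunk size mirrors the ~50k-character cell limit mentioned above, and the function name is an assumption, not part of the template.

```javascript
// Split long scraped content into cell-sized chunks so each piece fits
// within Google Sheets' per-cell limit, instead of truncating with
// .slice(0, 50000). maxLen is an assumption; adjust as needed.
function chunkContent(text, maxLen = 50000) {
  const chunks = [];
  for (let i = 0; i < text.length; i += maxLen) {
    chunks.push(text.slice(i, i + maxLen));
  }
  // Always return at least one (possibly empty) chunk so a row is written.
  return chunks.length ? chunks : [''];
}
```

Each chunk could then be emitted as its own item so the "Add Scraped Content to Sheet" node writes one row per chunk.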
by ConnectSafely
# Extract LinkedIn Group Members to Google Sheets - Premium & Verified Only using ConnectSafely.AI API

## Who's it for
This workflow is built for sales professionals, community managers, recruiters, and growth marketers who want to extract high-quality leads from LinkedIn groups without the manual grind. It's perfect for anyone who needs to identify decision-makers, founders, and serious professionals within large LinkedIn communities. If you're running targeted outreach campaigns, building prospect lists, researching competitor communities, or looking to connect with verified industry leaders, this automation filters the noise and delivers only Premium and Verified members straight to your spreadsheet.

## How it works
The workflow automates LinkedIn group member extraction by combining pagination handling with intelligent filtering through ConnectSafely.ai's API.

The process flow:
1. Initializes pagination variables with your target group ID
2. Fetches group members in batches of 50 via the ConnectSafely.ai API
3. Filters each batch for Premium OR Verified members only
4. Extracts profile data (name, headline, follower count, profile URL, etc.)
5. Checks if more pages exist and loops back automatically
6. Once complete, splits all members into individual items
7. Appends or updates records in Google Sheets (deduplicates by Profile ID)

The pagination loop handles groups of any size - whether 500 or 50,000 members.
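The pagination loop above can be sketched as a small state update. Note that the field names (`start`, `count`, `members`, `hasMore`) are assumptions about the response shape, made for illustration; check the actual ConnectSafely.ai API reference for the real fields.

```javascript
// Hypothetical pagination-state helper: after each batch, advance the
// offset by the number of members returned and record whether the API
// reports more pages. Field names are assumptions, not documented API.
function nextPageState(state, response) {
  return {
    groupId: state.groupId,
    start: state.start + response.members.length, // advance the offset
    count: state.count,                           // batches of 50, the API maximum
    hasMore: response.hasMore === true,           // loop back while true
  };
}
```

In the workflow, the "If more pages" check would route back to the fetch node while `hasMore` is true, and on to the split/append nodes once it is false.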
## Setup steps

### Step 1: Prepare Your Google Sheet
Structure your Google Sheet with the following columns:

| Column Name | Description | Required |
|------------|-------------|----------|
| Profile ID | Unique LinkedIn profile identifier | Yes |
| First Name | Member's first name | Yes |
| Last Name | Member's last name | Yes |
| Full Name | Combined first and last name | Yes |
| Headline | Professional headline/tagline | Yes |
| Public Identifier | LinkedIn username | Yes |
| Profile URL | Direct link to LinkedIn profile | Yes |
| Follower Count | Number of followers | Yes |
| Is Premium | Premium subscription status | Yes |
| Is Verified | Verification badge status | Yes |
| Relationship Status | Connection degree (1st, 2nd, 3rd) | Yes |

**Pro Tip:** The workflow uses the "Append or Update" operation with Profile ID as the matching column, so running it multiple times won't create duplicates.

### Step 2: Configure ConnectSafely.ai API Credentials

**Obtain an API Key**
1. Log into the ConnectSafely.ai Dashboard
2. Navigate to Settings → API Keys
3. Generate a new API key

**Add a Bearer Auth Credential in n8n**
1. Go to Credentials in n8n
2. Click Add Credential → Header Auth or Bearer Auth
3. Paste your ConnectSafely.ai API key
4. Save the credential

This credential is used by the "Fetch Group Members" HTTP Request node.
### Step 3: Configure Google Sheets Integration

**3.1 Connect Google Sheets Account**
1. Go to Credentials → Add Credential → Google Sheets OAuth2
2. Follow the OAuth flow to connect your Google account
3. Grant access to Google Sheets

**3.2 Configure the "Append to Google Sheets" Node**
1. Open the Append to Google Sheets node
2. Select your Google Sheets credential
3. Enter your Document ID (from the sheet URL)
4. Select the Sheet Name
5. Configure column mapping to match the extracted fields
6. Set Matching Column to Profile ID for deduplication

### Step 4: Set Your Target LinkedIn Group
1. Open the Initialize Pagination node
2. Locate the groupId variable in the code
3. Replace "9357376" with your target group ID

**Finding Your Group ID:**
- Go to your LinkedIn group
- Look at the URL: linkedin.com/groups/XXXXXXX/
- The numbers are your group ID

```javascript
// Change this value to your target group
groupId: "9357376", // Replace with your group ID
```

### Step 5: Test the Workflow
1. Click the Start Workflow manual trigger node
2. Click Test Workflow
3. Verify:
   - The API returns member data correctly
   - Filtering captures only Premium/Verified members
   - Pagination loops for additional pages (if applicable)
   - Google Sheets populates with extracted data

## Customization

**Filter Criteria**
Edit the filter logic in the Process & Filter Members node to adjust:
- **Premium Only**: Remove the isVerified checks to capture only Premium subscribers
- **Verified Only**: Remove the isPremium checks to capture only Verified profiles
- **All Members**: Remove the filter entirely to extract everyone (modify the return statement)
- **Minimum Followers**: Add a follower count threshold for influencer targeting

```javascript
// Example: Filter for Premium members with 1000+ followers
const filteredMembers = members.filter(member => {
  const isPremium = member.isPremium === true;
  const hasMinFollowers = member.followerCount >= 1000;
  return isPremium && hasMinFollowers;
});
```

**Batch Size**
- **Default**: 50 members per API request
- **Adjust**: Modify the count value in the Initialize Pagination node
- **Note**: 50 is the maximum allowed by the API

**Additional Fields**
The API returns more fields than are extracted by default. Edit the Process & Filter Members node to include:
- `creator` - Whether they're a LinkedIn creator
- `badges` - Full list of profile badges
- `fetchedAt` - Timestamp of extraction

## Use Cases
- **Sales Prospecting**: Build targeted prospect lists from industry-specific groups with verified decision-makers
- **Competitor Research**: Analyze who's active in competitor communities and their professional backgrounds
- **Influencer Identification**: Find Premium creators and verified professionals for partnership opportunities
- **Recruiting**: Source passive candidates who are active in professional development groups
- **Event Marketing**: Identify engaged professionals in niche communities for webinar and conference promotion
- **Content Strategy**: Research headlines and titles to understand what resonates in your industry

## Troubleshooting

**Common Issues & Solutions**

- **Issue:** Empty results returned
  **Solution:** Verify you're a member of the target group; the API can only access groups you've joined
- **Issue:** "401 Unauthorized" errors
  **Solution:** Check that your ConnectSafely.ai API key is valid and the Bearer Auth credential is properly configured
- **Issue:** Pagination loop seems infinite
  **Solution:** This is expected behavior until hasMore returns false; large groups may take several minutes to fully process
- **Issue:** Duplicate entries in Google Sheets
  **Solution:** Ensure the "Append or Update" operation is selected with Profile ID as the matching column
- **Issue:** Missing data in certain columns
  **Solution:** Not all profiles have complete data; the workflow handles null values gracefully
- **Issue:** Google Sheets not updating
  **Solution:** Verify OAuth credentials are valid and the sheet/document IDs are correctly configured

## Documentation & Resources

**Official Documentation**
- **ConnectSafely.ai Docs**: https://connectsafely.ai/docs
- **API Reference**: Available in the ConnectSafely.ai dashboard
- **n8n Google Sheets Node**:
https://docs.n8n.io/integrations/builtin/app-nodes/n8n-nodes-base.googlesheets/

**Support Channels**
- **Email Support**: support@connectsafely.ai
- **Documentation**: https://connectsafely.ai/docs
- **Custom Workflows**: Contact us for custom automation

**Connect With Us**
Stay updated with the latest automation tips, LinkedIn strategies, and platform updates:
- **LinkedIn**: linkedin.com/company/connectsafelyai
- **YouTube**: youtube.com/@ConnectSafelyAI-v2x
- **Instagram**: instagram.com/connectsafely.ai
- **Facebook**: facebook.com/connectsafelyai
- **X (Twitter)**: x.com/AiConnectsafely
- **Bluesky**: connectsafelyai.bsky.social
- **Mastodon**: mastodon.social/@connectsafely

**Need Custom Workflows?**
Looking to build sophisticated LinkedIn automation workflows tailored to your business needs? Contact our team for custom automation development, strategy consulting, and enterprise solutions. We specialize in:
- Multi-channel engagement workflows
- AI-powered personalization at scale
- Lead scoring and qualification automation
- CRM integration and data synchronization
- Custom reporting and analytics pipelines
by Tihomir Mateev
# Stop Paying for the Same Answer Twice

Your LLM is answering the same questions over and over. "What's the weather?" "How's the weather today?" "Tell me about the weather." Same answer, three API calls, triple the cost. This workflow fixes that.

## What Does It Do?
Semantic caching with superpowers. When someone asks a question, it checks if you've answered something similar before. Not exact matches—semantic similarity. If it finds a match, boom, instant cached response. No LLM call, no cost, no waiting.

- First time: "What's your refund policy?" → Calls the LLM, caches the answer
- Next time: "How do refunds work?" → Instant cached response (it knows these are the same!)
- Result: Faster responses + way lower API bills

## The Flow
1. Question comes in through the chat interface
2. Vector search checks Redis for semantically similar past questions
3. Smart decision: Cache hit? Return instantly. Cache miss? Ask the LLM.
4. New answers get cached automatically for next time
5. Conversation memory keeps context across the whole chat

It's like having a really smart memo pad that understands meaning, not just exact words.

## Quick Start
You'll need:
- OpenAI API key (for the chat model)
- Hugging Face API key (for embeddings)
- Redis 8.x (for vector magic)

Get it running:
1. Drop in your credentials
2. Hit the chat interface
3. Watch your API costs drop as the cache fills up

That's it. No complex setup, no configuration hell.

## Tune It Your Way
The distanceThreshold in the "Analyze results from store" node is your control knob:
- **Lower (0.2)**: Strict matching, fewer false positives, more LLM calls
- **Higher (0.5)**: Loose matching, more cache hits, occasional weird matches
- **Default (0.3)**: Sweet spot for most use cases

Play with it. Find what works for your questions.
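The threshold decision boils down to a few lines. Here's a minimal sketch of the hit-or-miss logic, assuming the vector store returns nearest-first matches with a `distance` field and a cached `answer`; that result shape is an assumption for illustration, not the node's exact code.

```javascript
// Sketch of the cache-hit decision in "Analyze results from store":
// compare the nearest match's distance against distanceThreshold.
// The { distance, answer } match shape is assumed for illustration.
function resolveCacheHit(matches, distanceThreshold = 0.3) {
  const best = matches[0]; // vector search returns nearest-first
  if (best && best.distance <= distanceThreshold) {
    return { hit: true, answer: best.answer };  // serve cached answer, skip the LLM
  }
  return { hit: false, answer: null };          // cache miss: call the LLM, then cache
}
```

Raising `distanceThreshold` makes the first branch fire more often (more cache hits, occasional weird matches); lowering it does the opposite, exactly as described above.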
## Hack It Up
Some ideas to get you started:
- **Add TTL**: Make cached answers expire after a day/week/month
- **Category filters**: Different caches for different topics
- **Confidence scores**: Show users when they got a cached vs fresh answer
- **Analytics dashboard**: Track cache hit rates and cost savings
- **Multi-language**: Cache works across languages (embeddings are multilingual!)
- **Custom embeddings**: Swap OpenAI for local models or other providers

## Real Talk 💡
When it shines:
- Customer support (same questions, different words)
- Documentation chatbots (limited knowledge base)
- FAQ systems (obvious use case)
- Internal tools (repetitive queries)

When to skip it:
- Real-time data queries (stock prices, weather, etc.)
- Highly personalized responses
- Questions that need fresh context every time

Pro tip: Start with a higher threshold (0.4-0.5) and tighten it as you see what gets cached. Better to cache too much at first than to miss obvious matches.

Built with n8n, Redis, Hugging Face, and OpenAI. Open source, self-hosted, completely under your control.
by Avkash Kakdiya
## How it works
This workflow automates the full offer letter lifecycle, from generation to final candidate response tracking. When a new row with a Pending status is added to Google Sheets, it creates a personalized offer letter using a Google Docs template. The document is converted to PDF, stored in Google Drive, shared securely, and sent to the candidate via Gmail. Candidate responses are captured through webhooks, validated against deadlines, and used to update status and trigger follow-up communications.

### First set: Generate and send offer letter

**Monitor new candidates**
- Google Sheets Trigger – Watches for newly added rows in the offer sheet.
- If – Ensures only rows with a Pending status are processed.

**Generate offer letter**
- Google Drive – Copies the offer letter template for the candidate.
- Google Docs – Replaces placeholders with candidate-specific data.
- Google Drive – Converts the document to a PDF file.

**Store and share document**
- Google Drive – Saves the PDF to a designated folder.
- Google Drive – Assigns public read-only access to the file.

**Send offer email**
- Gmail – Sends the offer email with Accept and Decline action links.
- Google Sheets – Updates the sheet with the offer link, date, and Sent status.

### Second set: Capture response and finalize outcome

**Capture candidate response**
- Webhook – Receives the candidate's decision from the email buttons.
- Google Sheets – Fetches the corresponding candidate record.
- If – Checks whether the response is already locked.
- Code – Validates the decision against the response deadline.

**Finalize outcome**
- Switch – Routes Accepted, Rejected, or Timeout outcomes.
- Google Sheets – Updates the final status and locks further responses.
- Gmail – Sends confirmation emails to the candidate and HR.
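The deadline-validation Code node might look something like the sketch below: it produces the three outcomes (Accepted, Rejected, Timeout) that the Switch node routes. The function name, field names, and 7-day default are assumptions for illustration, not the workflow's actual code.

```javascript
// Hypothetical validation step: turn the candidate's decision and the
// offer date into one of the Switch node's three outcomes.
// deadlineDays and all names here are assumptions.
function validateDecision(decision, offerDateISO, deadlineDays = 7, now = new Date()) {
  const deadline = new Date(offerDateISO);
  deadline.setDate(deadline.getDate() + deadlineDays); // response window
  if (now > deadline) return 'Timeout';                // too late: lock as timeout
  return decision === 'accept' ? 'Accepted' : 'Rejected';
}
```

Passing `now` in as a parameter keeps the logic testable; in the workflow it would simply default to the current time.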
by khaled
## What Problem Does It Solve?
HR managers waste hours manually logging attendance, calculating work hours, and tracking salary advances in spreadsheets. Onboarding new staff often involves messy paperwork or scattered chat messages. Salary calculations are prone to errors when manually tallying absence, overtime, and penalties.

This workflow solves these problems by:
- Allowing employees to check in/out and request loans via a simple Telegram chat.
- Automatically calculating work hours and applying penalties for early departure.
- Registering new employees through an interactive AI chat.
- Generating instant financial reports including net salary, deductions, and overtime.

## How to Configure It

**Google Sheets Setup**
- Create a Google Sheet with two tabs: one for "Employee Data" (columns: ID, Name, Role, Branch, Salary) and one for "Logs" (attendance history).
- Connect your Google Sheets OAuth2 credentials in n8n and select this sheet in all related nodes.

**Telegram Setup**
- Create a new bot via BotFather and connect your Telegram credentials.
- (Optional) Set the "Admin Chat ID" in the Daily Report node to receive absentee lists.

**AI Setup**
- Add your OpenAI API key (used for intent classification and conversational agents).
- The included prompts are currently in Egyptian Arabic — you can translate them to English or your local language in the "System Message" of the Agent nodes.

**Timezone**
- Adjust the timezone in the "Context" section of the AI Agent nodes to match your company's location (currently set to Africa/Cairo).

## How It Works
1. The Telegram Trigger catches every message sent to the HR Bot.
2. An AI Classifier analyzes the text to determine intent:
   - **New Employee**: Triggers a conversation to collect Name, ID, Salary, and Branch → validates the data → adds it to the "Employee Data" sheet.
   - **Attendance (Check-in)**: Logs the timestamp immediately.
   - **Departure (Check-out)**: Looks up the arrival time and calculates total hours worked. If under 8 hours, it flags "Early Departure" and logs it with a penalty note.
   - **Financial Request**: Logs salary advances (loans) directly to the sheet.
   - **Report Request**: An AI Analyst calculates net salary (Basic + Overtime - Absence - Advances) and replies with a detailed breakdown.
3. Daily Schedule:
   - Every day at 1:00 PM, the workflow compares today's attendance logs against the full employee list.
   - It generates a list of absentees and sends a summary report to the HR Manager.

## Customization Ideas
- **Change the Logic**: Edit the JavaScript node to change the "8-hour work day" rule to match your company policy.
- **Multi-Platform**: Swap the Telegram Trigger for WhatsApp or Slack to match your team's communication tool.
- **PDF Payslips**: Add a node to generate a PDF payslip based on the AI's financial report and email it to the employee.
- **Face Recognition**: Integrate an image analysis node if you want employees to send a selfie for attendance verification.

For more info, Contact Me.
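The check-out calculation with the 8-hour rule can be sketched as below. The 8-hour constant matches the workflow's stated default; the function name and timestamp format are assumptions for illustration, and this is the piece you would edit to change the work-day policy.

```javascript
// Illustrative check-out logic: compute hours worked between check-in and
// check-out, and flag early departure against the 8-hour work day.
// Names and timestamp format are assumptions.
const WORKDAY_HOURS = 8; // edit this to match your company policy

function computeShift(checkInISO, checkOutISO) {
  const hours = (new Date(checkOutISO) - new Date(checkInISO)) / 3600000; // ms → hours
  return {
    hoursWorked: Math.round(hours * 100) / 100, // rounded to 2 decimals for the log
    earlyDeparture: hours < WORKDAY_HOURS,      // triggers the penalty note
  };
}
```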