by Lorena
This workflow gets data from an API and exports it into Google Sheets and a CSV file.
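The export step of this pattern is simple enough to sketch outside n8n. Below is a minimal, hypothetical Python version of "rows from an API become a CSV"; the field names are illustrative, not taken from the workflow itself:

```python
import csv
import io

def rows_to_csv(rows):
    """Serialize a list of flat dicts (e.g. API records) to CSV text,
    using the first record's keys as the header row."""
    if not rows:
        return ""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

In the actual workflow the same job is done by the built-in Spreadsheet File / Google Sheets nodes, so no code is required.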
by Anchor
Try It Out! This n8n template shows how to enrich a Google Spreadsheet with emails and phone numbers automatically, based on a list of websites, using the Apify Email & Phone Extractor actor from Anchor. It will create a new sheet with the extracted contact details. You can use it to build lead enrichment workflows, keep CRM records current, or prep outreach lists, all directly inside n8n. Who is this for Sales Teams: Surface contacts for target accounts fast. Recruiters: Find contact details on company sites. Growth Marketers: Clean and enrich prospect lists at scale. Researchers: Map industries and orgs with real contacts. CRM Builders: Auto-populate contact fields from the web. Lead-Gen Agencies: Deliver verified contact data at volume. How it works The workflow starts with a list of website URLs or domains in Google Sheets (one per row). The Apify node runs the Email & Phone Extractor to collect emails, phone numbers, and social links (e.g., Twitter, LinkedIn, Instagram) from those sites. The results are written to a new Google Sheet with the found contacts and their source pages. How to use In Google Sheets: Create a Google Sheet, rename the sheet websites, and add all the domains or URLs you want to scan (one per row). In this Workflow: Open “Set google sheet URL & original sheet name” and replace the example Google Sheet URL and the name of the source sheet (e.g., websites). In the n8n credentials: Connect your Google Sheets account with read and write privileges. Connect your Apify account. In Apify: Sign up for this Apify Actor. Speed The step "Run actor on Apify" can take a long time because it needs to scrape many pages on each website you provide. If you want to keep track of its progress, you can check the logs of the Apify run directly! Requirements Apify account with access to Email & Phone Extractor. A list of domains or URLs to process. Need Help? Open an issue directly on Apify! Avg answer in less than 24h. Happy Contact Discovery!
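The actor handles all the crawling and extraction for you. For intuition only, a naive version of the extraction step it performs on each page might look like the sketch below; the regexes are deliberately simplified and would miss cases a production extractor catches:

```python
import re

# Loose patterns for illustration only; real extractors are far stricter.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
# Phone sketch: optional leading +, 8+ digits with common separators.
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{6,}\d")

def extract_contacts(page_text):
    """Pull candidate emails and phone numbers out of raw page text."""
    emails = sorted(set(EMAIL_RE.findall(page_text)))
    phones = sorted(set(m.strip() for m in PHONE_RE.findall(page_text)))
    return {"emails": emails, "phones": phones}
```

The workflow itself never needs this code: the Apify node returns already-structured contact records that are appended straight to the results sheet.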
by dev
Every 10 minutes, check whether a new article was starred; if so, save it in Wallabag to read later.
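The "only new articles" part of a polling trigger like this boils down to remembering what was already saved. A minimal sketch, assuming each feed entry carries `id` and `starred` fields (those names are assumptions, not taken from the workflow):

```python
def new_starred(entries, seen_ids):
    """Return entries that are starred and not yet processed,
    updating seen_ids so the next poll skips them."""
    fresh = [e for e in entries if e.get("starred") and e["id"] not in seen_ids]
    seen_ids.update(e["id"] for e in fresh)
    return fresh
```

In n8n this bookkeeping is typically handled by the trigger node's own state, so the pattern is implicit rather than hand-written.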
by Harshil Agrawal
This workflow allows you to create a new list, add a new contact to that list, update the contact, and get all contacts in the list using the Autopilot node. Autopilot node: This node will create a new list called n8n-docs in Autopilot. Autopilot1 node: This node creates a new contact and adds it to the list created in the previous node. Autopilot2 node: This node updates the information of the contact that we created in the previous node. Autopilot3 node: This node returns all the contacts of the n8n-docs list that we created using the Autopilot node.
by Davide
This workflow automates the process of generating and publishing a talking avatar video. It takes a video and an audio file as inputs, then uses the Infinitalk API to create a lip-synced avatar that naturally matches the audio. Once generated, the video is: Retrieved and stored (Google Drive upload). Optimized with an AI-generated YouTube title for better SEO. Automatically uploaded to YouTube and TikTok via the Upload-Post API. Key Advantages ✅ Full Automation: From video generation to social media publishing, everything is handled automatically. ✅ AI-Powered Content Optimization: Uses GPT-based AI to create SEO-friendly and catchy YouTube titles, increasing reach and engagement. ✅ Multi-Platform Publishing: Videos are instantly shared across YouTube and TikTok, saving time on manual uploads. ✅ Seamless Storage Integration: Generated videos are saved to Google Drive for archiving or reuse. ✅ Scalable Workflow: Can be reused for multiple videos, podcasts, tutorials, or marketing clips with minimal effort. ✅ Natural Lip-Sync: Infinitalk API ensures avatars sync realistically with audio, creating professional-looking videos. ✅ Customizable: Resolution, number of frames, and acceleration options can be adjusted for different needs (e.g., faster processing vs. higher quality). How It Works The process begins with a form submission where users provide a video URL (for the avatar), an audio URL (for lip-sync), and a text prompt describing the video content. 
The system then: Video Generation: Sends the inputs to Fal.ai's Infinitalk API, which creates a talking avatar video where the character lip-syncs to the provided audio with natural facial expressions. Status Monitoring: Continuously checks the processing status every 60 seconds until the video generation is completed. Title Generation: Uses OpenAI's GPT model to create an SEO-optimized YouTube title based on the original prompt. File Handling: Downloads the generated video and uploads it to Google Drive for storage. Multi-Platform Distribution: Automatically publishes the video to both YouTube and TikTok using the Upload-Post service. The workflow includes conditional logic to ensure each step executes in the correct order and only proceeds when previous operations are successful. Set Up Steps Step 1: Configure Fal.ai API Create an account at Fal.ai and obtain your API key. In the "Create Video" node, set up Header Authentication with: Name: "Authorization" Value: "Key YOURAPIKEY" Step 2: Configure Upload-Post Service Get your API key from Upload-Post Manage API Keys. Set up the authentication header in both upload nodes: Name: Authorization Value: Apikey YOUR_API_KEY_HERE Create social media profiles in Upload-Post and replace "YOUR_USERNAME" in the upload nodes with your profile name. Note: Free plan supports all platforms except TikTok (requires paid upgrade). Need help customizing? Contact me for consulting and support or add me on LinkedIn.
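The status-monitoring step is a classic poll-until-complete loop. A minimal sketch of the pattern, where `fetch_status` is a stand-in for the HTTP request the workflow makes to the Fal.ai status endpoint (the status strings and 60-second interval mirror the description; exact API values may differ):

```python
import time

def wait_until_complete(fetch_status, interval=60, max_checks=60, sleep=time.sleep):
    """Poll fetch_status() until it reports COMPLETED, pausing `interval`
    seconds between checks (mirrors the workflow's 60-second wait node)."""
    for _ in range(max_checks):
        status = fetch_status()
        if status == "COMPLETED":
            return True
        if status == "FAILED":
            raise RuntimeError("video generation failed")
        sleep(interval)
    raise TimeoutError("gave up waiting for the render")
```

In n8n the same loop is built from a Wait node, an HTTP Request node, and an If node that routes back until the status check passes.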
by Nima Salimi
Overview This n8n workflow automatically retrieves the monthly CrUX (Chrome User Experience) Report from Google BigQuery and updates the data in NocoDB. It removes the previous month’s data before inserting the new dataset, ensuring your database always contains the latest CrUX rankings for website origins. The flow is fully automated, using schedule triggers to handle both data cleanup and data insertion each month. ✅ Tasks ⏰ Runs automatically on a monthly schedule 🔢 Converts the month name to a numeric value for table selection 🧹 Deletes last month’s CrUX data from NocoDB 🌐 Queries Google BigQuery for the latest monthly dataset 💾 Inserts the new CrUX rankings into NocoDB ⚙️ Keeps your database up to date with zero manual effort 🛠 How to Use 1️⃣ Set Up BigQuery Access Connect your Google BigQuery credentials. Make sure your project includes access to the chrome-ux-report public dataset. 2️⃣ Adjust the Query In the Google BigQuery node, change the LIMIT value to control how many top-ranked sites are retrieved. Ensure the {{ $json.table }} field correctly references the dataset for the desired month (e.g., 202509). 3️⃣ Prepare NocoDB Table Create a table in NocoDB with fields: origin, crux_rank, and any additional metadata you wish to track. 4️⃣ Schedule Automation The workflow includes two Schedule Trigger nodes: One runs the data cleanup process (deletes last month). One runs the data insertion for the new month. 5️⃣ Run or Activate the Workflow Activate it to run automatically each month. You can also run it manually to refresh data on demand. 📋 Prerequisites Before running this workflow, make sure you complete the following setup steps: 🧱 Enable BigQuery API Go to Google Cloud Console → APIs & Services Enable the BigQuery API for your project. 
📊 Access the Chrome UX Report Dataset In BigQuery, search for “Chrome UX Report” in the Marketplace or go directly to: https://console.cloud.google.com/marketplace/product/chrome-ux-report/chrome-ux-report Click “View Dataset” and add it to your BigQuery project. 🔑 Connect BigQuery to n8n In n8n, create credentials for your Google BigQuery account using Service Account Authentication. Ensure the account has permission to query the chrome-ux-report dataset. 🗄️ Create a NocoDB Table In NocoDB, create a new table to store your CrUX data with the following fields: origin → Short text crux_rank → Number ⚙️ Connect NocoDB to n8n Use your NocoDB API Token to connect and allow the workflow to read/write data. What is CrUX Rank? CrUX Rank (Chrome User Experience Rank) is a metric from Google’s Chrome UX Report (CrUX) dataset that indicates a website’s popularity based on real user visits. It reflects how frequently an origin (website) is loaded by Chrome users around the world. A lower rank number means the site is more popular (e.g., rank 1 = top site). The data is collected from anonymized Chrome usage statistics, aggregated monthly. This rank helps you track site popularity trends and compare your domain’s visibility over time.
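The month-to-table conversion the workflow performs can be sketched as a small function: CrUX datasets are named `YYYYMM`, and a monthly run should query the most recently published month, i.e. the month before the run date. This is a hedged sketch of that logic, not the exact expression used in the n8n node:

```python
from datetime import date

def crux_table_for_previous_month(today):
    """Return the CrUX dataset id (YYYYMM) for the month before `today`."""
    first = today.replace(day=1)
    if first.month == 1:
        prev = first.replace(year=first.year - 1, month=12)
    else:
        prev = first.replace(month=first.month - 1)
    return f"{prev.year}{prev.month:02d}"
```

The resulting string is what `{{ $json.table }}` should resolve to in the BigQuery node (e.g. `202509` for a run in October 2025).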
by AppUnits AI
This workflow streamlines your lead management process by automatically capturing form submissions from Jotform, updating Attio CRM, and notifying your team (sales team for example) via Slack — all without manual work. How it works Receive Lead: A new submission is captured from Jotform (name, email, message). Prepare CRM: Checks if the Pending and Urgent deal stages exist in Attio CRM and creates them if they don’t exist. Checks if the Message column exists in Attio CRM and creates it if it doesn't exist. Lead Handling: If the lead doesn't exist in Attio CRM, the contact is created, a new deal is added to the Pending stage, and a Slack notification is sent. If the lead exists but has no deal, a new deal is added to Pending, and Slack is notified. If the lead exists with a deal, the deal is moved to the Urgent stage, and Slack is notified. Slack Notification: Your team (sales team for example) receives an instant Slack message whenever a new or existing lead is processed, so they can act fast. Requirements Make sure to have Jotform, Attio CRM and Slack accounts, then follow this video guide on how to start using this template.
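The three-way lead-handling branch above is the heart of the workflow. A minimal sketch of the routing decision (action names are illustrative labels, not the actual n8n node names):

```python
def route_lead(contact_exists, has_deal):
    """Decide what happens to an incoming Jotform lead, per the
    three cases described: new contact, known contact without a deal,
    and known contact with an existing deal."""
    if not contact_exists:
        return ["create_contact", "create_deal:Pending", "notify_slack"]
    if not has_deal:
        return ["create_deal:Pending", "notify_slack"]
    return ["move_deal:Urgent", "notify_slack"]
```

Note that every branch ends in a Slack notification, which is why the team hears about a lead no matter which path it takes.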
by Anchor
Enrich profiles directly in Google Sheets! This n8n template shows how to enrich a Google spreadsheet with LinkedIn profiles automatically using the Apify LinkedIn Profile Enrichment actor from Anchor. It will create a new sheet with the enriched data. You can use it to build lead enrichment workflows, update your CRM records, or personalize outreach campaigns — all directly inside n8n. Who is this for Sales Teams: Build targeted B2B lead lists fast. Recruiters: Gather candidate data from LinkedIn profiles. Growth Marketers: Enrich outreach lists with LinkedIn info. Researchers: Analyze industries, roles, and company trends. CRM Builders: Auto-populate contact data from LinkedIn. Lead-Gen Agencies: Deliver qualified prospect lists at scale. How it works The workflow starts with a list of LinkedIn profile URLs (you need to set the Google Sheet URL after adding the Google credentials in the n8n settings). The Apify node runs the LinkedIn Profile Enrichment actor to extract structured data such as name, title, company, location, and more. The results are then stored in a new Google Sheet. How to use In Google Sheets: Create a Google Sheet, rename the sheet "profiles", and add all the LinkedIn URLs you want to enrich (one URL per row). In this Workflow: Open "Set google sheet URL & original sheet name" and replace the example Google Sheet URL and the name of the sheet where your LinkedIn URLs are. In the n8n credentials: Connect your Google Sheets account with read and write privileges. Connect your Apify account. In Apify: Sign up for this Apify Actor. Requirements Apify account with access to the LinkedIn Profile Enrichment actor. LinkedIn profile URLs to process. Need Help? Open an issue directly on Apify! Avg answer in less than 24h. Happy Enrichment!
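The last step, writing enrichment results to the new sheet, amounts to mapping each structured record onto a fixed row layout. A sketch, using the fields named in the description (name, title, company, location); the actor's actual output schema may include more or differently named fields:

```python
def profile_to_row(profile, columns=("name", "title", "company", "location")):
    """Map one enrichment result onto a fixed sheet row, blank-filling
    any field the actor didn't return for this profile."""
    return [profile.get(col, "") for col in columns]
```

The Google Sheets node does this mapping declaratively, so in practice you only configure the column-to-field assignments once.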
by WeblineIndia
🧠 Sentiment Analysis of Product Reviews using Google Sheets & OpenAI 🚀 Quick Implementation Steps Automated customer feedback analyzer: **Trigger**: Google Sheets triggers on new product review rows. **Sentiment Analysis**: Review text sent to OpenAI. **Writeback**: Resulting sentiment (Positive, Neutral, Negative) is written back to the sheet. Just connect your credentials and sheet — you're ready to go! 🔍 What It Does This workflow automatically analyzes user-submitted product reviews and classifies them by sentiment using OpenAI’s powerful language models. It eliminates the need to manually sift through feedback by tagging each review with a sentiment score. The sentiment result is then written back to the Google Sheet next to the original review, enabling you to get a fast, clear snapshot of overall customer perception, satisfaction and pain points. Whether you're monitoring 10 or 10,000 reviews, this process scales effortlessly and updates every minute. 👤 Who’s It For This workflow is designed for: **E-commerce teams** collecting user reviews. **Product teams** monitoring customer feedback. **Marketing teams** identifying promotable reviews. **Support teams** watching for negative experiences. **SaaS platforms**, apps, and survey tools managing structured text feedback. ✅ Requirements You’ll need: A Google Sheet with two columns: Review and Sentiment. Google Sheets OAuth2 credentials in n8n. OpenAI API Key (for GPT-4o-mini or GPT-3.5). n8n instance with LangChain and OpenAI nodes enabled. ⚙️ How It Works Google Sheets Trigger: Watches for new rows every minute. OpenAI Integration: Uses LangChain’s Sentiment Analysis node. Passes review text into GPT-4o-mini via the OpenAI Chat Model node. Sheet Update: The sentiment result (Positive, Negative, or Neutral) is written into the Sentiment column in the same row. Sticky Notes included for better visual understanding inside the workflow editor. 🛠️ Steps to Configure and Use 1.
Prepare Your Google Sheet Make sure your sheet is named Sheet1 with the following structure:

| Review               | Sentiment |
|----------------------|-----------|
| Absolutely love it!  |           |
| Not worth the price. |           |

2. Set Up Credentials **Google Sheets**: OAuth2 credentials. **OpenAI**: API Key added via the OpenAI API credential in n8n. 3. Import & Activate Workflow Import the workflow JSON into your n8n instance. Assign the proper credentials to the trigger and OpenAI nodes. Activate the workflow. 🧩 How To Customize 🛎️ Alerting: Add Slack/Email nodes for negative sentiment alerts 🔄 Triggering: Change the polling interval to real-time triggers (e.g., webhook) 📊 Extended Sentiment: Modify sentiment categories (e.g., "Mixed", "Sarcastic") 🧾 Summary Report: Add Cron + Aggregation nodes for daily/weekly summaries 🧠 Prompt Tuning: Adjust system prompt for deeper or context-based sentiment evaluation 🧱 Add‑ons (Optional Features) Email Digest of Negative Reviews Google Drive Logging Team Notification via Slack Summary to Notion, Airtable, or Google Docs 📌 Use Case Examples **Online Stores**: Auto-tag reviews for reputation monitoring **Product Teams**: See which feature releases generate positive or negative buzz **CX Dashboards**: Feed real-time sentiment to internal BI tools **Marketing**: Extract glowing reviews for social proof **Support**: Triage issues by flagging critical comments instantly ...and many more applications wherever text feedback is collected.
🧰 Troubleshooting Guide

| Issue | Possible Cause | Suggested Fix |
|-------|----------------|---------------|
| Sentiment not updating | Sheet credentials missing or misconfigured | Reconnect Google Sheets OAuth2 |
| Blank sentiment | Review column empty or misaligned | Ensure proper column header & value present |
| OpenAI errors | Invalid or expired API key | Regenerate API Key from OpenAI and re-auth |
| Workflow doesn’t run | Polling settings incorrect | Confirm interval & document ID in trigger node |

🤝 Need Help? If you need assistance for ✅ Help setting up this workflow ⚙️ Customizing prompts or output 🚀 Automating your full review pipeline 👉 Contact us today at WeblineIndia. We will be happy to assist.
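One detail worth planning for: the model may wrap its verdict in extra words, but the Sentiment column expects exactly one of three labels. A defensive normalization step, sketched here as plain Python (the Sentiment Analysis node largely handles this for you, so treat this as illustrative):

```python
def normalize_sentiment(model_output):
    """Coerce a free-form model reply into one of the three labels the
    Sentiment column expects, defaulting to Neutral when unclear."""
    text = model_output.strip().lower()
    for label in ("positive", "negative", "neutral"):
        if label in text:
            return label.capitalize()
    return "Neutral"
```

Defaulting to Neutral on unrecognized output keeps a single odd reply from blocking the writeback for the rest of the batch.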
by Hermilio
Query data from two different databases, then handle and unify the results in a single return.
by Sk developer
Automated DA PA Checker Workflow for SEO Analysis Description This n8n workflow collects a website URL via form submission, retrieves SEO metrics like Domain Authority (DA) and Page Authority (PA) using the Moz DA PA Checker API, and stores the results in Google Sheets for easy tracking and analysis. Node-by-Node Explanation On form submission – Captures the website input from the user to pass to the Moz DA PA Checker API. DA PA API Request – Sends the website to the Moz DA PA Checker API via RapidAPI to fetch DA, PA, spam score, DR, and organic traffic. If – Checks if the API request to the Moz DA PA Checker API returned a successful response. Clean Output – Extracts only the useful data from the Moz DA PA Checker API response for saving. Google Sheets – Appends the cleaned SEO metrics to a Google Sheet for record-keeping. Use Cases **SEO Analysis** – Quickly evaluate a website’s DA/PA metrics for optimization strategies. **Competitor Research** – Compare domain authority and organic traffic with competitors. **Link Building** – Identify high-authority domains for guest posting and backlinks. **Domain Purchase Decisions** – Check metrics before buying expired or auctioned domains. Benefits **Automated Workflow** – From input to Google Sheets without manual intervention. **Accurate Metrics** – Uses the trusted **Moz DA PA Checker API** for DA, PA, spam score, DR, and traffic. **Instant Insights** – Get SEO scores in seconds for faster decision-making. **Easy Integration** – Seamless connection between RapidAPI and Google Sheets for data storage.
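The Clean Output node's job can be sketched as a simple field whitelist. The key names below are assumptions inferred from the metrics the description lists (DA, PA, spam score, DR, organic traffic); the real API response keys may differ:

```python
def clean_output(api_response):
    """Keep only the metrics worth logging to the sheet, dropping
    everything else the API returned (field names are assumptions)."""
    keep = ("domain_authority", "page_authority", "spam_score",
            "domain_rating", "organic_traffic")
    return {k: api_response.get(k) for k in keep}
```

Whitelisting (rather than deleting unwanted keys) means new fields the API adds later never leak into the sheet unexpectedly.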
by Shelly-Ann Davy
Build authentic Reddit presence and generate qualified leads through AI-powered community engagement that provides genuine value without spam or promotion. 🎯 What This Workflow Does: This intelligent n8n workflow monitors 9 targeted subreddits every 4 hours, uses AI to analyze posts for relevance and lead potential, generates authentic helpful responses that add value to discussions, posts comments automatically, and captures high-quality leads (70%+ potential score) directly into your CRM—all while maintaining full Reddit compliance and looking completely human. ✨ Key Features: 6 Daily Checks: Monitors subreddits every 4 hours for fresh content 9 Subreddit Coverage: Customizable list of target communities AI Post Analysis: Determines relevance, intent, and lead potential Intelligent Engagement: Only comments when you can add genuine value Authentic Responses: AI-generated comments that sound human, not promotional Lead Scoring: 0-1.0 scale identifies high-potential prospects (0.7+ captured) Automatic CRM Integration: High-quality leads flow directly to Supabase Rate Limit Protection: 60-second delays ensure Reddit API compliance Native Reddit Integration: Official n8n Reddit node with OAuth2 Beginner-Friendly: 14+ detailed sticky notes explaining every component 🎯 Target Subreddits (Customizable): Insurance & Claims: r/Insurance - General insurance questions r/ClaimAdvice - Claim filing help r/AutoInsurance - Auto coverage discussions r/FloodInsurance - Flood damage queries r/PropertyInsurance - Property coverage Property & Home: r/homeowners - Property issues and claims r/RoofingContractors - Roof damage discussions Financial & Legal: r/PersonalFinance - Insurance decisions r/legaladvice - Legal aspects of claims 🤖 AI Analysis Components: Post Evaluation: Relevance score (0-100%) User intent detection Damage type identification (hail, water, fire, wind) Urgency level (low/medium/high) Lead potential score (0-1.0) Recommended services Engagement opportunity 
assessment Decision Criteria: Should engage? (boolean) Can we add value? (quality check) Is this promotional? (avoid spam) Lead worth capturing? (70%+ threshold) Typical Engagement Rate: 5-10% of analyzed posts (67-135 comments/day) 🔧 Technical Stack: Trigger: Schedule (every 4 hours, 6x daily) Reddit API: Native n8n node with OAuth2 AI Analysis: Supabase Edge Functions Response Generation: AI-powered contextual replies Lead Capture: Supabase CRM integration Rate Limiting: Wait node (60-second delays)
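The decision criteria above combine into two gates: whether to comment at all, and whether the lead clears the 0.7 capture threshold. A minimal sketch of that logic (the analysis field names are illustrative stand-ins for what the Edge Function returns):

```python
def should_capture(analysis):
    """Apply the workflow's gates: engage only when we add value, never
    when the reply would be promotional, and capture to the CRM only
    when lead potential meets the 0.7 threshold."""
    engage = (analysis["should_engage"]
              and analysis["adds_value"]
              and not analysis["is_promotional"])
    capture = engage and analysis["lead_potential"] >= 0.7
    return engage, capture
```

Keeping the engage and capture decisions separate matches the described behavior: many posts earn a helpful comment without ever becoming a CRM record.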