by Mirza Ajmal
Description
This powerful workflow automates the evaluation of new digital tools, websites, or platforms with the goal of assessing their potential impact on your business. By leveraging Telegram for user input, Apify for deep content extraction, advanced AI for contextual analysis, and Google Sheets for personalized data integration and record-keeping, this tool delivers clear, actionable verdicts that help you determine whether a tool is worth adopting or exploring further.

Key Features and Workflow
- User-Friendly Input: Submit URLs of tools or websites directly through Telegram for quick and easy evaluation requests.
- Dynamic Content Extraction: The workflow retrieves detailed content from the submitted URLs using the Apify web crawler, capturing rich data for analysis.
- AI-Powered Cleaning & Analysis: Sophisticated AI models filter out noise, distill meaningful insights, and contextualize findings based on your business profile and goals stored in Google Sheets.
- Personalized Business Context: Integration with Google Sheets brings in your company's specialization, current focus, and strategic objectives to tailor the analysis specifically to your needs.
- Structured Analysis Output: Receive a thorough, structured report including concise summaries, key considerations, business impact, benefits, risks, actionable insights, and an easy-to-understand final verdict on the tool's relevance.
- Decision Support: The tool estimates effort, time to value, urgency, and confidence levels, enabling informed prioritization and strategic decision-making.
- Seamless Communication: Results are sent back via Telegram, ensuring you get timely and direct feedback without needing to leave your messaging app.
- Record Keeping & Tracking: All analyses and decisions are logged automatically into Google Sheets, creating a searchable knowledge base for ongoing reference and reporting.

Setup Instructions for Key Nodes
- Telegram Trigger Node: Configure your Telegram bot API credentials here. Link the bot to your Telegram account to receive messages for URL submissions.
- URL Extraction Node: No credentials needed. This node extracts URLs from incoming messages for processing (see the sketch after this section).
- Apify Web Crawler Node: Go to Apify's website, sign up for an account if you don't have one, and get your API token from your profile's API tokens section. Then paste this token into the Apify node's API Key field in n8n.
- AI Cleaning and Analysis Nodes: Configure OpenRouter or compatible AI service API keys for content processing. Customize prompts or models if desired to align the analysis style.
- Google Sheets Nodes: Connect using your Google account and provide access to the specified Google Sheet. Ensure sheets for Company Details and Analysis Results exist with the proper columns as per this workflow.
- Telegram Reply Node: Use the Telegram bot API credentials to send analysis summaries and verdicts back to users.

Access and Edit the Google Sheet
You can access the Google Sheet used by this workflow here: Access the Google Sheet here. Please make a copy of the sheet to your own Google Drive before connecting it with this workflow. This allows you to customize the sheets, update company information, and manage analysis results securely without affecting the original template.

Extendibility
Beyond manual URL submissions, you can enhance this workflow by scheduling automated daily checks of new product launches from platforms like Product Hunt.
The system can proactively analyze emerging tools and deliver timely updates via Telegram, email, or other channels, helping you stay ahead of innovation effortlessly.
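For reference, the URL Extraction step can be a simple Code node that pulls every link out of the incoming Telegram message, as hinted at in the setup notes above. This is a minimal sketch assuming the Telegram Trigger exposes the message text at `message.text` and that the node runs once for all items; verify the field path against your own trigger output.

```javascript
// Hypothetical sketch of the "URL Extraction" Code node.
// Assumes the Telegram Trigger output exposes the message text at item.json.message.text.
const urlPattern = /https?:\/\/[^\s]+/g;

return $input.all().flatMap(item => {
  const text = item.json.message?.text ?? '';
  const urls = text.match(urlPattern) ?? [];
  // Emit one item per extracted URL so downstream nodes can process each tool separately.
  return urls.map(url => ({ json: { url } }));
});
```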
by Sascha
Having a seamless flow of customer data between your online store and your marketing platform is essential. By keeping your systems synchronized, you can ensure that your marketing campaigns are accurately targeted and effective. The integration between Shopify, a leading e-commerce platform, and Mautic, an open-source marketing automation system, is not available out-of-the-box. However, with an n8n workflow you can bridge this gap.

This template will help you:
- enhance accuracy in marketing lists by ensuring that subscription changes in Shopify are instantly updated in Mautic
- improve compliance with data protection laws by respecting users' subscription preferences across platforms
- achieve integration without the need for additional plugins or software, minimizing complexity and potential points of failure

This template demonstrates the following concepts in n8n:
- working with Shopify in n8n
- control flow with the IF node
- using Webhooks
- validating Webhooks with the Crypto node (see the sketch after this section)
- using the GraphQL node to call the Shopify Admin API

The template consists of two parts:
- Sync Email Subscriptions from Shopify to Mautic
- Sync Email Subscriptions from Mautic to Shopify

How to get started?
- Create a custom app in Shopify and get the credentials needed to connect n8n to Shopify. These are needed for the Shopify Trigger.
- Create Shopify Access Token API credentials in n8n for the Shopify Trigger node.
- Create Header Auth credentials: use X-Shopify-Access-Token as the name and the access token from the Shopify app you created as the value. The Header Auth credentials are necessary for the GraphQL nodes.
- Enable the Mautic API under Configuration > API Settings. After the settings are saved you will have an additional entry in your settings menu to create API credentials for n8n.
- Create Mautic credentials in n8n.

Please make sure to read the notes in the template. For a detailed explanation, check the corresponding video: https://youtu.be/x63rrh_yJzI
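For context, validating a Shopify webhook comes down to recomputing an HMAC-SHA256 of the raw request body with your custom app's API secret and comparing it to the X-Shopify-Hmac-Sha256 header. The template does this with the Crypto node and an IF node; the snippet below is only an equivalent plain-JavaScript sketch of that check, and the variable names are illustrative.

```javascript
// Hedged sketch of the check the Crypto + IF nodes perform, written as plain Node.js.
// `rawBody` must be the unparsed webhook payload and `secret` the custom app's API secret.
const crypto = require('crypto');

function isValidShopifyWebhook(rawBody, hmacHeader, secret) {
  const digest = crypto
    .createHmac('sha256', secret)
    .update(rawBody, 'utf8')
    .digest('base64');

  // Shopify sends its signature in the X-Shopify-Hmac-Sha256 header (base64-encoded).
  if (!hmacHeader || hmacHeader.length !== digest.length) return false;
  return crypto.timingSafeEqual(Buffer.from(digest), Buffer.from(hmacHeader));
}
```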
by Miquel Colomer
Do you want to avoid bounces in your email marketing campaigns? This workflow verifies emails using the uProc.io email verifier. You need to add your uProc credentials (the email and API key found in the Integration section of your uProc account) to n8n. The "Create Email Item" node can be replaced by any other supported service that provides an email value, such as Mailchimp, Calendly, MySQL, or Typeform. The "uProc" node returns a status for each checked email (deliverable, undeliverable, spamtrap, softbounce, ...). The "If" node checks whether the "deliverable" status is present. If it is not, you can mark the email as invalid to discard bounces; if it is, you can safely use the email in your email marketing campaigns. If you need detailed indicators for any email, you can use the "Communication" > "Check Email Exists (Extended)" tool to get advanced information.
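If you prefer doing the status check in code rather than with the If node, a minimal Code-node sketch could look like this; the `status` field path is an assumption, so check it against the actual uProc output.

```javascript
// Minimal sketch of the IF-node logic as an n8n Code node:
// keep only emails that uProc marked as deliverable.
return $input.all().filter(item => item.json.status === 'deliverable');
```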
by Miquel Colomer
Do you want to discover company-related information to enrich a signup process? This workflow enriches any company by name using the uProc Get Company by Name tool. The tool combines Google Maps data and email research on the internet to return results; you get no results if the company has no presence on Google Maps. You need to add your uProc credentials (the email and API key found in the Integration section of your uProc account) to n8n. You can replace the "Create Company Item" node with any other supported service that returns company names and countries, such as HubSpot, Google Sheets, MySQL, or Typeform.

You can set up the uProc node with these parameters:
- country: the country name you want to use.
- name: the name of the company you need to locate.

Every "uProc" node returns the following fields for each located company:
- name: the company's name.
- email: the company's email address.
- cif: the company's CIF (tax ID) number.
- address: the company's formatted address.
- city: the city where the company is located.
- state: the province where the company is located.
- county: the state where the company is located.
- country: the country where the company is located.
- zipcode: the company's zip code.
- phone: the company's phone number.
- website: the company's website.
- latitude: the company's latitude.
- longitude: the company's longitude.

Next, you can save the results to a CRM or Google Sheets, and use the returned email or phone number to launch an email or telemarketing campaign.
by Hans Blaauw
This flow is supported by a Chrome plugin created with Cursor AI. The idea was to create a Chrome plugin and a backend service in n8n to do chart analytics with OpenAI. It's a good sample of how to submit a screenshot from the browser to n8n.

Who is it for?
n8n developers who want to learn about using a Chrome plugin, an n8n webhook, and OpenAI.

What opportunity does it present?
This sample opens up a whole range of n8n-connected Chrome extensions that can analyze screenshots using OpenAI.

What does this workflow do?
The workflow contains:
- a Webhook trigger
- an OpenAI node with GPT-4o-mini and "Analyze Image" selected
- a response node to send back the text created after analysing the screenshot

All of this is needed to talk to the Chrome extension, which is created with Cursor AI. The idea is to visit the tradingview.com crypto charts, click the Chrome plugin, and get back analytics about the shown chart in understandable language, driven by the n8n flow. With the new image analytics capabilities of OpenAI this opens up a world of opportunities.

Requirements/setup
- OpenAI API key
- Cursor AI installed
- The Chrome extension (download available)
- The n8n JSON code (download available)

How to customize it to your needs?
Both the Chrome extension and the n8n flow can be adapted to other websites. You can consider:
- analyzing a financial screen and asking questions about the data shown
- analyzing other charts
- extending the n8n workflow with other AI nodes

With AI and image analytics the sky is the limit, and in some cases it saves you from creating complex API integrations.
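As a rough illustration of the browser side, the extension only needs to capture the visible tab and POST the resulting data URL to the workflow's webhook. The snippet below is a hypothetical sketch, not the actual extension code: the webhook path and payload shape are placeholders, and it assumes the Manifest V3 promise-based API with the "activeTab" permission.

```javascript
// Hypothetical sketch of what the Chrome extension might send to the n8n webhook.
const WEBHOOK_URL = 'https://your-n8n-instance/webhook/chart-analysis'; // placeholder URL

async function sendScreenshot() {
  // Capture the visible tab as a PNG data URL (requires the "activeTab" permission).
  const dataUrl = await chrome.tabs.captureVisibleTab({ format: 'png' });

  // POST the screenshot to the workflow; the Webhook node receives it as JSON.
  const response = await fetch(WEBHOOK_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ image: dataUrl }),
  });

  // The workflow's respond node returns the OpenAI analysis as plain text.
  return response.text();
}
```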
by Angel Menendez
CallForge - AI-Powered Product Insights Processor from Sales Calls
Automate product feedback extraction from AI-analyzed sales calls and store structured insights in Notion for data-driven product decisions.

🎯 Who is This For?
This workflow is designed for:
✅ Product managers tracking customer feedback and feature requests.
✅ Engineering teams identifying usability issues and AI/ML-related mentions.
✅ Customer success teams monitoring product pain points from real sales conversations.
It streamlines product intelligence gathering, ensuring customer insights are structured, categorized, and easily accessible in Notion for better decision-making.

🔍 What Problem Does This Workflow Solve?
Product teams often struggle to capture, categorize, and act on valuable feedback from sales calls. With CallForge, you can:
✔ Automatically extract and categorize product feedback from AI-analyzed sales calls.
✔ Track AI/ML-related mentions to gauge customer demand for AI-driven features.
✔ Identify feature requests and pain points for product development prioritization.
✔ Store structured feedback in Notion, reducing manual tracking and increasing visibility across teams.
This workflow eliminates manual feedback tracking, allowing product teams to focus on innovation and customer needs.

📌 Key Features & Workflow Steps
🎙️ AI-Powered Product Feedback Processing
This workflow processes AI-generated sales call insights and organizes them in Notion databases:
- Triggers when AI sales call data is received.
- Detects product-related feedback (feature requests, bug reports, usability issues).
- Extracts key product insights, categorizing feedback based on customer needs.
- Identifies AI/ML-related mentions, tracking customer interest in AI-driven solutions.
- Aggregates feedback and categorizes it by sentiment (positive, neutral, negative).
- Logs insights in Notion, making them accessible for product planning discussions.

📊 Notion Database Integration
- Product Feedback → Logs feature requests, usability issues, and bug reports.
- AI Use Cases → Tracks AI-related discussions and customer interest in machine learning solutions.

🛠 How to Set Up This Workflow
1. Prepare Your AI Call Analysis Data
- Ensure AI-generated sales call insights are available.
- Compatible with Gong, Fireflies.ai, Otter.ai, and other AI transcription tools.
2. Connect Your Notion Database
Set up Notion databases for:
🔹 Product Feedback (logs feature requests and bug reports).
🔹 AI Use Cases (tracks AI/ML mentions and customer demand).
3. Configure n8n API Integrations
- Connect your Notion API key in n8n under "Notion API Credentials."
- Set up webhook triggers to receive AI-generated sales insights.
- Test the workflow using a sample AI sales call analysis.

🔧 How to Customize This Workflow
💡 Modify Notion Data Structure – Adjust fields to align with your product team's workflow.
💡 Refine AI Data Processing Rules – Customize how feature requests and pain points are categorized.
💡 Integrate with Slack or Email – Notify teams when recurring product issues emerge.
💡 Expand with Project Management Tools – Sync insights with Jira, Trello, or Asana to create product tickets automatically.

⚙️ Key Nodes Used in This Workflow
🔹 If Nodes – Detect if product feedback, AI mentions, or feature requests exist in AI data.
🔹 Notion Nodes – Create and update structured feedback entries in Notion.
🔹 Split Out & Aggregate Nodes – Process multiple insights and consolidate AI-generated data.
🔹 Wait Nodes – Ensure smooth sequencing of API calls and database updates.
🚀 Why Use This Workflow?
✔ Eliminates manual sales call review for product teams.
✔ Provides structured, AI-driven insights for feature planning and prioritization.
✔ Tracks AI/ML mentions to assess demand for AI-powered solutions.
✔ Improves product development strategies by leveraging real customer insights.
✔ Scalable for teams using n8n Cloud or self-hosted deployments.

This workflow empowers product teams by transforming sales call data into actionable intelligence, optimizing feature planning, bug tracking, and AI/ML strategy. 🚀
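To make the aggregation step above concrete, the sentiment grouping could be expressed as a single n8n Code node along these lines. This is an illustrative sketch only: the field names (`sentiment`, `type`, `text`) are assumptions about the AI output, and the actual template uses If, Split Out, and Aggregate nodes rather than this code.

```javascript
// Illustrative sketch: group AI-extracted product feedback by sentiment before logging to Notion.
// Assumes each incoming item carries AI output shaped like { type, sentiment, text }.
const buckets = { positive: [], neutral: [], negative: [] };

for (const item of $input.all()) {
  const { sentiment = 'neutral', type, text } = item.json;
  // Unknown sentiment values fall back to the neutral bucket.
  (buckets[sentiment] ?? buckets.neutral).push({ type, text });
}

// One item per sentiment bucket, ready to be written to the Notion Product Feedback database.
return Object.entries(buckets).map(([sentiment, feedback]) => ({
  json: { sentiment, count: feedback.length, feedback },
}));
```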
by Angel Menendez
CallForge - AI-Powered Marketing Insights Extraction from Sales Calls
Automate marketing intelligence gathering from AI-analyzed sales calls and store insights in Notion.

🎯 Who is This For?
This workflow is designed for:
✅ Marketing teams looking to extract trends and insights from sales conversations.
✅ Product managers who need direct customer feedback from sales calls.
✅ Revenue operations (RevOps) teams optimizing AI-driven call analysis.
It streamlines AI-powered marketing intelligence, identifying customer pain points, competitor mentions, and recurring trends, all automatically stored in Notion.

🔍 What Problem Does This Workflow Solve?
Manually reviewing sales call transcripts for marketing insights is time-consuming and inconsistent. With CallForge, you can:
✔ Extract key marketing insights from AI-analyzed sales calls.
✔ Track recurring discussion topics across multiple conversations.
✔ Generate actionable marketing recommendations for strategy and content.
✔ Store structured insights in Notion for seamless access.
This automation eliminates manual work and ensures marketing teams get data-driven insights from real customer conversations.

📌 Key Features & Workflow Steps
🎙️ AI-Driven Marketing Insights Processing
This workflow processes AI-generated sales call insights and organizes them in Notion databases:
- Triggers when AI sales call data is received.
- Identifies marketing-related data (trends, customer pain points, competitor mentions).
- Extracts key marketing insights, categorizing product discussions and recurring topics.
- Logs trends across multiple calls, ensuring marketing teams spot recurring themes.
- Processes actionable insights, capturing marketing strategy recommendations.
- Stores all findings in Notion, enabling structured, searchable insights.

📊 Notion Database Integration
- Marketing Insights → Logs key trends and product mentions from sales calls.
- Recurring Topics → Tracks frequently discussed themes across calls.
- Actionable Recommendations → Stores AI-generated recommendations for marketing teams.

🛠 How to Set Up This Workflow
1. Prepare Your AI Call Analysis Data
- Ensure AI-generated sales call insights are available.
- Compatible with Gong, Fireflies.ai, Otter.ai, and other AI transcription tools.
2. Connect Your Notion Database
Set up Notion databases for:
🔹 Marketing Insights (logs trends and product mentions)
🔹 Recurring Topics (tracks frequently discussed customer concerns)
🔹 Actionable Recommendations (stores marketing strategy insights)
3. Configure n8n API Integrations
- Connect your Notion API key in n8n under "Notion API Credentials."
- Set up webhook triggers to receive AI-generated sales insights.
- Test the workflow using a sample AI sales call analysis.

🔧 How to Customize This Workflow
💡 Modify Notion Data Structure – Adjust fields to match marketing strategy needs.
💡 Refine AI Data Processing Rules – Customize what insights are extracted and logged.
💡 Integrate with Slack or Email – Notify teams when key marketing trends emerge.
💡 Expand CRM Integration – Sync insights with HubSpot, Salesforce, or Pipedrive.
The CallForge series:
- CallForge - 01 - Filter Gong Calls Synced to Salesforce by Opportunity Stage
- CallForge - 02 - Prep Gong Calls with Sheets & Notion for AI Summarization
- CallForge - 03 - Gong Transcript Processor and Salesforce Enricher
- CallForge - 04 - AI Workflow for Gong.io Sales Calls
- CallForge - 05 - Gong.io Call Analysis with Azure AI & CRM Sync
- CallForge - 06 - Automate Sales Insights with Gong.io, Notion & AI
- CallForge - 07 - AI Marketing Data Processing with Gong & Notion
- CallForge - 08 - AI Product Insights from Sales Calls with Notion

⚙️ Key Nodes Used in This Workflow
🔹 If Nodes – Detect if marketing insights, recurring topics, or recommendations exist in AI data.
🔹 Notion Nodes – Create and update entries in Notion databases.
🔹 Split Out & Aggregate Nodes – Process multiple insights and consolidate AI-generated data.
🔹 Wait Nodes – Ensure smooth sequencing of API calls and database updates.

🚀 Why Use This Workflow?
✔ Eliminates manual sales call review for marketing teams.
✔ Provides structured, AI-driven insights for marketing and product strategy.
✔ Tracks competitor mentions and customer pain points automatically.
✔ Improves content marketing and campaign planning with real customer insights.
✔ Scalable for teams using n8n Cloud or self-hosted deployments.

This workflow empowers marketing teams by transforming sales call data into actionable intelligence, streamlining strategy, content planning, and competitor analysis. 🚀
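As a rough illustration of the recurring-topics logic in this marketing workflow, counting theme frequency across calls could be done in one Code node like the sketch below. The `topics` field name is an assumption about the AI output; the template itself relies on Split Out and Aggregate nodes.

```javascript
// Hedged sketch of consolidating recurring topics across calls; field names are assumptions.
// Each incoming item is expected to look like { topics: ["pricing", "integrations", ...] }.
const counts = {};

for (const item of $input.all()) {
  for (const topic of item.json.topics ?? []) {
    counts[topic] = (counts[topic] ?? 0) + 1;
  }
}

// Emit topics sorted by frequency so the Notion "Recurring Topics" database shows the hottest themes first.
return Object.entries(counts)
  .sort((a, b) => b[1] - a[1])
  .map(([topic, mentions]) => ({ json: { topic, mentions } }));
```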
by tanaypant
This workflow is the third of three. You can find the other workflows of the series here: Incident Response Workflow - Part 1, Incident Response Workflow - Part 2, Incident Response Workflow - Part 3.

We have the following nodes in the workflow:
- Webhook node: This trigger node listens for the event fired when the Resolve button is clicked.
- PagerDuty node: This node changes the status of the incident report from Acknowledged to Resolved in PagerDuty.
- Jira Software node: This node moves the incident issue to Done.
- Mattermost node: This node publishes a message in the auxiliary channel mentioning that the incident has been marked as resolved in PagerDuty and Jira.
- Mattermost node: This node publishes a message in the specified Incidents channel that the incident has been resolved by the on-call team.
by Humble Turtle
Github Deployer Agent

Overview
The Github Deployer Agent is an intelligent automation tool that integrates with Slack to streamline code deployment workflows. Powered by Anthropic's Claude 3.5 and Tavily for web search, it enables seamless, context-aware file pushes to a GitHub repository with minimal user input.

Capabilities
- Accepts natural language via Slack
- Automatically pushes code to a default GitHub repository
- Uses Claude 3.5 for code generation and decision-making
- Leverages Tavily for real-time web search to enhance context
- Supports folder structure hints to ensure clean and organized repositories

Required Connections
To operate correctly, the following integrations must be in place:
- Slack API Token with permission to read messages and post responses
- GitHub Personal Access Token with repo write permissions
- Tavily API Key for external search functionality
- Claude 3.5 API access via Anthropic
Detailed configuration instructions are provided in the workflow.

Example Input
From Slack, you can send messages like:
"Generate a basic README.md for my Python project and store it in the root directory."

Customising This Workflow
You can tailor the workflow by:
- Modifying default folder paths or repository settings
- Integrating a Jira node to use issue keys as default folder names
- Adding a Slack file upload option
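Under the hood, pushing a file to GitHub comes down to a single call to the repository contents endpoint. The sketch below shows that call in plain JavaScript for context only; the agent in this template drives GitHub through its configured node, and the owner, repo, path, and token names here are placeholders.

```javascript
// Sketch of the underlying GitHub REST call for creating a file in a repository.
// All parameter values are placeholders; updating an existing file additionally requires its current `sha`.
async function pushFile({ owner, repo, path, content, message, token }) {
  const response = await fetch(`https://api.github.com/repos/${owner}/${repo}/contents/${path}`, {
    method: 'PUT',
    headers: {
      Authorization: `Bearer ${token}`,
      Accept: 'application/vnd.github+json',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      message,                                           // commit message
      content: Buffer.from(content).toString('base64'),  // file body must be base64-encoded
    }),
  });

  if (!response.ok) throw new Error(`GitHub push failed: ${response.status}`);
  return response.json();
}
```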
by Evoort Solutions
TikTok Transcript Generator

Overview
This automated workflow extracts transcripts from TikTok videos by reading video URLs from a Google Sheet, calling the TikTok Transcript Generator API, cleaning the subtitle data, and updating the sheet with transcripts. It efficiently handles batches, errors, and rate limits to provide a seamless transcription process.

Key Features
- Batch processing: Reads and processes multiple TikTok video URLs from Google Sheets.
- Automatic transcript generation: Uses the TikTok Transcript Generator API on RapidAPI.
- Clean subtitle output: Removes timestamps and headers for clear transcripts (see the cleaning sketch after this section).
- Error handling: Marks videos with no available transcript.
- Rate limiting: Implements wait times to avoid API throttling on RapidAPI.
- Seamless Google Sheets integration: Updates the same sheet with transcript results and statuses.

API Used
- TikTok Transcript Generator API

Google Sheet Columns

| Column Name | Description |
|----------------|-----------------------------------------|
| Video Url | URL of the TikTok video to transcribe |
| Transcript | Generated transcript text (updated by workflow) |
| Generated Date | Date when the transcript was generated (YYYY-MM-DD) |

Workflow Nodes Explanation

| Node Name | Type | Purpose |
|--------------------------|-----------------------|-------------------------------------------------------------------|
| When clicking ‘Execute workflow’ | Manual Trigger | Manually starts the entire transcription workflow. |
| Google Sheets2 | Google Sheets (Read) | Reads TikTok video URLs and transcript data from Google Sheets. |
| Loop Over Items | Split In Batches | Processes rows in smaller batches to control execution speed. |
| If | Conditional Check | Filters videos needing transcription (URL present, transcript empty). |
| HTTP Request | HTTP Request | Calls the TikTok Transcript Generator API on RapidAPI to fetch transcripts. |
| If1 | Conditional Check | Checks for valid API responses (handles 404 errors). |
| Code | Code (JavaScript) | Cleans and formats raw subtitle text by removing timestamps. |
| Google Sheets | Google Sheets (Update)| Updates the sheet with cleaned transcripts and generation dates. |
| Google Sheets1 | Google Sheets (Update)| Updates the sheet with a “No transcription available” message on error. |
| Wait | Wait | Adds a delay between batches to avoid API rate limits on RapidAPI. |

Challenges Resolved
- Manual transcription effort: Eliminates the need to manually transcribe TikTok videos, saving time and reducing errors.
- API rate limits: Introduces batching and wait periods to avoid exceeding API usage limits on RapidAPI, ensuring smooth execution.
- Incomplete or missing data: Filters out videos already transcribed and handles missing transcripts gracefully by logging appropriate messages.
- Data formatting issues: Cleans raw subtitle data to provide readable, timestamp-free transcripts.
- Data synchronization: Updates transcripts back into the same Google Sheet row, maintaining data consistency and ease of access.

Use Cases
- Content creators wanting to transcribe TikTok videos automatically.
- Social media analysts extracting text data for research.
- Automation enthusiasts integrating transcript generation into workflows.

How to Use
1. Prepare a Google Sheet with the columns: Video Url, Transcript, and Generated Date.
2. Connect your Google Sheets account in the workflow.
3. Enter your RapidAPI key for the TikTok Transcript Generator API.
4. Execute the workflow to generate transcripts.
5. View transcripts and generated dates directly in your Google Sheet.

Try this workflow to automate your TikTok video transcriptions efficiently! Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n
Save time, stay consistent, and keep your transcription workflow running effortlessly!
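The Code node's cleaning step, referenced in the Key Features above, is essentially "drop everything that is not spoken text." Here is a minimal sketch assuming the API returns WebVTT-style subtitles in a `transcript` field; the real response field may be named differently, so adjust the path to your API output.

```javascript
// Minimal sketch of the subtitle-cleaning Code node.
// Assumes WebVTT-style subtitles arrive at item.json.transcript (an assumption, not the confirmed field name).
return $input.all().map(item => {
  const raw = item.json.transcript ?? '';
  const cleaned = raw
    .split('\n')
    // Drop the WEBVTT header, cue numbers, and "00:00:01.000 --> 00:00:03.000" timestamp lines.
    .filter(line => !/^WEBVTT/.test(line) && !/^\d+$/.test(line.trim()) && !/-->/.test(line))
    .join(' ')
    .replace(/\s+/g, ' ')
    .trim();

  // Match the sheet columns: Transcript and Generated Date (YYYY-MM-DD).
  return {
    json: {
      ...item.json,
      Transcript: cleaned,
      'Generated Date': new Date().toISOString().slice(0, 10),
    },
  };
});
```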
by Robert Breen
This n8n workflow dynamically generates a realistic sample dataset based on a single topic you provide. It uses OpenAI (via LangChain) and n8n's built-in nodes to:
- Generate structured JSON data for 5 columns with 3–5 values each
- Flatten that data into a single text blob
- Infer meaningful column names via a second AI call
- Pivot, split, merge, and rename columns automatically
- Output a clean, labeled dataset ready for export or further processing

⚙️ Prerequisites
- OpenAI API Key: visit https://platform.openai.com/account/api-keys, create a new key, then in n8n go to Credentials → New → OpenAI API, paste the key, and name it "OpenAi account"
- LangChain nodes enabled in your n8n instance

🥇 Step 1: Set Up OpenAI Credential
- Go to OpenAI API Keys
- Create and copy your key
- In n8n: Credentials → New → OpenAI API → paste key as "OpenAi account"

🥈 Step 2: Manual Trigger
- Add a Manual Trigger to start the workflow

🥉 Step 3: Set Topic
- Add a Set node named "Set Topic to Search"
- Field: Topic = n8n use cases (or any topic you choose)

✨ Step 4: Generate Structured Data
- LangChain Agent node "Generate Random Data"
- Connect to OpenAI Chat Model1 and Tool: Inject Creativity1
- System prompt: instruct the AI to output 5 columns of realistic values in JSON

🔧 Step 5: Parse AI Output
- Structured Output Parser to validate the JSON

🔄 Step 6: Flatten Data
- Code node "Outpt all Data to One Field"
- Joins all values into a comma-separated string for column naming

🧠 Step 7: Generate Column Names
- LangChain Agent "Generate Column Names"
- Connect to OpenAI Chat Model2
- Prompt: infer 5 column names from the string

🔢 Step 8: Pivot Names Row
- Code node "Pivot Column Names" transforms the array into { column1: name1, … } (see the sketch after this section)

🪓 Step 9: Split Columns
- 5 SplitOut nodes to break each array back into rows per column

🔗 Step 10: Merge Rows
- Merge node "Merge Columns together" using combineByPosition

🏷️ Step 11: Rename Columns
- Set node "Rename Columns" assigns the AI-generated names to each column

🔗 Step 12: Final Output
- Merge "Append Column Names" combines data and header row

🏁 Done!
You now have a fully AI-driven, labeled dataset generated from a single topic, with no external services needed. Easily extend by adding a Google Sheets or HTTP node to export.

📬 Need Help or Want to Customize This?
📧 robert@ynteractive.com
🔗 LinkedIn
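For reference, the "Pivot Column Names" step from Step 8 boils down to a few lines of Code-node JavaScript. This is a hedged sketch; the actual shape of the agent's output (assumed here to be an array in `json.names`) may differ in the template.

```javascript
// Hedged sketch of the "Pivot Column Names" Code node.
// Assumes the column-name agent returned an array like ["Use Case", "Industry", ...] in item.json.names.
const names = $input.first().json.names ?? [];

// Turn ["Use Case", "Industry", ...] into { column1: "Use Case", column2: "Industry", ... }
const pivoted = Object.fromEntries(names.map((name, i) => [`column${i + 1}`, name]));

return [{ json: pivoted }];
```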
by Puspak
Workflow Overview
This workflow automatically fetches the latest "Ask HN: Who is hiring?" posts from Hacker News, extracts individual job listings, cleans the raw text, converts them into structured job listings using Google Gemini AI, and saves them into Airtable.

Components
It's a full end-to-end automation system combining:
- Algolia API for HN data
- Text cleaning
- Gemini AI (via LangChain) for parsing job descriptions
- Structured JSON extraction
- Airtable integration to store the final data

🎯 Use Cases
- Automatically build a job board from HN posts
- Track startup hiring trends
- Feed remote job alerts into a CRM or Slack
- Enrich a hiring intelligence database

🔧 Nodes & Services Used
- HTTP Request (Algolia + Firebase API)
- SplitOut, Set, Filter, Function, Limit
- Google Gemini (via LangChain integration)
- Output Parser Structured
- Airtable (API token required)

📌 Credentials Required
- Google Gemini (PaLM/Gemini API)
- Airtable Personal Access Token
- Algolia Application ID & API Key (via Header Auth)

📦 Tags
hacker-news, jobs, airtable, ai, gemini, automation, hn, langchain, workflow
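To make the data flow concrete, here is a hedged sketch of the kind of lookup the HTTP Request nodes perform against the public Algolia Hacker News API; the exact query parameters used in the template may differ.

```javascript
// Sketch of fetching the latest "Ask HN: Who is hiring?" thread and its job-listing comments.
// Query parameters are assumptions; the template's HTTP Request nodes may be configured differently.
async function fetchLatestWhoIsHiring() {
  // Find the most recent hiring story posted by the whoishiring account.
  const search = await fetch(
    'https://hn.algolia.com/api/v1/search_by_date?tags=story,author_whoishiring&query=who%20is%20hiring'
  ).then(r => r.json());

  const story = search.hits[0];

  // Fetch the full item to get the top-level comments, i.e. the individual job listings.
  const item = await fetch(`https://hn.algolia.com/api/v1/items/${story.objectID}`).then(r => r.json());

  // Raw HTML job posts, ready for the cleaning step and Gemini parsing.
  return item.children.map(comment => comment.text);
}
```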