by Javier Rieiro
## Overview
This workflow automates static security analysis for JavaScript, PHP, and Python codebases. It’s designed for bug bounty hunters and security researchers who need fast, structured, AI-assisted vulnerability detection across multiple sources.

## Features
🤖 **AI-Powered Analysis**: Specialized agents for each language:
- AI JavaScript Expert
- AI PHP Expert
- AI Python Expert

Each agent detects only exploitable vulnerabilities (AST + regex heuristics) and returns strict JSON:

```json
{
  "results": [
    {
      "url": "file or URL",
      "code": "lines + snippet",
      "severity": "medium|high|critical",
      "vuln": "vulnerability type"
    }
  ]
}
```

🧩 **Post-Processing**: Cleans, formats, and validates the JSON results, then generates clearly styled HTML tables for quick visualization.

## Output
- ✅ JSON vulnerability reports per file.
- 📊 HTML table summaries grouped by language and severity.

## Usage
1. Import the workflow into n8n.
2. Configure credentials: OpenAI API key, GitHub API key, Google Drive API key.
3. Run via the provided webhook form.
4. Select the analysis mode and input the target.
5. View structured vulnerability reports directly in n8n or Google Drive.

## Notes
- Performs static analysis only (no code execution).
- Detects exploitable findings only; ignores low-impact issues.
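The strict JSON contract above is what makes the post-processing step reliable. As a sketch of how the validation might work in an n8n Code node (the helper name and rejection policy are assumptions, not the template's actual implementation):

```javascript
// Hypothetical validation sketch for the agent's strict JSON output.
// Field names follow the schema shown above; everything else is illustrative.
const ALLOWED_SEVERITIES = new Set(["medium", "high", "critical"]);

function validateResults(payload) {
  // Reject anything that is not an object carrying a results array.
  if (!payload || !Array.isArray(payload.results)) return [];
  // Keep only entries whose fields match the documented schema.
  return payload.results.filter((r) =>
    typeof r.url === "string" &&
    typeof r.code === "string" &&
    typeof r.vuln === "string" &&
    ALLOWED_SEVERITIES.has(r.severity)
  );
}

const clean = validateResults({
  results: [
    { url: "app.js", code: "L12: eval(input)", severity: "critical", vuln: "code injection" },
    { url: "x.php", code: "echo $_GET['q'];", severity: "info", vuln: "debug" }, // dropped: bad severity
  ],
});
```

Entries that fail validation are dropped before the HTML table is built, so a malformed model response degrades gracefully instead of breaking the report.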
by Ertay Kaya
This n8n workflow automates the process of collecting, storing, and summarizing customer reviews from the Apple App Store for multiple apps. It fetches daily reviews, stores them in a Pinecone vector database, and generates a weekly summary using OpenAI, which is then posted to a Slack channel.

## Key Features
- Fetches daily customer reviews for a list of Apple App Store apps using the App Store Connect API.
- Stores reviews in Pinecone, with a separate namespace for each app and automatic weekly cleanup.
- Uses OpenAI to generate a summary of reviews, including positive/negative highlights and the average star rating.
- Posts the summary to a specified Slack channel every week.

## How to use
1. Set your Apple App Store app IDs and names in the provided Set nodes.
2. Configure your Apple API, Pinecone, OpenAI, and Slack credentials.
3. Adjust the schedule triggers as needed for daily fetching and weekly summarization.
4. Deploy the workflow to automate review monitoring and reporting for your apps.
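The per-app namespace design is what keeps the weekly cleanup cheap. A minimal sketch of the routing logic (the endpoint path is App Store Connect's customer-reviews resource; the namespace naming convention here is an assumption):

```javascript
// Sketch: route each app's reviews to its own Pinecone namespace.
// JWT authentication for App Store Connect is required but not shown.
function reviewsEndpoint(appId) {
  // App Store Connect customer-reviews resource for a given app.
  return `https://api.appstoreconnect.apple.com/v1/apps/${appId}/customerReviews?sort=-createdDate`;
}

function namespaceFor(app) {
  // One namespace per app: the weekly cleanup can then delete the whole
  // namespace instead of filtering out individual vectors.
  return `reviews-${app.name.toLowerCase()}`;
}
```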
by Amirul Hakimi
Supercharge your sales and marketing efforts with this powerful automation that transforms a list of LinkedIn profiles into a fully enriched, personalized outreach campaign.

This workflow is designed for sales teams, growth marketers, and business development professionals looking to scale their lead generation without sacrificing personalization. It seamlessly integrates LinkedIn scraping, email enrichment with Hunter.io, AI-powered message generation with OpenAI, and data organization in Google Sheets.

## How It Works
1. **Start with Leads**: The workflow begins with a list of target LinkedIn profile URLs.
2. **Scrape Profile Data**: It automatically scrapes each LinkedIn profile to extract key professional information such as name, title, company, and location. A built-in delay helps manage rate limits.
3. **Find Verified Emails**: Using the scraped company and name, the workflow queries ==Hunter.io to find a verified work email address== for the lead.
4. **AI-Powered Personalization**: If an email is found, the lead's data is sent to OpenAI (GPT-4), which generates a highly personalized, conversational outreach message based on their role, company, and your value proposition.
5. **Sync to CRM/Sheet**: Finally, all the enriched data — including the custom AI message — is neatly organized and saved as a new row in your designated Google Sheet.

Stop wasting hours on manual lead research and generic outreach. Implement this automated workflow to focus on building relationships and closing deals.
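The email lookup in step 3 maps onto Hunter's public v2 Email Finder API. A minimal sketch of how the request could be assembled (the helper function is illustrative; the endpoint and parameter names come from Hunter's documented API):

```javascript
// Sketch: build a Hunter.io Email Finder request from scraped lead data.
// domain, first_name, last_name, and api_key are Hunter's documented parameters.
function hunterEmailFinderUrl({ domain, firstName, lastName, apiKey }) {
  const params = new URLSearchParams({
    domain,
    first_name: firstName,
    last_name: lastName,
    api_key: apiKey,
  });
  return `https://api.hunter.io/v2/email-finder?${params}`;
}
```

On success, the response carries the found address and a confidence score; that result is what gates the rest of the workflow, since leads without a verified address skip the OpenAI personalization step.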
by Robin Bonduelle
## Template presentation
This template generates a sales follow-up presentation in Google Slides after a sales call recorded in Claap. The workflow is simplified to showcase the main use case. You can customize and enrich it by connecting your CRM, researching data online, or adding more files to the presentation. The presentation template used in this workflow is available here.

## Workflow configuration
1. Create a webhook in Claap by following this article.
2. Edit the labels that trigger the workflow and route to the relevant presentation.
3. Fill in your OpenAI credentials by creating an API key on the OpenAI Platform.
4. Edit the presentation personalization details (user set as editor, content, title).
5. Fill in your Slack credentials by following the steps in this video.
by Robert Breen
This n8n workflow pulls campaign data from Google Sheets, summarizes it using OpenAI, and sends a performance recap via Outlook email.

## ✅ Step 1: Connect Google Sheets
1. In n8n, go to **Credentials** → click **New Credential**.
2. Select **Google Sheets OAuth2 API**.
3. Log in with your Google account and authorize.
4. Use a spreadsheet with column names in the first row and data in rows 2–100.

Example format: 📄 Sample Marketing Sheet

## ✅ Step 2: Connect OpenAI
1. Go to OpenAI API Keys.
2. Make sure you have a payment method set under Billing.
3. In n8n, create a new OpenAI API credential.
4. Paste your API key and save.

## 📬 Need Help?
Feel free to contact me if you run into issues:
📧 robert@ynteractive.com
🔗 LinkedIn
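To picture the summarization step: the sheet rows (headers in row 1) arrive as objects keyed by column name, and a Code node can fold them into a single prompt for the OpenAI node. This is a sketch only; the column names below are assumptions, not the sample sheet's actual headers:

```javascript
// Sketch: build an OpenAI prompt from Google Sheets rows.
// Column names (Campaign, Clicks, Conversions) are assumed, not prescribed.
function buildRecapPrompt(rows) {
  const lines = rows.map(
    (r) => `${r.Campaign}: ${r.Clicks} clicks, ${r.Conversions} conversions`
  );
  return `Summarize this campaign performance in 3 short bullet points:\n${lines.join("\n")}`;
}
```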
by Richard Besier
## How This Works
This automation scrapes leads from Apollo using the Apify scraper, filters out leads that have no email or URL, scrapes each lead's website content, and writes personalised icebreakers and subject lines based on that content.

## Set Up (Step-by-Step)
1. Connect the API keys for the Apify scraper mentioned in the workflow sticky note.
2. Insert the Apollo URL and the number of leads you want to scrape.
3. Connect your Slack account (if needed).

## Reach Out To Me
Send me an email if you need further assistance: richard@advetica-systems.com
by Automate With Marc
**🛠 GPT-5 + Pinecone-Powered Slack Auto-Responder — Real-Time, Context-Aware Replies for IT & Engineering Teams**

## Description
Cut down on context-switching and keep your Slack threads moving with an AI agent that responds on your behalf, pulling real-time knowledge from a Pinecone vector database. Built for IT, DevOps, and engineering environments, this n8n workflow ensures every reply is accurate, context-aware, and instantly available — without you lifting a finger.

Check out step-by-step video builds of workflows like these here: https://www.youtube.com/@automatewithmarc

## How It Works
1. **Slack Listener**: Triggers when you’re mentioned or messaged in relevant channels.
2. **Pinecone RAG Retrieval**: Pulls the most relevant technical details from your indexed documents, architecture notes, or runbooks.
3. **GPT-5 Processing**: Formats the retrieved data into a clear, concise, and technically accurate reply.
4. **Thread-Aware Memory**: Maintains the conversation state to avoid repeating answers.
5. **Slack Send-as-User**: Posts the message under your identity for seamless integration into team workflows.

## Why IT Teams Will Love It
- 📚 **Always up-to-date** — if your Pinecone index is refreshed with system docs, runbooks, or KB articles, the bot will always deliver the latest info.
- 🏗 **Technical context retention** — perfect for answering ongoing infrastructure or incident threads.
- ⏱ **Reduced interruption time** — no more breaking focus to answer “quick questions.”
- 🔐 **Controlled outputs** — tune GPT-5 to deliver fact-based, low-fluff responses for critical environments.

## Common Use Cases
- **DevOps**: Automated responses to common CI/CD, deployment, or incident queries.
- **Support Engineering**: Pulling troubleshooting steps directly from KB entries.
- **Project Coordination**: Instant status updates pulled from sprint or release notes.

## Pro Tips for Deployment
- Keep your Pinecone vector DB updated with the latest architecture diagrams, release notes, and SOPs.
- Use embeddings tuned for technical documentation to improve retrieval accuracy.
- Add channel-specific prompts if different teams require different response styles (e.g., #devops vs. #product).
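To make the Pinecone retrieval step concrete: a query returns scored matches, and only chunks above a relevance threshold should reach GPT-5 as context. This is a sketch under assumed field names (`score`, `metadata.text`) and an assumed threshold, not the workflow's actual node configuration:

```javascript
// Sketch: assemble GPT-5 context from Pinecone query matches.
// The 0.75 threshold and the metadata.text field are assumptions to tune.
function buildContext(matches, minScore = 0.75) {
  return matches
    .filter((m) => m.score >= minScore)   // drop weakly related chunks
    .map((m) => m.metadata.text)          // keep only the stored text
    .join("\n---\n");                     // separate chunks for the prompt
}
```

A tighter threshold trades recall for precision, which is usually the right call in "controlled outputs" environments like incident channels.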
by Luís Philipe Trindade
What's up guys, I'm Luís 🙋🏻♂️

If you need to analyze dozens of Instagram profiles every week, this isn't just another automation. It’s your new secret weapon: a fully structured workflow for anyone who needs to analyze Instagram profiles at scale, with AI, and keep everything tracked and organized — no manual effort, no copy-paste, and total control over the process.

## What this workflow does
- Detects when a new or updated Instagram profile appears in Google Sheets
- Checks if the profile still needs analysis
- Creates an Airtop account, then starts a session and opens a new window in Airtop for Instagram scraping (Airtop offers 5,000 free credits/month!)
- Automatically accesses the Instagram profile and scrapes public data
- Cleans and refines the data with OpenAI for maximum accuracy
- Performs a second layer of AI analysis for deep, actionable insights
- Updates all results and insights directly back into Google Sheets

## 🧩 How the flow is structured
This workflow is strategically divided into two independent parts to ensure clarity, organization, and easy scalability:

**Part 1 – Profile Capture and Data Extraction**
- Triggered by a new/updated row in Google Sheets — grab the Google Sheets template here: [[TEMPLATE] - Instagram Profiles](https://docs.google.com/spreadsheets/d/1rXvJuMg1LHsF5dHZobFmfZ3wk60jjp-bC-WkmIH8Jqc/edit?usp=sharing)
- Checks if the profile needs analysis
- Starts a session and a new window on Airtop
- Scrapes the public Instagram data

**Part 2 – AI Analysis and Results Delivery**
- Cleans and structures Airtop’s output for AI analysis
- Uses OpenAI to process and refine the data
- Applies a second AI prompt for actionable insights
- Updates Google Sheets with all processed results

This structure makes the flow easier to manage, customize, and scale — plug in other tools without breaking the logic.
## Tools used
- ✅ Airtop (Instagram scraping & session management)
- ✅ Google Sheets (database & dashboard)
- ✅ OpenAI (data parsing, refinement & analysis)

## How to set it up
1. Connect your Google Sheet to the workflow
2. Register on Airtop, get your free credits, and set up your session: https://portal.airtop.ai/
3. Add your Airtop credentials in n8n
4. Add your OpenAI API key
5. (Optional) Customize the AI prompts and sheet logic

## Why this workflow stands out
- 📊 Fully automated: analyze dozens or hundreds of profiles without manual work
- 🤖 Double-layer AI analysis for maximum insights
- 🚀 Built to scale — ideal for agencies, marketing teams, communities, and creators
- 📝 Everything tracked and accessible in Google Sheets
- 🔗 Airtop integration: scrape Instagram with zero headache
- ✅ Works on both n8n Cloud and self-hosted
- 🔐 100% secure. No hacks. No shortcuts.

Want to adapt this flow for your business, team, or community?
📩 Custom requests: WhatsApp me at +5534992569346
by Candra Reza
Supercharge your sales and marketing with this AI-powered workflow! 🚀

Stop wasting hours on manual company research. This template deploys an autonomous AI agent that takes a list of company names from a Google Sheet, scours the web to find critical information, and automatically updates your sheet with the enriched data.

## What it does
- Reads a list of companies to research from a Google Sheet.
- Uses an AI agent equipped with Google Search and web scraping tools.
- Extracts key data points like LinkedIn URLs, pricing details, integrations, market focus (B2B/B2C), and more.
- Structures the output into a clean JSON object.
- Updates the original Google Sheet with the new information.

## Key Features & Customization
This workflow is built to be easily customized. You can modify the AI's prompt in the "AI company researcher" node and adjust the "Structured Output Parser" to gather any public data point you need, such as recent news, key executives, or their technology stack.

## Required Credentials
- OpenAI
- Google Sheets
- SerpApi or ScrapingBee (for search capabilities)
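For orientation, the "clean JSON object" produced by the Structured Output Parser might look like the following. Every field name and value here is illustrative, and you can reshape it to whatever data points you configure the agent to collect:

```json
{
  "company": "Acme Corp",
  "linkedin_url": "https://www.linkedin.com/company/acme-corp",
  "pricing": "Freemium; paid plans from $29/month",
  "integrations": ["Slack", "Salesforce", "Zapier"],
  "market_focus": "B2B",
  "notes": "Recently launched an API product"
}
```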
by Hemanth Arety
## Who is this for
This workflow is designed for content creators, digital marketers, bloggers, and businesses who need to produce high-quality, SEO-optimized blog posts and newsletters at scale. Perfect for marketing teams, content agencies, solopreneurs, and anyone looking to automate their content creation process while maintaining professional quality.

## What it does
This multi-agent AI system generates complete, research-backed blog posts and newsletters automatically. Four specialized AI agents work in sequence: the Research Agent gathers facts and sources, the Outline Agent structures the content, the Writer Agent creates engaging Medium-style prose, and the Editor Agent polishes and optimizes for search engines. The workflow automatically routes content to either blog format (with DALL-E generated featured images) or newsletter format based on your input, then saves everything to Airtable or Google Sheets for easy management.

## Requirements
- OpenRouter API key (for the Grok AI model)
- Google Gemini API key
- OpenAI API key (for DALL-E image generation)
- Airtable account
- Google Sheets account (for newsletters)
- Telegram bot token (optional, for notifications)

## How to set up
1. Import the workflow into your n8n instance
2. Add your API credentials to each language model node (OpenRouter, Google Gemini, OpenAI)
3. Configure your Airtable base ID and table ID in the "Save Blog to Airtable" node
4. Set your Google Sheets document ID in the "Save Newsletter to Google Sheets" node
5. (Optional) Add your Telegram bot token and chat ID for notifications
6. Test the workflow by submitting the form with a sample topic

## How to customize the workflow
You can easily adapt this workflow to your specific needs. Replace the AI models with your preferred providers (Claude, GPT-4, Llama, etc.) by swapping the language model nodes. Modify the agent prompts to change writing style, tone, or target audience. Add additional agents for fact-checking, plagiarism detection, or brand voice alignment. Connect the output directly to publishing platforms like WordPress, Medium, or Webflow, or to email marketing services like Mailchimp. Adjust the image generation parameters to match your brand aesthetic, or skip image generation entirely if not needed.
by Nishant
## Overview
Confused about which credit card to actually get or swipe? With 100+ cards on the market, hidden caps, and milestone rules, most people end up leaving rewards, perks, and cashback on the table. This workflow uses n8n + GPT + Google Sheets + Telegram to recommend the best credit card for each user’s lifestyle in under 3 seconds, while keeping the logic transparent with a ₹-value breakdown.

## What does this workflow do?
1. **Captures user inputs** – Users answer a 7-question lifestyle quiz via Telegram.
2. **Stores responses** – Google Sheets logs all answers for resumption & deduplication.
3. **Scores answers** – n8n Function nodes map single- and multi-select inputs into scores.
4. **Generates recommendations** – GPT analyses the profile against a 30+ card dataset.
5. **Breaks down value** – Outputs a transparent table of rewards, milestones, and lounge value.
6. **Delivers results** – The top 3 card picks are returned instantly on Telegram.

## Why is this useful?
Most card comparison tools only list features — they don’t personalise or calculate actual value. This workflow builds a decision engine:
- 🔍 **Personalised** → matches lifestyle to best-fit cards
- 💸 **Transparent** → shows value in real currency (rewards, milestones, lounges)
- ⏱ **Fast** → answers in under 3 seconds
- 🗂 **Organised** → Google Sheets keeps an audit trail of every user + dedupe

## Tools used
- **n8n**: Orchestration + logic branching
- **Telegram**: User-facing quiz bot
- **Google Sheets**: Database of credit cards + logs of user answers
- **OpenAI (GPT)**: Analyses the user profile & generates recommendations

## Who is this for?
- 🧑💻 Fintech product builders → see how AI can power recommendation engines
- 💳 Cardholders → understand which card fits their lifestyle best
- ⚙️ n8n makers → learn how to combine Sheets + GPT + a chat interface in one workflow

## 🌍 How to adapt it for your country/location
This workflow uses a credit card dataset stored in Google Sheets. To make it work for your country:
1. **Build your dataset** → scrape or collect card details from banks, comparison sites, or official portals. Fields to include: fees, reward rate, lounge access, forex markup, reward caps, milestones, eligibility. You can use web crawlers (e.g., Apify, PhantomBuster) to automate data collection.
2. **Update the Google Sheet** → replace the India dataset with your country’s cards.
3. **Adjust the scoring logic** → modify the Function nodes if your cards use different reward structures (e.g., cashback %, miles, points value).
4. **Run the workflow** → GPT will analyse the new dataset and generate recommendations specific to your country.

This makes the workflow flexible for any geography.

## Workflow Highlights
- ✅ End-to-end credit card recommendation pipeline (quiz → scoring → GPT → result)
- ✅ Handles single + multi-select inputs fairly with % match scoring
- ✅ Transparent value breakdown in local currency (rewards, milestones, lounge access)
- ✅ Google Sheets for persistence, dedupe & an audit trail
- ✅ Delivers top 3 cards in <3 seconds on Telegram
- ✅ Fully customisable for any country by swapping the dataset
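The "% match scoring" for multi-select answers can be sketched as a tiny Function-node helper. The field names are illustrative, and the template's real nodes may weight categories differently:

```javascript
// Sketch: fair % match scoring for multi-select quiz answers.
// A card scores by the share of the user's selections it actually rewards.
function matchScore(userSelections, cardCategories) {
  if (userSelections.length === 0) return 0;
  const hits = userSelections.filter((s) => cardCategories.includes(s)).length;
  return Math.round((hits / userSelections.length) * 100);
}
```

For example, a user who selected travel and dining gets a 50% match with a card that rewards dining and fuel.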
by Sergey Filippov
## Who's it for
Developers building AI-powered workflows who want to ensure their agents work reliably. If you need to validate AI outputs, test agent behavior systematically, or build maintainable automation, this template shows you how.

## What it does
This subworkflow extracts structured meeting details (title, date, time, location, links, attendees) from natural language messages using an AI agent. It demonstrates production-ready patterns:
- **Structured output validation**: JSON schema enforcement prevents malformed responses
- **Error handling**: Graceful failures with full execution traceability
- **Automated evaluation**: Test agent accuracy against expected outputs using Google Sheets
- **Dual execution modes**: Normal extraction + evaluation/testing mode

The AI resolves relative time ("tomorrow", "next Friday") using timezone context and handles incomplete data gracefully.

## How to set it up
1. Connect an OpenAI API credential to the AI agent node
2. Copy the test data sheet: https://docs.google.com/spreadsheets/d/1U89nPsasM2WNv1D7gEYINhDwylyxYw7BOd_i8ipFC0M/edit?usp=sharing
3. Update the Google Sheet IDs in the load_eval_data and record_eval_output nodes
4. Test normal mode: execute the workflow "from trigger"
5. Test evaluation mode: execute the workflow "from load_eval_data"

## Requirements
- OpenAI API key
- Google Sheets OAuth credential

## Why subworkflow architecture?
- **Reusability**: Wrap AI agents in subworkflows to call them from multiple parent workflows. Extract meetings from Slack, email, or webhooks — same agent, consistent results.
- **Testability**: This pattern enables isolated testing of each AI component. Set up evaluation datasets, run automated tests, and validate accuracy before deploying to production. You can't do this easily with inline agents.
- **Maintainability**: Update the agent logic once, and all parent workflows benefit. Error handling and validation are built in, so failures are traceable with execution IDs.

This framework includes:
- A dual-trigger pattern (normal + evaluation modes)
- Output validation that catches silent AI failures
- Error bubbling with execution metadata for debugging
- An evaluation framework with semantic/exact matching
- Proper routing that returns output to parent workflows

## Following this pattern for other agents
To adapt this for any AI task (contact extraction, invoice processing, sentiment analysis, etc.):
1. Replace extract_meeting_details with your AI agent (adding tools, memory, etc. as needed)
2. Update the Structured Output Parser schema to match your data structure
3. Modify the evaluate_match prompt for your validation criteria
4. Create test cases in Google Sheets with your inputs/expected outputs
5. Adjust the normalize_eval_data timezone/reference time if needed

The validation, error handling, and evaluation infrastructure stays the same regardless of what your agent does.
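As a sketch of the schema-update step, a Structured Output Parser definition for the meeting fields listed earlier might look like this. The exact schema the template ships with may differ; treat this as an assumed starting point:

```json
{
  "type": "object",
  "properties": {
    "title": { "type": "string" },
    "date": { "type": "string", "description": "ISO 8601 date; relative terms like 'tomorrow' already resolved" },
    "time": { "type": ["string", "null"] },
    "location": { "type": ["string", "null"] },
    "links": { "type": "array", "items": { "type": "string" } },
    "attendees": { "type": "array", "items": { "type": "string" } }
  },
  "required": ["title", "date"]
}
```

Enforcing required fields at the parser level is what turns a silently wrong model response into a catchable validation failure.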