by Luís Philipe Trindade
What's up, guys! I'm Luís 🙋🏻‍♂️

If you need to analyze dozens of Instagram profiles every week, this isn't just another automation: it's your new secret weapon. It's a fully structured workflow for anyone who needs to analyze Instagram profiles at scale, with AI, and keep everything tracked and organized. No manual effort, no copy-paste, and total control over the process.

What this workflow does:
- Detects when a new or updated Instagram profile appears in Google Sheets
- Checks if the profile still needs analysis (a minimal sketch of this check follows this listing)
- Creates an Airtop account, then starts a session and opens a new window in Airtop for Instagram scraping (Airtop offers 5,000 free credits/month!)
- Automatically accesses the Instagram profile and scrapes public data
- Cleans and refines the data with OpenAI for maximum accuracy
- Performs a second layer of AI analysis for deep, actionable insights
- Updates all results and insights directly back into Google Sheets

🧩 How the flow is structured
This workflow is strategically divided into two independent parts to ensure clarity, organization, and easy scalability.

Part 1 – Profile Capture and Data Extraction
- Triggered by a new/updated row in Google Sheets. Grab the Google Sheets template here: [[TEMPLATE] - Instagram Profiles](https://docs.google.com/spreadsheets/d/1rXvJuMg1LHsF5dHZobFmfZ3wk60jjp-bC-WkmIH8Jqc/edit?usp=sharing)
- Checks if the profile needs analysis
- Starts a session and a new window on Airtop
- Scrapes the public Instagram data

Part 2 – AI Analysis and Results Delivery
- Cleans and structures Airtop's output for AI analysis
- Uses OpenAI to process and refine the data
- Applies a second AI prompt for actionable insights
- Updates Google Sheets with all processed results

This structure makes the flow easier to manage, customize, and scale: you can plug in other tools without breaking the logic.

Tools used:
✅ Airtop (Instagram scraping & session management)
✅ Google Sheets (database & dashboard)
✅ OpenAI (data parsing, refinement & analysis)

How to set it up:
1. Connect your Google Sheet to the workflow
2. Register on Airtop, get your free credits, and set up your session: https://portal.airtop.ai/
3. Add your Airtop credentials in n8n
4. Add your OpenAI API key
5. (Optional) Customize the AI prompts and sheet logic

Why this workflow stands out:
📊 Fully automated: analyze dozens or hundreds of profiles without manual work
🤖 Double-layer AI analysis for maximum insights
🚀 Built to scale: ideal for agencies, marketing teams, communities, and creators
📝 Everything tracked and accessible in Google Sheets
🔗 Airtop integration: scrape Instagram with zero headache
✅ Works on both n8n Cloud and Self-hosted
🔐 100% secure. No hacks. No shortcuts.

Want to adapt this flow for your business, team, or community?
📩 Custom requests: WhatsApp me at +5534992569346
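The "still needs analysis" check mentioned above can be reduced to a small filter over the sheet rows. This is a minimal sketch, assuming hypothetical column names (`instagram_url`, `analysis_status`); match them to the columns in your copy of the Google Sheets template.

```typescript
// Sketch only: decide whether a sheet row still needs to be sent to Airtop.
// The column names below are assumptions, not the template's guaranteed headers.
interface ProfileRow {
  instagram_url?: string;
  analysis_status?: string;
}

function needsAnalysis(row: ProfileRow): boolean {
  const hasUrl = !!row.instagram_url && row.instagram_url.trim() !== '';
  const alreadyDone = !!row.analysis_status && row.analysis_status.trim() !== '';
  return hasUrl && !alreadyDone; // only un-analyzed profiles continue through the flow
}

// Example: only the first row would be passed on for scraping.
const rows: ProfileRow[] = [
  { instagram_url: 'https://www.instagram.com/n8n.io/' },
  { instagram_url: 'https://www.instagram.com/openai/', analysis_status: 'done' },
];
console.log(rows.filter(needsAnalysis));
```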
by Automate With Marc
🛠 GPT-5 + Pinecone-Powered Slack Auto-Responder — Real-Time, Context-Aware Replies for IT & Engineering Teams

Description
Cut down on context-switching and keep your Slack threads moving with an AI agent that responds on your behalf, pulling real-time knowledge from a Pinecone vector database. Built for IT, DevOps, and engineering environments, this n8n workflow ensures every reply is accurate, context-aware, and instantly available—without you lifting a finger.

Check out step-by-step video build of workflows like these here: https://www.youtube.com/@automatewithmarc

How It Works
- Slack Listener: Triggers when you're mentioned or messaged in relevant channels.
- Pinecone RAG Retrieval: Pulls the most relevant technical details from your indexed documents, architecture notes, or runbooks (a retrieval sketch follows this listing).
- GPT-5 Processing: Formats the retrieved data into a clear, concise, and technically accurate reply.
- Thread-Aware Memory: Maintains the conversation state to avoid repeating answers.
- Slack Send-as-User: Posts the message under your identity for seamless integration into team workflows.

Why IT Teams Will Love It
📚 Always up-to-date — If your Pinecone index is refreshed with system docs, runbooks, or KB articles, the bot will always deliver the latest info.
🏗 Technical context retention — Perfect for answering ongoing infrastructure or incident threads.
⏱ Reduced interruption time — No more breaking focus to answer "quick questions."
🔐 Controlled outputs — Tune GPT-5 to deliver fact-based, low-fluff responses for critical environments.

Common Use Cases
- DevOps: Automated responses to common CI/CD, deployment, or incident queries.
- Support Engineering: Pulling troubleshooting steps directly from KB entries.
- Project Coordination: Instant status updates pulled from sprint or release notes.

Pro Tips for Deployment
- Keep your Pinecone vector DB updated with the latest architecture diagrams, release notes, and SOPs.
- Use embeddings tuned for technical documentation to improve retrieval accuracy.
- Add channel-specific prompts if different teams require different response styles (e.g., #devops vs #product).
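A minimal sketch of the retrieval step referenced above: embed the incoming Slack message with OpenAI, then ask a Pinecone index for the closest chunks. The index host environment variable, the `docs` namespace, and the `text` metadata field are assumptions; the request payloads follow the public OpenAI and Pinecone REST APIs, but verify them against your own account before relying on this.

```typescript
// Sketch only: embed a Slack question and fetch the top-k matching chunks from Pinecone.
const OPENAI_KEY = process.env.OPENAI_API_KEY!;
const PINECONE_KEY = process.env.PINECONE_API_KEY!;
const PINECONE_INDEX_HOST = process.env.PINECONE_INDEX_HOST!; // e.g. my-index-abc123.svc.us-east-1.pinecone.io (assumption)

async function retrieveContext(question: string): Promise<string[]> {
  // 1) Turn the question into an embedding vector.
  const embRes = await fetch('https://api.openai.com/v1/embeddings', {
    method: 'POST',
    headers: { Authorization: `Bearer ${OPENAI_KEY}`, 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'text-embedding-3-small', input: question }),
  });
  const embedding: number[] = (await embRes.json()).data[0].embedding;

  // 2) Query Pinecone for the nearest stored chunks.
  const queryRes = await fetch(`https://${PINECONE_INDEX_HOST}/query`, {
    method: 'POST',
    headers: { 'Api-Key': PINECONE_KEY, 'Content-Type': 'application/json' },
    body: JSON.stringify({ vector: embedding, topK: 5, includeMetadata: true, namespace: 'docs' }),
  });
  const matches = (await queryRes.json()).matches ?? [];

  // 3) Return the raw text chunks to feed into the GPT-5 prompt.
  return matches.map((m: any) => m.metadata?.text ?? '');
}

retrieveContext('Why is the staging deploy failing?').then(console.log);
```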
by Milan Vasarhelyi - SmoothWork
Video Introduction
Want to automate your inbox or need a custom workflow? 📞 Book a Call | 💬 DM me on Linkedin

What This Workflow Does
This workflow creates an AI-powered chatbot that can answer natural language questions about your QuickBooks Online data. Using OpenAI's GPT models and the Model Context Protocol (MCP), the agent can retrieve customer information, analyze balances, and provide insights through a conversational interface. Users can simply ask questions like "How many customers do we have?" or "What's our total customer balance?" and get instant answers from live QuickBooks data.

Key Features
- **Natural language queries**: Ask questions about your QuickBooks data in plain English
- **MCP architecture**: Uses Model Context Protocol to manage tools efficiently, making it easy to expand with additional QuickBooks operations (see the query sketch after this listing)
- **Public chat interface**: Share the chatbot URL with team members who need QuickBooks insights without direct access
- **Real-time data**: Queries live QuickBooks data for up-to-date answers

Common Use Cases
- Customer service teams checking account balances without logging into QuickBooks
- Sales teams quickly looking up customer information
- Finance teams getting quick answers about customer data
- Managers monitoring key metrics through conversational queries

Setup Requirements
1. QuickBooks Developer Account: Register at developer.intuit.com and create an app with Accounting scope permissions. You'll receive a Client ID and Client Secret.
2. Configure OAuth: In your Intuit Developer dashboard, add the redirect URL provided by n8n when creating QuickBooks credentials. Set the environment to Sandbox for testing, or complete Intuit's app approval process for Production use.
3. OpenAI API: Add your OpenAI API credentials to power the chat model. The workflow uses GPT-4.1-mini by default, but you can select other models based on your performance and cost requirements.
4. Chat Access: The chat trigger is set to public by default. Configure access settings based on your security requirements before sharing the chat URL.
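For context on what an agent tool behind this chatbot might do, here is a hedged sketch using QuickBooks Online's "query" endpoint and its SQL-like syntax. The sandbox base URL, environment variable names, and the exact response shape are assumptions to double-check against Intuit's API docs; this is not the workflow's own implementation, just an illustration of the kind of call it wraps.

```typescript
// Sketch only: answer "What's our total customer balance?" via the QBO query endpoint.
const REALM_ID = process.env.QBO_REALM_ID!;       // company ID obtained during the OAuth flow
const ACCESS_TOKEN = process.env.QBO_ACCESS_TOKEN!;

async function totalCustomerBalance(): Promise<number> {
  const query = encodeURIComponent('select DisplayName, Balance from Customer');
  const url = `https://sandbox-quickbooks.api.intuit.com/v3/company/${REALM_ID}/query?query=${query}`;

  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${ACCESS_TOKEN}`, Accept: 'application/json' },
  });
  const data = await res.json();

  // Sum balances so the agent can answer the natural-language question directly.
  const customers: Array<{ Balance?: number }> = data?.QueryResponse?.Customer ?? [];
  return customers.reduce((sum, c) => sum + (c.Balance ?? 0), 0);
}

totalCustomerBalance().then((total) => console.log(`Total customer balance: ${total}`));
```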
by Hemanth Arety
Who is this for
This workflow is designed for content creators, digital marketers, bloggers, and businesses who need to produce high-quality, SEO-optimized blog posts and newsletters at scale. Perfect for marketing teams, content agencies, solopreneurs, and anyone looking to automate their content creation process while maintaining professional quality.

What it does
This multi-agent AI system generates complete, research-backed blog posts and newsletters automatically. Four specialized AI agents work in sequence: the Research Agent gathers facts and sources, the Outline Agent structures the content, the Writer Agent creates engaging Medium-style prose, and the Editor Agent polishes and optimizes for search engines. The workflow automatically routes content to either blog format (with DALL-E generated featured images) or newsletter format based on your input, then saves everything to Airtable or Google Sheets for easy management. A sketch of this agent chain follows this listing.

Requirements
- OpenRouter API key (for Grok AI model)
- Google Gemini API key
- OpenAI API key (for DALL-E image generation)
- Airtable account
- Google Sheets account (for newsletters)
- Telegram bot token (optional, for notifications)

How to set up
1. Import the workflow into your n8n instance
2. Add your API credentials to each language model node (OpenRouter, Google Gemini, OpenAI)
3. Configure your Airtable base ID and table ID in the "Save Blog to Airtable" node
4. Set your Google Sheets document ID in the "Save Newsletter to Google Sheets" node
5. (Optional) Add your Telegram bot token and chat ID for notifications
6. Test the workflow by submitting the form with a sample topic

How to customize the workflow
You can easily adapt this workflow to your specific needs. Replace the AI models with your preferred providers (Claude, GPT-4, Llama, etc.) by swapping the language model nodes. Modify the agent prompts to change writing style, tone, or target audience. Add additional agents for fact-checking, plagiarism detection, or brand voice alignment. Connect the output directly to publishing platforms like WordPress, Medium, Webflow, or email marketing services like Mailchimp. Adjust the image generation parameters to match your brand aesthetic, or skip image generation entirely if not needed.
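The four-agent sequence described above boils down to chaining prompt calls, where each agent receives the previous agent's output. A minimal sketch follows; `callModel` is a hypothetical stand-in for whichever provider each n8n agent node actually uses (OpenRouter, Gemini, OpenAI), and the system prompts are illustrative, not the template's real prompts.

```typescript
// Sketch only: Research -> Outline -> Writer -> Editor, as a simple chain.
type Agent = { name: string; systemPrompt: string };

async function callModel(systemPrompt: string, input: string): Promise<string> {
  // Placeholder: swap in a real chat-completions call for your provider.
  return `[${systemPrompt.slice(0, 30)}...] draft based on: ${input.slice(0, 60)}`;
}

async function runPipeline(topic: string, format: 'blog' | 'newsletter'): Promise<string> {
  const agents: Agent[] = [
    { name: 'Research', systemPrompt: 'Gather key facts, stats, and sources for the topic.' },
    { name: 'Outline', systemPrompt: 'Turn the research into a structured outline.' },
    { name: 'Writer', systemPrompt: `Write an engaging ${format} draft in Medium-style prose.` },
    { name: 'Editor', systemPrompt: 'Polish the draft and optimize it for SEO.' },
  ];

  // Each agent receives the previous agent's output as its input.
  let artifact = topic;
  for (const agent of agents) {
    artifact = await callModel(agent.systemPrompt, artifact);
  }
  return artifact; // final, edited piece ready for Airtable or Google Sheets
}

runPipeline('How small teams can automate content production', 'blog').then(console.log);
```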
by DuyTran
Description

📌 Overview
This workflow creates a chat-based Retrieval-Augmented Generation (RAG) agent that lets you upload documents to Google Drive and then query them directly through Telegram. It uses embeddings, vector storage, and an AI agent to retrieve, analyze, and answer user questions with context-aware responses.

🧩 Key Features
- 📂 Google Drive Integration: Watches a folder for new file uploads, then downloads and loads documents automatically into the system.
- 🔎 Vector Embeddings & Storage: Uses OpenAI embeddings to transform documents into vectors and stores them in an in-memory vector store for retrieval (see the similarity-search sketch after this listing).
- 🤖 AI Agent with Memory: Built on a LangChain Agent + GPT-4.1-mini. Performs similarity search in the vector store, provides contextual answers with citations from the uploaded documents, and maintains short-term conversation memory for better continuity.
- 💬 Telegram Bot Integration: Users can send questions directly to the bot; the AI agent retrieves relevant information and replies with clear answers.

⚙️ How It Works
1. Trigger: Upload a file into the Google Drive folder.
2. Processing: The workflow downloads the file → loads → embeds → stores it in vector memory.
3. Query: The user sends a question via Telegram.
4. Retrieval & Response: The AI agent searches the stored documents → analyzes the results → returns a summarized answer in Telegram.

🔐 Requirements
- Google Drive OAuth credentials
- OpenAI API key (for embeddings + LLM)
- Telegram Bot API token

📥 Use Cases
- 📑 Knowledge base assistant – Upload internal docs and query them in chat.
- 🏫 Learning support – Students upload study materials and ask questions.
- 📊 Business intelligence – Teams upload reports and get instant summaries.
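To make "embed, store in memory, similarity search" concrete, here is a minimal sketch using plain cosine similarity. The `embed()` stub is a placeholder for the OpenAI embeddings call the workflow actually uses; everything else is generic illustration rather than the template's internal code.

```typescript
// Sketch only: an in-memory vector store with cosine-similarity search.
type StoredChunk = { text: string; vector: number[] };
const store: StoredChunk[] = [];

async function embed(text: string): Promise<number[]> {
  // Placeholder: call the OpenAI embeddings API here and return the real vector.
  return Array.from({ length: 8 }, (_, i) => (text.charCodeAt(i % text.length) % 13) / 13);
}

function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

async function addDocument(text: string): Promise<void> {
  store.push({ text, vector: await embed(text) });
}

async function search(question: string, k = 3): Promise<string[]> {
  const q = await embed(question);
  return store
    .map((c) => ({ text: c.text, score: cosine(q, c.vector) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map((c) => c.text); // top-k chunks handed to the agent as context
}

addDocument('Refund policy: refunds are issued within 14 days of purchase.')
  .then(() => search('How long do refunds take?'))
  .then(console.log);
```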
by Tihomir Mateev
Chat with Your GitHub Issues Using AI 🤖

Ever wanted to just ask your repository what's going on instead of scrolling through endless issue lists? This workflow lets you do exactly that.

What Does It Do?
Turn any GitHub repo into a conversational knowledge base. Ask questions in plain English, get smart answers powered by AI and vector search.
- **"Show me recent authentication bugs"** → AI finds and explains them
- **"What issues are blocking the release?"** → Instant context-aware answers
- **"Are there any similar problems to #247?"** → Semantic search finds connections you'd miss

The Magic ✨
1. Slurp up issues from your GitHub repo, with all the metadata goodness (see the ingestion sketch after this listing)
2. Vectorize everything using OpenAI embeddings and store in Redis
3. Chat naturally with an AI agent that searches your issue database
4. Get smart answers with full conversation memory

Quick Start
You'll need:
- OpenAI API key (for the AI brain)
- Redis 8.x (for vector search magic)
- GitHub repo URL (optional: API token for speed)

Get it running:
1. Drop in your credentials
2. Point it at your repo (edit the owner and repository params)
3. Run the ingestion flow once to populate the database
4. Start chatting!

Tinker Away 🔧
This is your playground. Here are some ideas:
- **Swap the data source**: Jira tickets? Linear issues? Notion docs? Go wild.
- **Change the AI model**: Try different GPT models or even local LLMs
- **Add custom filters**: Filter by labels, assignees, or whatever matters to you
- **Tune the search**: Adjust how many results come back, tweak relevance scores
- **Make it public**: Share the chat interface with your team or users
- **Auto-update**: Hook it up to webhooks for real-time issue indexing

Built with n8n, Redis, and OpenAI. No vendor lock-in, fully hackable, 100% yours to customize.
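A minimal sketch of the ingestion side: pull issues from the GitHub REST API and flatten each one into a text chunk ready for embedding. The `owner`/`repo` placeholders mirror the "owner and repository params" mentioned above; the embedding call and the Redis write are deliberately left out, so treat this as an illustration of the data shape rather than the workflow's own code.

```typescript
// Sketch only: fetch issues and prepare one embeddable text blob per issue.
const OWNER = 'your-org';                 // assumption: replace with the repo owner
const REPO = 'your-repo';                 // assumption: replace with the repo name
const TOKEN = process.env.GITHUB_TOKEN;   // optional, raises the rate limit

async function fetchIssueChunks(): Promise<{ id: number; text: string }[]> {
  const headers: Record<string, string> = { Accept: 'application/vnd.github+json' };
  if (TOKEN) headers.Authorization = `Bearer ${TOKEN}`;

  const res = await fetch(
    `https://api.github.com/repos/${OWNER}/${REPO}/issues?state=all&per_page=100`,
    { headers },
  );
  const items: any[] = await res.json();

  return items
    .filter((i) => !i.pull_request) // the issues endpoint also returns PRs; skip them
    .map((i) => ({
      id: i.number,
      // One flat text blob per issue keeps the metadata searchable alongside the body.
      text: `#${i.number} [${i.state}] ${i.title}\nLabels: ${(i.labels ?? []).map((l: any) => l.name).join(', ')}\n\n${i.body ?? ''}`,
    }));
}

fetchIssueChunks().then((chunks) => console.log(`Prepared ${chunks.length} chunks for embedding`));
```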
by Kamran habib
## N8N Workflow | AI-Powered Twitter Automation with Content Generation and Engagement 🚀

This n8n template automates Twitter (X) activity — from generating tweet content with AI to engaging with posts and even sending DMs — all powered by Google Gemini or OpenRouter AI. It's designed for creators, marketers, brands, and agencies who want to automate social media presence with authentic, on-brand AI content and engagement.

How It Works
The workflow begins with a form trigger, where users input their topic, tone, and action type (Tweet, Engage, or DM). Those inputs are passed into Workflow Configuration, which sets key parameters like max tweet length and model URLs. Depending on your chosen action:
- Post Tweet: AI generates a tweet under 280 characters and can attach an image.
- Engage with Posts: AI can like, retweet, or reply to niche-relevant content.
- Send Direct Message: AI drafts a personalized DM for outreach or networking.

If your workflow includes visuals, the AI Agent - Create Image From Prompt node builds a detailed image prompt (based on your topic and instructions) and sends it to Google Gemini or other image APIs. The HTTP Request - Create Image node generates a custom image via an external model (default: Pollinations.ai); a sketch of that request follows this listing. Finally, all tweet text and image data merge together in Merge Tweet Text and Image, before being posted directly via the Create Tweet node.

How To Use
1. Download and import the JSON workflow into your n8n interface.
2. Set up the following credentials:
   - OpenRouter API for text generation.
   - Google Gemini (PaLM) for chat and image prompt creation.
   - Twitter OAuth2 API for posting and engagement actions.
3. Configure your form input fields (Topic, Tone, Action, Instructions).
4. Enable or disable the nodes you want:
   - Create Tweet → To post automatically.
   - Twitter Engagement Tool → For likes/retweets/replies.
   - Twitter DM Tool → For automated DMs.
5. Trigger the Twitter Content Form via n8n's web interface.
6. Enter your content preferences and submit. The workflow generates your tweet text, optionally creates a matching image, and posts or saves it automatically.

Requirements
- A Twitter Developer Account (with OAuth2 credentials).
- A Google Gemini or OpenRouter account with text and image model access.
- (Optional) Connection to Pollinations or another AI image generation API.

How To Customize
- Update the "Fields – Set Values" node to change:
  - Default image size (1080 × 1920 px).
  - Model name (e.g., "flux", "turbo", "kontext").
- Modify Workflow Configuration to tweak AI parameters like:
  - imageGenerationChance (default: 0.3).
  - maxTweetLength (default: 280).
- Replace the Google Gemini Chat Model with any supported model such as OpenAI GPT-4 or Mistral.
- Adjust the AI Agent - Create Image From Prompt system message for your preferred image style or guidelines.
- Toggle which Twitter actions are active — Post, Engage, or DM — to tailor automation to your goals.
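For reference, here is a hedged sketch of the kind of HTTP request the "HTTP Request - Create Image" node makes against Pollinations.ai. The width/height/model values mirror the defaults listed above, but the exact query parameters Pollinations accepts are an assumption to verify against its current documentation; in n8n the downloaded image would land on the item as binary data rather than a local file.

```typescript
// Sketch only: request a generated image from Pollinations.ai and save it.
import { writeFile } from 'node:fs/promises';

async function createImage(prompt: string): Promise<void> {
  // width/height/model parameters are assumed; check Pollinations' docs for the real ones.
  const params = new URLSearchParams({ width: '1080', height: '1920', model: 'flux' });
  const url = `https://image.pollinations.ai/prompt/${encodeURIComponent(prompt)}?${params}`;

  const res = await fetch(url);                  // the response body is the image itself
  const bytes = Buffer.from(await res.arrayBuffer());
  await writeFile('tweet-image.png', bytes);     // stand-in for n8n's binary data handling
}

createImage('Minimalist illustration of a robot scheduling tweets, bold colors').catch(console.error);
```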
by Cj Elijah Garay
📋 WORKFLOW OVERVIEW
Automate reactions for Telegram channel posts: an automated Telegram reaction system for specific posts.

Flow:
1. A user sends a message to a receiver bot
2. AI parses the request (emoji type & quantity)
3. A Code node processes and validates the data
4. A loop sends reactions one by one (a sketch of the underlying Bot API call follows this listing)
5. The user receives a confirmation

Key Features:
- Natural language processing: send a message to the chat bot to react to a post on a different channel
- Rotates through your bot tokens, so if you use 100 bots you can add up to 100 reactions per post of your choice
- Rate limit protection
- Error handling with helpful messages

You first need to add the bots you personally own (created via BotFather) to the channel where you want them to react, and allow them to manage messages.

Required Bot Permissions:
- The bot must be an administrator: it needs to be added as an admin to the channel (regular member status won't work for reactions).
- Specific admin rights needed when adding the bot as admin:
  ✅ "Post Messages" (this is actually the key permission needed)
  ✅ "Add Subscribers" (optional, but sometimes required depending on channel settings)

Credentials needed: Target Channel ID, bot tokens, receiver bot token, OpenAI API key

Example Usage:
"https://t.me/channel/123 needs 10 hearts and 10 fire reactions"

If you need help, contact me at: elijahmamuri@gmail.com
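The per-bot reaction step in the loop above can be sketched as a call to the Telegram Bot API's setMessageReaction method; rotating through bot tokens just repeats the same call with a different token. The channel ID, message ID, and token environment variables below are placeholders, and the allowed reaction emoji set should be checked against the Bot API docs.

```typescript
// Sketch only: one reaction from one bot, repeated per token with a small delay.
const CHANNEL_ID = '@your_channel';   // assumption: or a numeric -100... chat ID
const MESSAGE_ID = 123;               // the post number from the t.me link
const BOT_TOKENS = [process.env.BOT_TOKEN_1!, process.env.BOT_TOKEN_2!];

async function react(token: string, emoji: string): Promise<void> {
  const res = await fetch(`https://api.telegram.org/bot${token}/setMessageReaction`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      chat_id: CHANNEL_ID,
      message_id: MESSAGE_ID,
      reaction: [{ type: 'emoji', emoji }], // emoji must be one Telegram allows for reactions
    }),
  });
  const body = await res.json();
  if (!body.ok) throw new Error(`Bot rejected reaction: ${body.description}`);
}

// One reaction per bot, spaced out a little to stay clear of rate limits.
(async () => {
  for (const token of BOT_TOKENS) {
    await react(token, '❤');
    await new Promise((r) => setTimeout(r, 1500));
  }
})();
```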
by Anan
⚡ Auto Rename n8n Workflow Nodes with AI ✨

This workflow uses AI to automatically generate clear and descriptive names for every node in your n8n workflows. It analyzes each node's type, parameters, and connections to create meaningful names, making your workflows instantly readable.

Who is it for?
This workflow is for n8n users who manage complex workflows with dozens of nodes. If you've ever:
- Built workflows full of generic names like HTTP Request 2 or Edit Fields 1
- Struggled to understand your own work after a few weeks
- Copied workflows from others with unclear node names
- Spent hours manually renaming nodes one by one
...then this workflow will save you significant time and effort.

Requirements
- **n8n API Credentials**: Must be configured to allow listing and updating workflows
- **AI Provider Credentials**: An API key for your preferred AI provider (OpenRouter is used currently)

How it works
1. Trigger: Launch via form (select from a dropdown) or manual trigger (quick testing with a pre-selected workflow)
2. Fetch: Retrieve the target workflow's JSON and extract its nodes and connections
3. Generate: Send the workflow JSON to the AI, which creates a unique, descriptive name for every node
4. Validate: Verify the AI mapping covers all original node names
5. Apply: If valid, update all node names, parameter references, and connections throughout the workflow (a sketch of this rename-and-rewire step follows this listing)
6. Save: Save/update the workflow with the renamed nodes and provide links to both the new and previous versions

If validation fails (e.g., the AI missed nodes), the workflow stops with an error. You can modify the error handling to retry or loop back to the AI node.

Setup
1. Connect n8n API credentials: Open any n8n node in the workflow and make sure your n8n API credential is connected
2. Configure AI provider credentials: Open the "OpenRouter" node (or replace it with your preferred AI), add your API credentials, and adjust the model if needed (current: openai/gpt-5.1-codex-mini)
3. Test the workflow: Use the Manual Trigger for quick testing with a pre-selected workflow, or the Form Trigger for a user-friendly interface with workflow selection

Important notice
If you're renaming the currently opened workflow, you must reload the page after execution to see the latest version; n8n doesn't automatically refresh the canvas when workflow versions are updated via the API.

Need help?
If you're facing any issues using this workflow, join the community discussion on the n8n forum.
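A minimal sketch of the validate-then-apply step, assuming the usual n8n workflow JSON shape (a `nodes` array plus a `connections` object keyed by node name). It ignores parameter expressions that reference node names (e.g. `$('Old Name')`), which the real workflow also rewrites, so treat it as an illustration of the idea rather than the template's implementation.

```typescript
// Sketch only: validate the AI's rename mapping, then rename nodes and rewire connections.
type Workflow = {
  nodes: { name: string; [key: string]: unknown }[];
  connections: Record<string, any>;
};

function applyRenames(wf: Workflow, mapping: Record<string, string>): Workflow {
  // Validate: every existing node must have a new name in the AI's mapping.
  const missing = wf.nodes.map((n) => n.name).filter((name) => !(name in mapping));
  if (missing.length > 0) {
    throw new Error(`AI mapping is missing nodes: ${missing.join(', ')}`);
  }

  // Rename the nodes themselves.
  const nodes = wf.nodes.map((n) => ({ ...n, name: mapping[n.name] }));

  // Rewrite connections: both the source keys and every target "node" reference.
  const connections: Record<string, any> = {};
  for (const [source, outputs] of Object.entries(wf.connections)) {
    connections[mapping[source] ?? source] = JSON.parse(
      JSON.stringify(outputs),
      (key, value) => (key === 'node' && typeof value === 'string' ? mapping[value] ?? value : value),
    );
  }

  return { ...wf, nodes, connections };
}
```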
by Țugui Dragoș
This workflow automatically scores and categorizes new GoHighLevel contacts using AI (GPT-4), then tags and assigns them to the appropriate team member based on their score. Hot leads also trigger a Slack notification for immediate follow-up.

What does it do?
- Triggers when a new contact is added in GoHighLevel.
- Fetches full contact details and recent engagement data.
- Uses AI (GPT-4) to analyze and score the lead (1-100), categorize it (Hot, Warm, Cold), and provide an explanation.
- Tags the contact in GoHighLevel based on the score.
- Assigns the lead to the correct sales or nurturing team member.
- Sends a Slack alert for Hot leads to ensure fast response.

Use case
Use this workflow to automate lead qualification and assignment in sales teams using GoHighLevel. It helps prioritize high-quality leads, ensures fast follow-up, and reduces manual work.

How to configure
1. GoHighLevel API: Set your GoHighLevel API URL and API key in the Workflow Configuration node. Update user IDs for assignment as needed.
2. Slack Integration: Add your Slack webhook URL or credentials in the Slack Notify Hot Lead node.
3. AI Provider: Configure your OpenAI (or compatible) credentials in the AI Lead Scoring (GPT-4) node.
4. Adjust thresholds: If needed, change the score thresholds in the IF nodes to match your business logic (a sketch of this threshold logic follows this listing).
5. Activate the workflow: Once configured, activate the workflow to start processing new leads automatically.

Tip: You can further customize the workflow to fit your sales process, add more notifications, or integrate with other tools as needed.
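One way the score-to-category-to-routing logic can look is sketched below. The 80/50 cutoffs and the GoHighLevel user IDs are placeholders rather than values from the workflow; mirror whatever your IF nodes and team assignments actually use.

```typescript
// Sketch only: map an AI score (1-100) to a category, a tag, and an assignee.
type Category = 'Hot' | 'Warm' | 'Cold';

function categorize(score: number): Category {
  if (score >= 80) return 'Hot';   // immediate follow-up + Slack alert
  if (score >= 50) return 'Warm';  // standard sales queue
  return 'Cold';                   // nurturing sequence
}

const assigneeByCategory: Record<Category, string> = {
  Hot: 'ghl_user_closer',          // placeholder GoHighLevel user IDs
  Warm: 'ghl_user_sales',
  Cold: 'ghl_user_nurture',
};

function route(score: number) {
  const category = categorize(score);
  return {
    category,
    tag: `lead-${category.toLowerCase()}`,
    assignedTo: assigneeByCategory[category],
    notifySlack: category === 'Hot',
  };
}

console.log(route(87)); // { category: 'Hot', tag: 'lead-hot', assignedTo: 'ghl_user_closer', notifySlack: true }
```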
by Richard Besier
How This Works
This automation scrapes leads from Apollo using the Apify scraper, filters out those that don't have an email or URL included, scrapes each lead's website content, and writes personalised icebreakers and subject lines based on that content. A sketch of the filtering step follows this listing.

Set Up (Step-by-Step)
1. Connect the API keys for the Apify scraper mentioned in the workflow sticky note.
2. Insert the Apollo URL and the number of leads you want to scrape.
3. Connect your Slack account (if needed).

Reach Out To Me
Send me an email if you need further assistance: richard@advetica-systems.com
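A minimal sketch of the lead-filtering step, assuming both an email and a website URL are required (the email for outreach, the URL so the site can be scraped for the icebreaker). The field names `email` and `website_url` are assumptions; match them to the Apify scraper's actual output fields.

```typescript
// Sketch only: drop leads that lack either an email address or a website URL.
interface Lead {
  name: string;
  email?: string | null;
  website_url?: string | null;
}

function hasContactInfo(lead: Lead): boolean {
  const hasEmail = !!lead.email && lead.email.includes('@');
  const hasUrl = !!lead.website_url && lead.website_url.startsWith('http');
  return hasEmail && hasUrl; // only leads with both go on to icebreaker writing
}

const leads: Lead[] = [
  { name: 'Ada', email: 'ada@example.com', website_url: 'https://example.com' },
  { name: 'Bob', email: null, website_url: 'https://example.org' },
];
console.log(leads.filter(hasContactInfo)); // -> only Ada
```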
by Robert Breen
This n8n workflow pulls campaign data from Google Sheets, summarizes it using OpenAI, and sends a performance recap via Outlook email.

✅ Step 1: Connect Google Sheets
1. In n8n, go to Credentials → click New Credential
2. Select Google Sheets OAuth2 API
3. Log in with your Google account and authorize
4. Use a spreadsheet with:
   - Column names in the first row
   - Data in rows 2–100

Example format: 📄 Sample Marketing Sheet

✅ Step 2: Connect OpenAI
1. Go to OpenAI API Keys
2. Make sure you have a payment method set under Billing
3. In n8n, create a new OpenAI API credential
4. Paste your API key and save

A sketch of the kind of summarization request this workflow sends to OpenAI follows below.

📬 Need Help?
Feel free to contact me if you run into issues:
📧 robert@ynteractive.com
🔗 LinkedIn
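Here is a hedged sketch of building a campaign-recap prompt from sheet rows and sending it to OpenAI's chat completions endpoint. The column names (campaign, spend, clicks, conversions) and the model choice are illustrative assumptions, not the workflow's exact configuration; the returned text would become the body of the Outlook email.

```typescript
// Sketch only: summarize campaign rows into a short performance recap.
const OPENAI_KEY = process.env.OPENAI_API_KEY!;

type CampaignRow = { campaign: string; spend: number; clicks: number; conversions: number };

async function summarize(rows: CampaignRow[]): Promise<string> {
  const table = rows
    .map((r) => `${r.campaign}: spend=$${r.spend}, clicks=${r.clicks}, conversions=${r.conversions}`)
    .join('\n');

  const res = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: { Authorization: `Bearer ${OPENAI_KEY}`, 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'gpt-4o-mini', // illustrative model choice
      messages: [
        { role: 'system', content: 'You are a marketing analyst. Write a short, plain-English performance recap.' },
        { role: 'user', content: `Summarize this week's campaign data:\n${table}` },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content; // this text becomes the Outlook email body
}

summarize([
  { campaign: 'Spring Sale', spend: 1200, clicks: 3400, conversions: 85 },
  { campaign: 'Brand Search', spend: 800, clicks: 2100, conversions: 40 },
]).then(console.log);
```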