by Harshil Agrawal
This workflow registers your audience for an event on Demio via a Typeform submission.
Typeform Trigger node: triggers the workflow when a form response is submitted. Based on your use case, you may use a different form platform; in that case, replace the Typeform Trigger node with the trigger node for that platform.
Demio node: registers a user for an event, using the details from the Typeform response.
by q
This workflow automatically notifies the team in a Slack channel when code in a GitHub repository gets a new release.

Prerequisites
- A GitHub account and credentials
- A Slack account and credentials

Nodes
- GitHub Trigger node: triggers the workflow when a release event takes place in the specified repository.
- Slack node: posts a message in a specified channel with the text "New release is available in {repository name}", along with further details and a link to the release (see the sketch below).
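As a reference, here is a minimal sketch of how the Slack message text could be assembled from the GitHub Trigger output in a Code node. The field names follow the standard GitHub release webhook payload; verify them against your trigger's actual output, since this is an illustration rather than the template's exact expression.

```javascript
// Code node sketch: build the Slack message from the GitHub Trigger output.
// Field names (repository.name, release.tag_name, release.html_url) follow the
// standard GitHub "release" webhook payload; adjust paths to your trigger's data.
const event = $input.first().json;

const repoName = event.repository?.name ?? 'unknown repository';
const tag = event.release?.tag_name ?? '';
const url = event.release?.html_url ?? '';

return [{
  json: {
    text: `New release is available in ${repoName}: ${tag}\n${url}`,
  },
}];
```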
by Robin Geuens
Overview
Turn your keyword research into a clear, fact-based content outline with this workflow. It splits your keyword into 5-6 subtopics, makes research questions for those subtopics, and uses Tavily to pull answers from real search results. This way your outline is based on real data, not just AI training data, so you can create accurate and reliable content.

How it works
- Enter a keyword in the form to start the workflow.
- The OpenAI node splits the keyword into 5-6 research subtopics and makes a research question for each one. These questions will be used to enrich the outline later on.
- We split the research questions into separate items so we can process them one by one.
- Each research question is sent to Tavily. Tavily searches the web for answers and returns a short summary.
- Next, we add the answers to our JSON sections.
- We take all the separate items and join them into one list again.
- The JSON outline is converted into Markdown using a Code node. The code takes the JSON headers, turns them into Markdown headings (level 2), and puts the answers underneath (see the sketch below).

Setup steps
- Get an OpenAI API key and set up your credentials inside n8n.
- Sign up for a Tavily account and get an API key; you can use a free account for testing.
- Install the Tavily community node. If you don't want to use a community node, you can call Tavily directly using an HTTP Request node. Check their API reference for what endpoints to call.
- Run the workflow and enter the keyword you want to target in the form.
- Adjust the workflow to decide what to do with the Markdown outline.

Requirements
- An OpenAI API key
- A Tavily account
- The Tavily community node installed
- (Optional) If you don't want to use the Tavily community node, use a regular HTTP Request node and call the API directly. Check their API reference for what endpoints to call.

Workflow customizations
- Instead of using a form to enter your keyword, you can keep all your research in a Google Doc and go through it row by row.
- You can add another AI node at the end to turn the outline into a full article.
- You can put the outline in a Google Doc and send it to a writer using the Google Docs node and the Gmail node.
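A minimal sketch of what that JSON-to-Markdown Code node could look like. It assumes the incoming item holds a `sections` array with `header` and `answer` fields; those names are illustrative, so match them to the structure your workflow actually produces.

```javascript
// Code node sketch: convert the JSON outline into Markdown.
// Assumes each incoming item has a `sections` array with `header` and `answer`
// fields -- these names are illustrative, adjust them to your data.
const { sections = [] } = $input.first().json;

const markdown = sections
  .map(section => `## ${section.header}\n\n${section.answer ?? ''}`)
  .join('\n\n');

return [{ json: { markdown } }];
```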
by Eugene
Who is this for
- Marketing teams tracking AI SEO performance
- Content strategists planning editorial calendars
- SEO teams doing competitive intelligence

What this workflow does
Identifies content opportunities by analyzing where competitors outrank you in AI search and traditional SEO.

What you'll get
- AI visibility gaps across ChatGPT, Perplexity, and Gemini
- Keyword gaps with search volume and difficulty
- Competitor backlink authority metrics
- Prioritized opportunities with HIGH/MEDIUM/LOW scoring
- Actionable recommendations for each gap

How it works
- Fetches AI search visibility for your domain and competitor
- Compares metrics across ChatGPT, Perplexity, and Gemini
- Extracts the competitor's top-performing prompts and keywords
- Analyzes competitor backlink authority
- Calculates opportunity scores and prioritizes gaps (see the sketch below)
- Exports ranked opportunities to Google Sheets

Requirements
- Self-hosted n8n instance
- SE Ranking community node installed
- SE Ranking API token (Get one here)
- Google Sheets account (optional)

Setup
- Install the SE Ranking community node
- Add your SE Ranking API credentials
- Update the Configuration node with your domain and competitor
- Connect Google Sheets for export (optional)

Customization
- Change source for different regions (us, uk, de, fr, etc.)
- Adjust volume/difficulty thresholds in the Code nodes
- Modify priority scoring weights
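The template does not publish its exact scoring formula, so the following is a hypothetical sketch of the kind of logic its Code nodes could use to score and bucket gaps. The field names, weights, and thresholds are all assumptions you would tune to your own data.

```javascript
// Hypothetical scoring sketch -- field names, weights, and thresholds are
// illustrative, not the template's actual formula.
const items = $input.all();

return items.map(item => {
  const { searchVolume = 0, difficulty = 100, competitorPosition = 100 } = item.json;

  // Favor high-volume, low-difficulty keywords where the competitor already ranks well.
  const score = (searchVolume / 1000)
    * (1 - difficulty / 100)
    * (competitorPosition <= 10 ? 1.5 : 1);

  const priority = score >= 5 ? 'HIGH' : score >= 2 ? 'MEDIUM' : 'LOW';

  return { json: { ...item.json, score: Number(score.toFixed(2)), priority } };
});
```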
by Guillaume Duvernay
This template provides a straightforward technique to measure and raise awareness about the environmental impact of your AI automations. By adding a simple calculation step to your workflow, you can estimate the carbon footprint (in grams of CO₂ equivalent) generated by each call to a Large Language Model. Based on the open methodology from Ecologits.ai, this workflow empowers you to build more responsible AI applications. You can use the calculated footprint to inform your users, track your organization's impact, or simply be more mindful of the resources your workflows consume.

Who is this for?
- **Environmentally-conscious developers:** Build AI-powered applications with an awareness of their ecological impact.
- **Businesses and organizations:** Track and report on the carbon footprint of your AI usage as part of your sustainability goals.
- **Any n8n user using AI:** A simple and powerful snippet that can be added to almost any AI workflow to make its invisible environmental costs visible.
- **Educators and advocates:** Use this as a practical tool to demonstrate and discuss the real-world impact of AI technologies.

What problem does this solve?
- **Makes the abstract tangible:** The environmental cost of a single AI call is often overlooked. This workflow translates it into a concrete, measurable number (grams of CO₂e).
- **Promotes responsible AI development:** Encourages builders to consider the efficiency of their prompts and models by showing the direct impact of the generated output.
- **Provides a standardized starting point:** Offers a simple, transparent, and extensible method for carbon accounting in your AI workflows, based on a credible, open-source methodology.
- **Facilitates transparent communication:** Gives you the data needed to transparently communicate the impact of your AI features to stakeholders and users.

How it works
This template demonstrates a simple calculation snippet that you can adapt and add to your own workflows.
1. Set conversion factor: A dedicated Conversion factor node at the beginning of the workflow holds the gCO₂e-per-token value. This makes it easy to configure.
2. AI generates output: An AI node (in this example, a Basic LLM Chain) runs and produces a text output.
3. Estimate token count: The Calculate gCO₂e node takes the character length of the AI's text output and divides it by 4. This provides a reasonable estimate of the number of tokens generated.
4. Calculate carbon footprint: The estimated token count is then multiplied by the conversion factor defined in the first node. The result is the carbon footprint for that single AI call (see the sketch after the setup steps).

Setup
1. Set your conversion factor (critical step): The default factor (0.0612) is for GPT-4o hosted in the US. Visit ecologits.ai/latest to find the specific conversion factor for your AI model and server region, then replace the default value in the Conversion factor node.
2. Integrate the snippet into your workflow: Copy the Conversion factor and Calculate gCO₂e nodes from this template. Place the Conversion factor node near the start of your workflow (before your AI node) and the Calculate gCO₂e node after your AI node.
3. Link your AI output: Click on the Calculate gCO₂e node. In the AI output field, replace the expression with the output from your AI node (e.g., {{ $('My OpenAI Node').item.json.choices[0].message.content }}). The carbon calculation will now work with your data.
4. Activate your workflow. The carbon footprint will now be calculated with each execution.
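For reference, here is an equivalent Code-node sketch of the calculation the Calculate gCO₂e node performs. The Conversion factor node name comes from the template, but the field names (`conversionFactor`, `text`) are assumptions; adjust them to match your own data paths.

```javascript
// Equivalent Code-node sketch of the "Calculate gCO2e" step.
// The "Conversion factor" node name is the template's; the field names
// (conversionFactor, text) are assumptions -- adjust to your data.
const factor = $('Conversion factor').first().json.conversionFactor; // e.g. 0.0612 for GPT-4o (US)
const aiText = $input.first().json.text ?? '';

// Rough token estimate: ~4 characters per token.
const estimatedTokens = aiText.length / 4;
const gco2e = estimatedTokens * factor;

return [{ json: { text: aiText, estimatedTokens, gco2e } }];
```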
Taking it further
- **Improve accuracy with token counts:** If your AI node (like the native **OpenAI** node) directly provides the number of output tokens (e.g., completion_tokens), use that number instead of estimating from the text length. This will give you a more precise calculation.
- **Calculate total workflow footprint:** If you have multiple AI nodes, add a calculation step after each one. Then add a final **Set** node at the end of your workflow to sum all the individual gCO₂e values (a sketch follows below).
- **Display the impact:** Add the final AI output gCO₂e value to your workflow's results, whether it's a Slack message, an email, or a custom dashboard, to keep the environmental impact top-of-mind.
- **A note on AI agents:** This estimation method is difficult to apply accurately to AI Agents at this time, as the token usage of their intermediary "thinking" steps is not yet exposed in the workflow data.
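For the total-footprint idea, here is a minimal sketch of the summing step, shown as a Code node rather than a Set node for brevity. The node names referenced in the expressions are placeholders for whatever you called your own calculation steps.

```javascript
// Code node sketch: sum the gCO2e values produced by each calculation step.
// "Calculate gCO2e 1" / "Calculate gCO2e 2" are placeholder node names --
// replace them with the names of your own calculation nodes.
const totalGco2e =
  ($('Calculate gCO2e 1').first().json.gco2e ?? 0) +
  ($('Calculate gCO2e 2').first().json.gco2e ?? 0);

return [{ json: { totalGco2e } }];
```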
by Victor Manuel Lagunas Franco
Generate complete illustrated stories using AI. This workflow creates engaging narratives with custom DALL-E 3 images for each scene and saves everything to Firebase.

How it works
- User fills out a form with story topic, language, art style, and number of scenes
- GPT-4 generates a complete story with scenes, characters, and image prompts
- DALL-E 3 creates unique illustrations for each scene
- Images are uploaded to Firebase Storage for permanent hosting
- Complete story data is saved to Firestore
- Returns JSON with the full story and image URLs

Set up steps (10-15 minutes)
- Add your OpenAI API credentials
- Add Google Service Account credentials for Firebase
- Update 3 variables in the Code nodes: OPENAI_API_KEY, FIREBASE_BUCKET, FIREBASE_PROJECT_ID (see the sketch below)
- Activate and test with the form

Features
- 12 languages supported
- 10 art styles to choose from
- 1-12 scenes per story
- Automatic image hosting on Firebase
- Full story saved to Firestore database
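A minimal sketch of how those three variables might look at the top of the template's Code nodes. The placeholder values are illustrative; the surrounding code depends on your Firebase project, and for the OpenAI key you may prefer n8n credentials over hard-coding where possible.

```javascript
// Configuration variables referenced by the template's Code nodes.
// The values below are placeholders -- replace them with your own project details.
const OPENAI_API_KEY = 'sk-...';                    // your OpenAI API key
const FIREBASE_BUCKET = 'your-project.appspot.com'; // Firebase Storage bucket name
const FIREBASE_PROJECT_ID = 'your-project-id';      // Firebase / GCP project ID
```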
by Ali Khosravani
This workflow automatically generates realistic comments for your WordPress articles using AI. It makes your blog look more active, improves engagement, and can even support SEO by adding keyword-relevant comments.

How It Works
- Fetches all published blog posts from your WordPress site via the REST API.
- Builds a tailored AI prompt using the article's title, excerpt, and content.
- Uses OpenAI to generate a short, natural-sounding comment (some positive, some neutral, some longer, some shorter).
- Assigns a random commenter name and email.
- Posts the generated comment back to WordPress (see the sketch below).

Requirements
- n8n version 1.49.0 or later (recommended).
- Active OpenAI API key.
- WordPress site with REST API enabled.
- WordPress API credentials (username + application password).

Setup Instructions
- Import this workflow into n8n.
- Add your credentials in n8n > Credentials: OpenAI API (API key) and WordPress API (username + application password).
- Replace the sample URL https://example.com with your own WordPress site URL.
- Execute manually or schedule it to run periodically.

Categories
AI & Machine Learning, WordPress, Content Marketing, Engagement

Tags
ai, openai, wordpress, comments, automation, engagement, n8n
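For reference, the final step boils down to a POST against the standard WordPress comments endpoint. Below is a standalone Node.js (18+, run as an ES module) sketch of that request; the workflow itself can do the same thing with an HTTP Request node. The endpoint and field names are the standard WordPress REST API, while the concrete values are placeholders.

```javascript
// Standalone sketch of the comment-posting request (Node.js 18+, ES module).
// Endpoint and fields follow the standard WordPress REST API; values are placeholders.
const siteUrl = 'https://example.com';
const auth = Buffer.from('username:application-password').toString('base64');

const response = await fetch(`${siteUrl}/wp-json/wp/v2/comments`, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Basic ${auth}`,
  },
  body: JSON.stringify({
    post: 123,                                      // ID of the post being commented on
    author_name: 'Jane Reader',                     // randomly generated commenter name
    author_email: 'jane.reader@example.com',        // randomly generated commenter email
    content: 'Great write-up, thanks for sharing!', // AI-generated comment text
  }),
});

console.log(response.status, await response.json());
```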
by Joel Cantero
🚀 Try It Out!
YouTube Transcript API Extractor (Any Public Video): extracts a clean transcript from a videoId using youtube-transcript.io.

🎯 Use Cases
- AI summaries, sentiment analysis, keyword extraction
- Internal indexing/SEO
- Content pipelines (blog/newsletter)
- Batch transcript processing

🔄 How It Works (5 Steps)
1. 📥 Input: youtubeVideoId, apiToken
2. 🌐 API: POST to youtube-transcript.io
3. 🧩 Parse: Normalizes the response format
4. 🧹 Clean: Normalizes text and whitespace (see the sketch below)
5. ✅ Output: Transcript + metrics (wordCount/charCount)

🚀 How to Use
Payload: {"youtubeVideoId": "xObjAdhDxBE", "apiToken": "xxxxxxxxxx"}

⚙️ Setup
This sub-workflow is intended to be called from another workflow (Execute Workflow Trigger).
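A minimal sketch of the clean-and-measure step as an n8n Code node. It assumes the parse step has already produced a `transcript` string on the item; that field name is illustrative, so adjust it to your parse step's actual output.

```javascript
// Code node sketch: normalize the transcript text and compute metrics.
// Assumes the previous step put the raw transcript on item.json.transcript;
// adjust the field name to match your parse step's output.
const raw = $input.first().json.transcript ?? '';

// Collapse runs of whitespace/newlines and trim the ends.
const transcript = raw.replace(/\s+/g, ' ').trim();

return [{
  json: {
    transcript,
    wordCount: transcript ? transcript.split(' ').length : 0,
    charCount: transcript.length,
  },
}];
```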
by Biznova
Transform Product Photos into Marketing Images with AI
Made by Biznova | TikTok

🎯 Who's it for
E-commerce sellers, social media marketers, small business owners, and content creators who need professional product advertising images without expensive photoshoots or graphic designers.

✨ What it does
This workflow automatically transforms simple product photos into polished, professional marketing images featuring:
- Professional models showcasing your product
- Aesthetically pleasing, contextual backgrounds
- Professional lighting and composition
- Lifestyle scenes that help customers envision using the product
- Commercial-ready quality suitable for ads and e-commerce

🚀 How it works
1. Upload your basic product photo via the web form
2. AI analyzes your product and generates a complete marketing scene
3. Download your professional marketing image automatically
4. Use it immediately in ads, social media, or product listings

⚙️ Setup Requirements
1. OpenRouter Account: Create a free account at openrouter.ai
2. API Key: Generate your API key from the OpenRouter dashboard
3. Add Credentials: Configure the OpenRouter API credentials in the "AI Marketing Image Generator" node
4. Test: Upload a sample product image to test the workflow

🎨 How to customize
- **Edit the prompt** in the "AI Marketing Image Generator" node to match your brand style
- **Adjust file formats** in the upload form (currently accepts JPG/PNG)
- **Modify the response message** in the final form node
- **Add your branding** by including brand colors or style preferences in the prompt

💡 Pro Tips
- Use high-resolution product images for best results
- Test different prompt variations to find your ideal style
- Save successful prompts for consistent brand imagery
- Batch process multiple products by running the workflow multiple times

🔧 Quick Setup Guide
Prerequisites
- OpenRouter account (Sign up here)
- API key from the OpenRouter dashboard

Configuration Steps
1. Click on the "AI Marketing Image Generator" node
2. Add your OpenRouter API credentials
3. Save and activate the workflow
4. Test with a product image

Customization
To change the image style:
- Edit the prompt in the "AI Marketing Image Generator" node
- Add specific instructions about colors, mood, or setting
- Include brand-specific requirements

Example custom prompt additions:
- "Use a minimalist white background"
- "Feature a modern, urban setting"
- "Include warm, natural lighting"
- "Show the product in a luxury lifestyle context"
by PDF Vector
Legal professionals spend countless hours manually checking citations and building citation indexes for briefs, memoranda, and legal opinions. This workflow automates the extraction, validation, and analysis of legal citations from any legal document, including scanned court documents, photographed case files, and image-based legal materials (PDFs, JPGs, PNGs).

Target Audience: Attorneys, paralegals, legal researchers, judicial clerks, law students, and legal writing professionals who need to extract, validate, and manage legal citations efficiently across multiple jurisdictions.

Problem Solved: Manual citation checking is extremely time-consuming and error-prone. Legal professionals struggle to ensure citation accuracy, verify that case law is still good law, and build comprehensive citation indexes. This template automates the entire citation management process while ensuring compliance with citation standards like Bluebook format.

Setup Instructions:
1. Configure Google Drive credentials for secure legal document access
2. Install the PDF Vector community node from the n8n marketplace
3. Configure PDF Vector API credentials
4. Set up connections to legal databases (Westlaw, LexisNexis if available)
5. Configure jurisdiction-specific citation rules
6. Set up validation preferences and citation format standards
7. Configure citation reporting and export formats

Key Features:
- Automatic retrieval of legal documents from Google Drive
- OCR support for handwritten annotations and scanned legal documents
- Comprehensive extraction of case law, statutes, regulations, and academic citations
- Bluebook citation format validation and standardization
- Automated Shepardizing to verify cases are still good law
- Pinpoint citation detection and parenthetical extraction
- Citation network analysis showing case relationships
- Support for federal, state, and international law references

Customization Options:
- Set jurisdiction-specific citation rules and formats
- Configure automated alerts for superseded statutes or overruled cases
- Customize citation validation criteria and standards
- Set up integration with legal research platforms (Westlaw, LexisNexis)
- Configure export formats for different legal document types
- Add support for specialty legal domains (tax law, patent law, etc.)
- Set up collaborative citation checking for legal teams

Implementation Details: The workflow uses advanced legal domain knowledge to identify and extract citations in various formats across multiple jurisdictions. It processes both digital and scanned documents, validates citations against legal standards, and builds comprehensive citation networks. The system automatically checks citation accuracy and provides detailed reports for legal document preparation.

Note: This workflow uses the PDF Vector community node. Make sure to install it from the n8n community nodes collection before using this template.
by boagolden
This template backs up WordPress content to GitHub.