by Alexander Bentlund
What this workflow does
This workflow acts as a bridge between your private Google Calendar and your work Outlook calendar. The same approach can be used with other calendar types.

Description
Send a copy of a Google Calendar event to your Outlook work account as a reminder to yourself or co-workers that you are booked for private matters like "Dentist appointment", "Taking kids to Disney Land", etc.

How it works
Create event
- You create a Google Calendar event.
- A trigger in n8n reacts and collects the event info.
- An Outlook event is created with the same information in your Outlook calendar (see the field-mapping sketch below).

Cancel
- You cancel an event in Google Calendar.
- A trigger in n8n reacts and collects the cancelled event info.
- The Outlook node (getAll events) searches for the event in your Outlook calendar.
- If the event is found, it is deleted.
- An email with the details of the cancellation is sent to your Outlook e-mail address.

The n8n Merge node combines the results from the two different nodes that are needed to build the cancelled-event e-mail notification.

Important notice
Use a dedicated Google Calendar for the private events that will be displayed in your work Outlook calendar, so you avoid exposing calendar events you do not wish to share with your co-workers.

Requirements
- Active workflow*
- Google Calendar OAuth2 API
- Microsoft Outlook OAuth2 API

*The Google Calendar trigger only fires while this workflow is active. You can, however, TEST the workflow in the editor by clicking "Test step". You will then receive a response from Google Calendar that shows you exactly what data Google sends.
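For reference, the mapping step between the two calendars can be pictured as the minimal Code node sketch below. It assumes the Google Calendar trigger outputs the standard `summary`/`description`/`start`/`end` fields; the output field names are illustrative and should be matched to whatever your Outlook "create event" node expects.

```javascript
// Code node sketch ("Run Once for All Items"): map Google Calendar events onto
// the fields an Outlook "create event" node expects. Google field names follow
// the standard Calendar API payload; adjust to your actual trigger output.
return items.map((item) => {
  const ev = item.json;
  return {
    json: {
      subject: ev.summary || 'Private appointment',
      body: ev.description || '',
      start: ev.start?.dateTime || ev.start?.date, // all-day events only carry "date"
      end: ev.end?.dateTime || ev.end?.date,
      googleEventId: ev.id, // kept so the cancel branch can locate the Outlook copy later
    },
  };
});
```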
by Hubschrauber
What this workflow does
This (set of) workflow(s) shows how to start multiple sub-workflows asynchronously, in parallel, and then wait for all of them to complete. Normally sub-workflows need to run synchronously, in series; if they are executed asynchronously (to run concurrently, in parallel), there is no easy way to merge or wait for an arbitrary number of them to complete.

This is a "design pattern" template showing one approach for running multiple, data-driven instances of a sub-workflow asynchronously, in parallel (instead of one at a time in series), while still preventing the later steps of the workflow from continuing until all of the sub-workflows have reported back, via callback URL, that they are finished. There are other techniques involving messaging services, database tables, or other external "flow manager" helpers, but this technique accomplishes the goal entirely within n8n.

Setup
To implement this pattern, examine the nodes in the template and modify the incoming data leading to:
- A split-out loop that asynchronously executes a sub-workflow multiple times, in parallel. For instance, each sub-workflow might process one of a list of incoming documents.
- The resumeUrl of the main/parent workflow, provided to all of the sub-workflow executions along with a unique identifier that can be counted later (e.g. a document file name).
- A "wait-for-all" loop that checks whether all sub-workflows have reported back (If node) and builds a unique list of identifiers from the callbacks received from each execution of the sub-workflow (see the sketch below).
- The sub-workflow itself, which should respond immediately (async) and later send a callback request when it has finished processing. The callback request should include the unique identifier value it received when it was started.

This is meant to be a possible answer to questions like this one about running things in parallel, maybe this one about waiting for things to finish, this one about managing sub-batches of things by waiting for each batch, or this one about running things in parallel. The topic of how to do this comes up A LOT, and this is one of the only techniques that (so far) seems to work.
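To make the "wait-for-all" idea concrete, here is a minimal Code node sketch. It assumes each callback posts a body like `{ "id": "<document file name>" }` and that the expected total was stashed earlier on the item as `expectedCount` (a hypothetical field name); the actual template may track these details differently.

```javascript
// Code node sketch ("Run Once for All Items") for the wait-for-all check:
// accumulate unique identifiers from callbacks in workflow static data and
// compare against the expected total before letting the workflow continue.
const staticData = $getWorkflowStaticData('global');
staticData.seenIds = staticData.seenIds || [];

for (const item of items) {
  const callbackId = item.json.body?.id; // identifier returned by a sub-workflow
  if (callbackId && !staticData.seenIds.includes(callbackId)) {
    staticData.seenIds.push(callbackId);
  }
}

const expected = items[0].json.expectedCount ?? 0; // hypothetical field carrying the total
return [{
  json: {
    received: staticData.seenIds.length,
    expected,
    allDone: staticData.seenIds.length >= expected, // feed this into the If node
  },
}];
```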
by Edoardo Guzzi
Simple Social: Instagram Single Image Post with Facebook API

Who is this workflow for?
This workflow is designed for businesses, social media managers, content creators, and developers who need to automate the process of posting single images to Instagram using the Facebook API. It is ideal for anyone looking to streamline their social media posting process, saving time and ensuring consistent content delivery.

Use Case / Problem Solved
Manually posting images and captions on Instagram can be time-consuming, especially for businesses and content creators managing multiple accounts. This workflow automates the process from image preparation to publishing, reducing manual effort and increasing efficiency.

What this workflow does
1. Trigger Initialization: The workflow starts with a manual trigger that can be adapted to other triggers (e.g., HTTP webhook or schedule).
2. Set Parameters: A node sets essential parameters such as the image URL, Instagram business account ID, and caption.
3. Prepare Instagram Media: A node prepares the media for upload using the Facebook API, sending the image and caption for pre-publication processing (see the Graph API sketch at the end of this description).
4. Check Media Upload Status: The workflow verifies whether the media preparation is complete.
5. Conditional Check: If the media preparation succeeded, the workflow proceeds to publish; otherwise it triggers an error-handling path.
6. Publish Media: The media is published on Instagram if the conditions are met.
7. Post-Publish Check: The workflow checks the status after publication.
8. Conditional Check for Publication: If the publication status is "PUBLISHED", it triggers a success path; otherwise it triggers failure handling.
9. Email Notifications: The workflow sends email notifications to report successful or unsuccessful outcomes.

Setup
Here is a quick video in Italian with English subtitles: https://youtu.be/obWJFJvg_6g
- Add API Credentials: Ensure that valid Facebook API credentials are added and configured for use.
- Permissions Required: Ensure your app has the necessary permissions (ads_management, business_management, instagram_basic, instagram_content_publish, pages_read_engagement). App review may be required for external user access.
- Node Configuration: Customize the Set Instagram Parameters node to specify the image URL, caption, and Instagram business account ID.
- Trigger Adaptation: Adapt the initial trigger if needed to fit your workflow's requirements (e.g., schedule, webhook).

How to customize this workflow
- **Change the Image URL and Caption**: Modify the Set Instagram Parameters node to change the image and caption.
- **Trigger Customization**: Replace the manual trigger with other triggers, like a webhook, to automate posting based on external events.
- **Notifications**: Adjust the email nodes to send customized messages or trigger other workflows based on the outcome.

Limitations
- **Image Format**: Only JPEG images are supported. Extended JPEG formats such as MPO and JPS are not compatible.
- **Unsupported Tags**: Shopping tags, branded content tags, and filters are not supported.
- **Instagram TV**: Publishing to Instagram TV is not supported.
- **Rate Limit**: Instagram accounts are limited to 50 API-published posts within a rolling 24-hour period. Carousels count as a single post. Check usage with GET /{ig-user-id}/content_publishing_limit.

Example Usage
Imagine managing a business account that needs consistent posts. You can schedule this workflow or trigger it manually to automatically post images with captions at the right time, ensuring that your audience stays engaged without manual posting efforts.
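Under the hood, publishing a single image is a two-step call against the Instagram Content Publishing endpoints of the Graph API. The sketch below is a plain Node.js outline of those two requests, not the exact HTTP Request node configuration from the template; the Graph API version and the placeholder identifiers are assumptions to adapt to your setup.

```javascript
// Standalone Node 18+ sketch of the "prepare" and "publish" steps:
// 1) create a media container, 2) publish it once processing has finished.
const GRAPH = 'https://graph.facebook.com/v19.0'; // pin to the version your app uses

async function publishSingleImage(igUserId, accessToken, imageUrl, caption) {
  // Step 1: create a media container (pre-publication processing)
  const container = await fetch(`${GRAPH}/${igUserId}/media`, {
    method: 'POST',
    body: new URLSearchParams({ image_url: imageUrl, caption, access_token: accessToken }),
  }).then((r) => r.json());

  // Step 2: publish the container (the workflow checks its status before this call)
  const published = await fetch(`${GRAPH}/${igUserId}/media_publish`, {
    method: 'POST',
    body: new URLSearchParams({ creation_id: container.id, access_token: accessToken }),
  }).then((r) => r.json());

  return published; // { id: "<ig-media-id>" } on success
}
```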
by Alex Kim
Automate Google Analytics Reporting with n8n

This n8n workflow collects, processes, and formats Google Analytics data into a comprehensive HTML report. The report is segmented into three primary categories: Engagement Stats, Search Results, and Country Views. The formatted report can be emailed or saved as a document, and the workflow includes error handling and logging for easier debugging.

Overview

Purpose
To automate the extraction, processing, and presentation of Google Analytics data in a visually appealing, structured format for easier insights and decision-making.

Features
- **Data Parsing**: Individual parsers process raw Google Analytics data for different time periods and categories.
- **Data Aggregation**: Combines parsed data into a single structured JSON object.
- **HTML Report Generation**: Formats the aggregated data into an HTML table with color-coded segments for better readability (see the sketch at the end of this description).
- **Email or Document Output**: The formatted report can be emailed or saved as a Google Doc (requires additional setup).
- **Error Handling**: Includes checks for missing data and detailed error messages for debugging.

Workflow Steps
1. Data Fetching: Six separate Google Analytics data pulls:
   - Page Engagement Stats (This Week and Prior Week)
   - Google Search Results (This Week and Prior Week)
   - Country Views (This Week and Prior Week)
2. Data Parsing: Each data pull is processed by a dedicated parser node to generate a URL-safe string. Example nodes: Parse - Get Page Engagement This Week, Parse - Country Views Prior Week.
3. Data Aggregation: Aggregates the parsed data into a structured JSON object using the Aggregate Data node, ensuring consistency and handling missing or malformed data.
4. HTML Report Generation: Creates a formatted HTML report with color-coded tables for each segment (Engagement Stats: green, Search Results: blue, Country Views: orange), including headers and neatly formatted tables for each data set.
5. Output: The report can be sent via email using the Gmail API or saved to Google Docs. Example nodes: Gmail node for email delivery; Google Docs node for saving the report as a document.

Requirements

Prerequisites
- **Google Cloud Setup**: Enable the Google Analytics API, enable the Gmail API (if using email output), and generate OAuth credentials for API access.
- **n8n Installation**: Self-hosted n8n instance with the required nodes (Gmail, Google Docs, etc.), or a free cloud-based n8n account.

Environment Variables
Ensure API credentials and tokens are set up in the n8n environment, and update the respective nodes with the client ID, client secret, and access tokens.

Configuration
- Google Analytics: Configure the Get Report nodes with the appropriate property ID and metrics, and ensure the correct date ranges are selected for each node.
- Formatting Node: The Format Data node processes the aggregated data and generates the HTML content. Customize the HTML styling and segment colors as needed.
- Email Node: Configure the Gmail node with OAuth credentials, and set the recipient email address and subject line dynamically.

Error Handling

Common Issues
- Authentication errors: Ensure the OAuth credentials are correct and verify that the APIs are enabled in the Google Cloud Console.
- Empty data: Check the raw data from Google Analytics, and validate the property ID and query parameters in the Get Report nodes.
- Parsing errors: Ensure the parser nodes are correctly configured and match the expected input format.

Debugging
Use debug logs in each node to identify data-flow issues, and add error-handling nodes to capture and log issues during execution.
Example Usage
1. Run the Workflow: Trigger the workflow to fetch, process, and format Google Analytics data.
2. Verify Output: Check the formatted HTML output in the debug logs, and ensure the email or Google Doc contains the correctly formatted report.

Future Enhancements
- Add support for additional metrics or dimensions.
- Integrate with Slack for notifications.
- Enable scheduling for automated reports.
- Add a visual dashboard for real-time analytics.
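As an illustration of the report-building step, here is a rough Code node sketch that turns the aggregated data into color-coded HTML tables. The shape of the aggregated object (segment names mapping to rows with `metric`/`thisWeek`/`priorWeek`) is an assumption; adapt it to what your Aggregate Data node actually emits.

```javascript
// Code node sketch ("Run Once for All Items"): build one color-coded HTML
// table per segment from the aggregated JSON produced upstream.
const colors = { 'Engagement Stats': '#2e7d32', 'Search Results': '#1565c0', 'Country Views': '#ef6c00' };

function segmentTable(name, rows) {
  const header = `<h3 style="color:${colors[name] || '#333'}">${name}</h3>`;
  const body = rows
    .map((r) => `<tr><td>${r.metric}</td><td>${r.thisWeek}</td><td>${r.priorWeek}</td></tr>`)
    .join('');
  return `${header}<table border="1" cellpadding="4">
    <tr><th>Metric</th><th>This Week</th><th>Prior Week</th></tr>${body}</table>`;
}

const aggregated = items[0].json; // assumed output shape of the Aggregate Data node
const html = Object.entries(aggregated)
  .map(([name, rows]) => segmentTable(name, rows))
  .join('<br>');

return [{ json: { html } }]; // pass to the Gmail / Google Docs node
```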
by Zacharia Kimotho
How it works
This workflow gets Search Console results data and exports it to Google Sheets. This makes it easier to visualize the data and handle other SEO-related tasks and activities without having to log into Search Console.

Setup and use
- Set your desired schedule.
- Enter your desired domain.
- Connect to your Google Sheets or make a copy of this sheet.

Detailed Setup

**Inputs and Outputs**
- Input: API response from Google Search Console containing keyword, page, and date data.
- Output: Entries written to Google Sheets containing keyword data, clicks, impressions, CTR, and positions.

Setup Instructions

**Prerequisites**
- An n8n instance set up and running.
- An active Google account with access to Google Search Console and Google Sheets.
- Google OAuth 2.0 credentials for API access.

**Step-by-Step Setup**
1. Open n8n and create a new workflow.
2. Add the nodes as described in the JSON.
3. Configure the Google OAuth2 credentials in n8n to enable API access.
4. Set your domain in the Set your domain node.
5. Customize the Google Sheets document URLs to your personal sheets.
6. Adjust the schedule in the Schedule Trigger node as per your requirements.
7. Save the workflow.

**Configuration Options**
- You can customize the date ranges in the body of the HTTP Request nodes (see the sketch below).
- Adjust any fields in the Edit Fields nodes based on different data requirements.

Use Case Examples
- Tracking website performance over time using Search Console metrics.
- Ideal for digital marketers, SEO specialists, and web analytics professionals.
- Compiling performance reports for stakeholders or team reviews.

Running and Troubleshooting
- **Running the Workflow**: Trigger the workflow manually or wait for the schedule to run it automatically.
- **Monitoring Execution**: Check the execution logs in n8n's dashboard to ensure all nodes complete successfully.
- **Common Issues**:
  - Invalid OAuth credentials – ensure credentials are set up correctly.
  - Incorrect Google Sheets URLs – double-check document links and permissions.
  - Scheduling conflicts – make sure the schedule does not overlap with other workflows.
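For reference, the query the HTTP Request node sends follows the Search Console API v3 `searchanalytics.query` method. The sketch below is a Code node that prepares that request; the date expressions assume n8n's Luxon-based `$today` helper, and the dimensions and row limit are just example values.

```javascript
// Code node sketch ("Run Once for All Items") that builds the Search Analytics
// request for the following HTTP Request node.
const siteUrl = encodeURIComponent('https://example.com/'); // your verified property

const body = {
  startDate: $today.minus({ days: 28 }).toFormat('yyyy-MM-dd'),
  endDate: $today.toFormat('yyyy-MM-dd'),
  dimensions: ['query', 'page', 'date'],
  rowLimit: 1000,
};

return [{
  json: {
    url: `https://www.googleapis.com/webmasters/v3/sites/${siteUrl}/searchAnalytics/query`,
    body,
    // Each returned row looks roughly like:
    // { keys: ["keyword", "https://example.com/page", "2024-01-01"],
    //   clicks: 12, impressions: 340, ctr: 0.035, position: 8.2 }
  },
}];
```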
by Jimleuk
This n8n template scrapes a list of AI grants from grants.gov and qualifies them using AI, determining interest and eligibility for the business. It then sends an email alert of interesting items to team members. The template also shows how you can use the "Remove Duplicates" node to simplify deduplication of external listings without having to manage this yourself.

Not particularly interested in AI grants? This template works for other tender websites as long as you're able to scrape them.

How it works
- A scheduled trigger fetches the list of AI grants posted on the grants.gov website in the past day.
- A Remove Duplicates node tracks Grant IDs to filter out grants already processed by the workflow (see the sketch below).
- New grants are summarised and analysed by AI nodes to determine eligibility and interest, and the results are saved to an Airtable database.
- A second scheduled trigger, starting a little later than the first, collects and summarises the new grants.
- The results are compiled into an email template using the HTML node, in the form of a newsletter designed to alert and brief team members on new AI grants.
- This email is sent to a list of subscribers using the Gmail node.

How to use
- Make a copy of the sample Airtable here: https://airtable.com/appiNoPRvhJxz9crl/shrRdP6zstgsxjDKL
- The filters for fetching grants are currently set to the "AI" category. Feel free to change this to include more categories.
- Not interested in grants? This template works for other sources of leads too; just change the endpoint and how you define the item ID to track.

Requirements
- Airtable for the database
- OpenAI for the LLM
Note: these are not hard requirements and can be exchanged for services available to you.

Customising the workflow
- The "Eligibility" criteria at this stage may be better served by identifying hard blockers instead, i.e. certifications, geographical considerations, or certain legal checks. Be sure to mention any hard blockers in the Eligibility prompt.
- Not particularly interested in AI grants? This template works for other tender websites as long as you're able to scrape them.
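If you ever need to replace the Remove Duplicates node (for example on an older n8n version that lacks its cross-execution mode), the same deduplication can be done in a Code node using workflow static data, roughly as sketched below. The `grantId` field name is an assumption; use whatever uniquely identifies a listing on your source site.

```javascript
// Code node sketch ("Run Once for All Items"): drop grants whose ID has been
// seen in a previous run, persisting the seen IDs in workflow static data.
const staticData = $getWorkflowStaticData('global');
staticData.seenGrantIds = staticData.seenGrantIds || [];

const fresh = [];
for (const item of items) {
  const id = item.json.grantId; // assumed unique identifier from the source listing
  if (id && !staticData.seenGrantIds.includes(id)) {
    staticData.seenGrantIds.push(id);
    fresh.push(item);
  }
}
return fresh; // only grants not seen in previous runs continue downstream
```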
by KlickTipp
Community Node Disclaimer: This workflow uses KlickTipp community nodes.

How It Works
- Enhanced Calendly Integration: This workflow processes bookings and cancellations in Calendly, dynamically managing invitee and guest data with KlickTipp.
- Data Transformation: Dates and times are converted into formats (UNIX timestamps) compatible with KlickTipp's API, ensuring seamless data integration (see the sketch at the end of this description).

Key Features
- Calendly Trigger: Captures new bookings or cancellations of events, including participant details.
- Invitee and Guest Subscription in KlickTipp:
  - Adds or updates invitees and guests in KlickTipp based on booking details (event name, time, join link, reschedule link, cancel link, etc.).
  - Tracks and processes cancellations for both invitees and guests.
  - Handles rescheduling intelligently to avoid redundant operations.
- Guest-Specific Operations:
  - Processes guests individually for bookings and cancellations using dynamic arrays of email addresses.
  - Recovers guest data from invitee records for cancellations, since Calendly does not provide guest data upon cancellation.
- Data Processing:
  - Standardizes and validates input fields.
  - Converts phone numbers to a numeric-only format with international prefixes.
  - Transforms dates into UNIX timestamps.
  - Reads out the invitee's name from either of the possible input setups (a single name field vs. separate first-name and last-name fields).
- Error Handling: Validates critical fields like phone numbers, URLs, and dates to prevent incorrect data submissions.

Setup Instructions

Authentication
- Set up the Calendly and KlickTipp nodes in your n8n instance.
- Configure authentication for both the Calendly and KlickTipp nodes.

Custom Field Preparation in KlickTipp
Create the following custom fields in KlickTipp to align with the workflow requirements:

| Field name                        | Field type                 |
|-----------------------------------|----------------------------|
| Calendly_event_name               | Line (Zeile)               |
| Calendly_join_url                 | URL                        |
| Calendly_reschedule_url           | URL                        |
| Calendly_cancel_url               | URL                        |
| Calendly_event_start_datetime     | Date & time (Datum & Zeit) |
| Calendly_event_end_datetime       | Date & time (Datum & Zeit) |
| Calendly_invitee_start_date       | Date (Datum)               |
| Calendly_invitee_end_date         | Date (Datum)               |
| Calendly_invitee_start_time       | Time (Zeit)                |
| Calendly_invitee_end_time         | Time (Zeit)                |
| Calendly_invitee_timezone         | Line (Zeile)               |
| Calendly_invitee_guests_adresses  | Line (Zeile)               |

After creating the fields, allow 10–15 minutes for them to sync. If the fields don't appear, reconnect your KlickTipp credentials.

Field Mapping and Adjustments
Open each KlickTipp node and map the fields to match your setup. The workflow includes placeholders for:
- Invitee details (first name, last name, email, and phone).
- Event details (start/end times, timezone, etc.).

Workflow Logic
1. Trigger via Calendly event booking: A new event booking or cancellation from Calendly initiates the workflow.
2. Data Transformation: Processes raw Calendly event data to ensure compatibility with KlickTipp's API.
3. Add to KlickTipp Subscriber List: Adds invitees and guests to the designated KlickTipp list, including event-specific details.

Benefits
- Efficient lead generation: Contacts from event bookings are automatically imported into KlickTipp and can be used immediately, saving time and increasing the conversion rate.
- Automated processes: Experts can start workflows directly, such as reminder emails or course admissions, reducing administrative effort.
- Error-free data management: The template ensures precise data mapping, avoids manual corrections, and reinforces a professional appearance.
Testing and Deployment
Test the workflow by triggering a Calendly event and verifying the data updates in KlickTipp.

Notes
- Customization: Update the field mappings within the KlickTipp nodes to align with your account setup. This ensures accurate data syncing.
- Resources:
  - Calendly
  - KlickTipp Knowledge Base help article
  - Use KlickTipp Community Node in n8n
  - Automate Workflows: KlickTipp Integration in n8n
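To illustrate the data-transformation step, here is a rough Code node sketch. It assumes the Calendly trigger delivers the standard webhook payload fields used below; the output keys mirror the custom fields from the table above, and everything should be checked against your actual trigger output before use.

```javascript
// Code node sketch ("Run Once for All Items") of the transformation step:
// normalize the phone number, convert ISO dates to UNIX timestamps (seconds),
// and resolve the invitee name from either name-field setup.
const p = items[0].json.payload ?? items[0].json;

// Phone: digits only; assumes an international prefix is already included.
const phone = (p.text_reminder_number || '').replace(/\D/g, '');

// KlickTipp date/time fields expect UNIX timestamps in seconds.
const toUnix = (iso) => (iso ? Math.floor(new Date(iso).getTime() / 1000) : null);

// Calendly may send either a single "name" or separate first/last name fields.
const firstName = p.first_name || (p.name || '').split(' ')[0] || '';
const lastName = p.last_name || (p.name || '').split(' ').slice(1).join(' ') || '';

return [{
  json: {
    email: p.email,
    firstName,
    lastName,
    phone,
    Calendly_event_start_datetime: toUnix(p.scheduled_event?.start_time),
    Calendly_event_end_datetime: toUnix(p.scheduled_event?.end_time),
    Calendly_cancel_url: p.cancel_url,
    Calendly_reschedule_url: p.reschedule_url,
  },
}];
```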
by Johan Denoyer
How it works
1) Extracts all company entries in Agile CRM.
2) Searches for the company name in the French INSEE OpenData database to extract the address and government ID (SIREN).
3) Updates the entries with the data extracted from the French INSEE OpenData database.

The workflow also has a read-only feature to make sure an entry is not overwritten.

Setup steps
- Add your Agile CRM credentials.
- Add your INSEE OpenData credentials.
- Add two company custom fields in your Agile CRM (for the SIREN data and read-only support).
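As a rough illustration of step 2, the sketch below shows a standalone Node.js version of the lookup an HTTP Request node would make. The Sirene endpoint, API version, auth scheme, and response field names all depend on your INSEE subscription, so treat every detail here as an assumption to verify against the current Sirene documentation.

```javascript
// Standalone Node 18+ sketch of the INSEE company lookup by name,
// assuming the Sirene V3 REST API with an OAuth2 bearer token.
async function lookupSiren(companyName, inseeToken) {
  const url =
    'https://api.insee.fr/entreprises/sirene/V3/siren' +
    `?q=denominationUniteLegale:"${encodeURIComponent(companyName)}"&nombre=1`;

  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${inseeToken}` },
  });
  const data = await res.json();

  // A matching "unité légale" carries the SIREN used to update Agile CRM.
  return data?.unitesLegales?.[0]?.siren ?? null;
}
```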
by Mark Shcherbakov
Video Guide
I prepared a comprehensive guide detailing how to automate the parsing of invoices using n8n and LlamaParse, seamlessly capturing and storing vital billing information.

Youtube Link

Who is this for?
This workflow is ideal for finance teams, accountants, and business operations managers who need to streamline invoice processing. It is particularly helpful for organizations seeking to reduce manual entry errors and improve efficiency in managing billing information.

What problem does this workflow solve?
Manually processing invoices can be time-consuming and error-prone. This automation eliminates the need for manual data entry by capturing invoice details directly from uploaded documents and storing structured data efficiently. This enhances productivity and accuracy across financial operations.

What this workflow does
The workflow leverages n8n and LlamaParse to automatically detect new invoices in a designated Google Drive folder, parse essential billing details, and store the extracted data in a structured format. The key functionalities include:
- Real-time detection of new invoices via Google Drive triggers.
- Automated HTTP requests to initiate parsing through LlamaCloud.
- Structured storage of invoice details and line items in a database for future reference.

- Google Drive Integration: Monitors a specific folder in Google Drive for new invoice uploads.
- Parsing with LlamaParse: Automatically sends invoices for parsing and processes the results through webhooks.
- Data Storage in Airtable: Creates records for invoices and their associated line items, allowing for detailed tracking.

Setup

N8N Workflow
1. Google Drive Trigger: Set up a trigger to detect new files in a specified folder dedicated to invoices.
2. File Upload to LlamaParse: Create an HTTP request that sends the invoice file to LlamaParse for parsing, including the relevant header settings and webhook URL (see the sketch below).
3. Webhook Processing: Establish a webhook node to handle the parsed results from LlamaParse, extracting the needed invoice details.
4. Invoice Record Creation: Create initial records for invoices in your database using the parsed details received from the webhook.
5. Line Item Processing: Transform string data into structured line-item arrays and create individual records for each item linked to the main invoice.
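The upload step can be pictured as the standalone sketch below, which follows the LlamaCloud parsing API's upload endpoint. The `webhook_url` form field name and the callback URL are assumptions; confirm the exact parameter names (and the callback payload shape) in the LlamaCloud documentation before relying on them.

```javascript
// Standalone Node 18+ sketch of the "File Upload to LlamaParse" HTTP call.
async function sendToLlamaParse(invoiceBuffer, apiKey) {
  const form = new FormData();
  form.append('file', new Blob([invoiceBuffer], { type: 'application/pdf' }), 'invoice.pdf');
  form.append('webhook_url', 'https://your-n8n.example.com/webhook/llamaparse-result'); // hypothetical n8n webhook URL

  const res = await fetch('https://api.cloud.llamaindex.ai/api/parsing/upload', {
    method: 'POST',
    headers: { Authorization: `Bearer ${apiKey}` },
    body: form,
  });

  const { id } = await res.json(); // parsing job id, useful to correlate the webhook callback
  return id;
}
```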
by Angel Menendez
Analyze & Sort Suspicious Email Contents with ChatGPT and Jira

Who is this for?
This workflow is tailored for IT security teams, managed service providers (MSPs), and organizations aiming to streamline the detection and reporting of phishing emails. It's especially useful for teams handling high email volumes and requiring quick, automated analysis.

What problem is this workflow solving?
Phishing emails pose a significant cybersecurity threat, and manual review processes are time-consuming and prone to human error. This workflow automates the identification of malicious emails, provides AI-driven insights, and generates structured reports, enabling faster and more efficient responses to email-based threats.

What this workflow does
This workflow integrates Gmail or Microsoft Outlook to monitor and capture incoming emails. It processes the email content and headers, converts the email's body to a visual screenshot for clarity, and uses ChatGPT's advanced AI to analyze the email for phishing indicators. Based on the analysis, it categorizes emails as potentially malicious or benign, creating detailed Jira tickets for each case. Attachments, including the email body and screenshots, are automatically uploaded for comprehensive reporting.

Key steps include:
- Email Integration: Captures emails from Gmail or Microsoft Outlook.
- Content Processing: Extracts and organizes email content and metadata.
- AI Analysis: Uses ChatGPT to evaluate email content and headers.
- Classification: Categorizes emails as malicious or benign.
- Automated Reporting: Creates Jira tickets with detailed analysis and attachments.

Setup
1. Authentication: Configure Gmail or Microsoft Outlook credentials in n8n.
2. API Keys: Add credentials for the HTML screenshot service (hcti.io) and OpenAI (see the sketch below for the screenshot call).
3. Jira Configuration: Set up project and issue types in the Jira nodes.
4. Customization: Update sticky notes and nodes to fit your organizational requirements, such as modifying the AI prompt or Jira ticket fields.

How to customize this workflow to your needs
- Adjust email triggers to include or exclude specific senders or subjects.
- Refine the AI prompt in the ChatGPT node to tailor phishing detection criteria.
- Modify Jira ticket content to include additional fields or match specific workflows.

This workflow is ideal for automating email threat detection, reducing response times, and enhancing overall cybersecurity processes. By leveraging AI-powered insights, it helps organizations stay ahead of phishing attacks.
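The screenshot step mentioned in Setup boils down to a single call to hcti.io's image endpoint. Here is a standalone sketch of that request, with the user ID / API key pair coming from your hcti.io account; the returned URL can then be attached to the Jira ticket.

```javascript
// Standalone Node 18+ sketch: render the email body HTML to a hosted image
// via hcti.io before attaching it to the Jira ticket.
async function screenshotEmailBody(html, hctiUserId, hctiApiKey) {
  const res = await fetch('https://hcti.io/v1/image', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: 'Basic ' + Buffer.from(`${hctiUserId}:${hctiApiKey}`).toString('base64'),
    },
    body: JSON.stringify({ html }),
  });

  const { url } = await res.json(); // URL of the rendered screenshot
  return url;
}
```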
by Mark Shcherbakov
Video Guide
I prepared a comprehensive guide detailing how to create a Smart Agent that automates meeting task management by analyzing transcripts, generating tasks in Airtable, and scheduling follow-ups when necessary.

Youtube Link

Who is this for?
This workflow is ideal for project managers, team leaders, and business owners looking to enhance productivity during meetings. It is particularly helpful for those who need to convert discussions into actionable items swiftly and effectively.

What problem does this workflow solve?
Managing action items from meetings can often lead to missed tasks and poor follow-up. This automation alleviates that issue by automatically generating tasks from meeting transcripts, keeping everyone informed about their responsibilities and streamlining communication.

What this workflow does
The workflow leverages n8n to create a Smart Agent that listens for completed meeting transcripts, processes them using AI, and generates tasks in Airtable. Key functionalities include:
- Capturing completed meeting events through webhooks.
- Extracting relevant meeting details such as transcripts and participants using API calls.
- Generating structured tasks from meeting discussions and sending notifications to clients.

- Webhook Integration: Listens for meeting completion events to trigger subsequent actions.
- API Requests for Data: Pulls necessary details like transcripts and participant information from Fireflies.
- Task and Notification Generation: Automatically creates tasks in Airtable and notifies clients of their responsibilities.

Setup

N8N Workflow
1. Configure the Webhook: Set up a webhook to capture meeting completion events and integrate it with Fireflies.
2. Retrieve Meeting Content: Use GraphQL API requests to extract meeting details and transcripts, ensuring appropriate authentication through Bearer tokens (see the sketch below).
3. AI Processing Setup: Define system messages for AI tasks and configure connections to the AI chat model (e.g., OpenAI's GPT) to process transcripts.
4. Task Creation Logic: Create structured tasks based on AI output, ensuring necessary details are captured and records are created in Airtable.
5. Client Notifications: Use an email node to notify clients about their tasks, ensuring communications are client-specific.
6. Scheduling Follow-Up Calls: Set up Google Calendar events if follow-up meetings are required, populating details from the original meeting context.
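The "Retrieve Meeting Content" step is a GraphQL call to Fireflies; the sketch below shows a standalone version of it. The selected fields follow the commonly documented Fireflies schema, but verify them (and the transcript ID passed in by the webhook) against the current API reference.

```javascript
// Standalone Node 18+ sketch of the transcript fetch from Fireflies.
async function fetchTranscript(meetingId, firefliesApiKey) {
  const query = `
    query Transcript($id: String!) {
      transcript(id: $id) {
        title
        participants
        sentences { speaker_name text }
      }
    }`;

  const res = await fetch('https://api.fireflies.ai/graphql', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${firefliesApiKey}`,
    },
    body: JSON.stringify({ query, variables: { id: meetingId } }),
  });

  const { data } = await res.json();
  return data.transcript; // fed to the AI step that extracts action items
}
```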
by Yaron Been
🔍 Scrape Glassdoor with Bright Data
Designed for sales teams, recruiters, and marketers aiming to automate job discovery and prospecting. This workflow scrapes Glassdoor job listings using Bright Data and automatically generates targeted pitches using AI, streamlining lead identification and outreach.

🧩 How It Works
This automation leverages n8n, Bright Data, Google Sheets, and OpenAI:
1. Trigger: Starts with a custom form input (Location, Keyword, Country).
2. Bright Data Job Scrape:
   - Triggers a Bright Data dataset snapshot via HTTP Request (see the sketch at the end of this description).
   - Polls snapshot progress using a Wait node, ensuring data readiness.
   - Retrieves the full job-listings dataset once it is ready.
3. Google Sheets Integration:
   - Writes detailed job data (company, role, location, overview, metrics) into a Google Sheet.
   - Uses a pre-built template for organized data storage.
4. Automated Pitch Generation (AI):
   - Splits listings into actionable parts: company name, title, and description.
   - Sends the data to OpenAI (via LangChain) to generate relevant pitches or icebreakers.
   - Saves the generated content back into the same sheet for easy access.

✅ Requirements
Ensure you have the following:
- Google Sheets: a Google account and the template sheet with columns for job details and AI-generated pitches.
- Bright Data: an active account with Dataset API access, plus an API key and dataset ID.
- OpenAI: a valid OpenAI API key for GPT models.
- n8n Environment:
  - Nodes: HTTP Request, Wait, If, Google Sheets, Split Out, LangChain (OpenAI).
  - Credentials: Google Sheets OAuth2, Bright Data API credentials, OpenAI API key.

⚙️ Setup Instructions
Step 1: Prepare Google Sheets
- Copy the provided Google Sheets template.
- Do not change the headers.
Step 2: Import & Configure the Workflow in n8n
- Import the workflow JSON file.
- Set the Google Sheets node: link it to your copied sheet and confirm the correct tab name.
Step 3: Configure Bright Data
- Replace <YOUR_BRIGHT_DATA_API_KEY> with your real key.
- Set your dataset ID in all HTTP Request nodes.
Step 4: Configure OpenAI (LangChain)
- Connect your OpenAI API key to the LangChain node.
- Customize the prompt to match your tone and outreach style.
Step 5: Testing & Scheduling
- Test via the manual form trigger.
- Schedule runs, or leave the form enabled for on-demand use.

🧠 Tips & Best Practices
- Use specific keywords and locations for better results.
- Adjust polling intervals based on dataset size.
- Refine AI prompts regularly to improve pitch quality.
- Clean unused columns from your sheet to boost performance.

💬 Support & Feedback
For help or customization:
📧 Email: Yaron@nofluff.online
📺 YouTube: @YaronBeen
🔗 LinkedIn: linkedin.com/in/yaronbeen
📚 Bright Data Docs: docs.brightdata.com/introduction
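For orientation, the trigger/poll/retrieve sequence behind step 2 can be sketched as the standalone function below. Endpoint paths follow the Bright Data Datasets v3 API, but the parameter names, status values, and polling interval are assumptions to confirm against docs.brightdata.com before use.

```javascript
// Standalone Node 18+ sketch of the Bright Data calls the HTTP Request and
// Wait nodes perform: trigger a snapshot, poll progress, download results.
const BASE = 'https://api.brightdata.com/datasets/v3';
const headers = {
  Authorization: `Bearer ${process.env.BRIGHT_DATA_API_KEY}`,
  'Content-Type': 'application/json',
};

async function scrapeGlassdoor(datasetId, { location, keyword, country }) {
  // 1. Trigger a dataset snapshot for the form inputs.
  const { snapshot_id } = await fetch(`${BASE}/trigger?dataset_id=${datasetId}&format=json`, {
    method: 'POST',
    headers,
    body: JSON.stringify([{ location, keyword, country }]),
  }).then((r) => r.json());

  // 2. Poll until the snapshot is no longer running (the workflow uses a Wait node loop).
  let status = 'running';
  while (status === 'running') {
    await new Promise((resolve) => setTimeout(resolve, 30_000));
    ({ status } = await fetch(`${BASE}/progress/${snapshot_id}`, { headers }).then((r) => r.json()));
  }
  if (status !== 'ready') throw new Error(`Snapshot ended with status: ${status}`);

  // 3. Download the finished job listings.
  return fetch(`${BASE}/snapshot/${snapshot_id}?format=json`, { headers }).then((r) => r.json());
}
```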