For modern businesses, a high-quality, consistent reporting system is essential for maintaining client trust and building long-term loyalty. A clear, informative report lets you track sales trends, demonstrate campaign effectiveness, calculate return on investment (ROI), and much more.

However, for an agency managing dozens or even hundreds of sites, generating these periodic reports can become a major bottleneck, compromising the scalability of your operations.

That is why optimizing and automating data retrieval is crucial. It restores efficiency and frees up your team to focus on high-value activities, like developing new projects.

In this article, you will learn how to leverage the Kinsta API to automatically fetch your hosting data and generate strategic reports with the power of AI.

Ready to scale your reporting system? Read on.

Accessing Kinsta Analytics via MyKinsta and the Kinsta API

Kinsta customers have access to a wealth of data within the MyKinsta hosting dashboard. You can access your plan data in the Analytics section of your dashboard.

Overview of the MyKinsta Analytics section.

The Analytics page is divided into several tabs, each focusing on a specific aspect of your site’s activity:

  • Plan usage: Displays your plan’s resource consumption, both cumulatively and broken down by individual site.
  • Top requests: Lets you identify the main requests to your site, classified by bandwidth and views.
  • Cache: Provides a breakdown of cache usage, including Cache breakdown, Server cache components, and Server cache bypasses.
  • CDN & edge: Offers data on CDN bandwidth consumption, Edge cache bandwidth, and lists of the top files served from the CDN cache.
  • Dispersion: Shows the percentage of visits from desktop, tablet, and mobile.
  • Performance: Includes various performance metrics such as Average PHP + MySQL response time, PHP throughput, PHP thread limit, and more.
  • Response: Provides statistics on response codes, including a detailed breakdown of error codes.
  • Geo & IP: Displays lists of the Top Countries, Top Cities, and Top Client IPs from which requests to your site originate.

You can access these same analytics at the site level by navigating to Sites > sitename > Analytics.

Site analytics in MyKinsta.

Kinsta Analytics provides a staggering dataset; simply navigating your MyKinsta dashboard gives you a very clear picture of your site’s resource consumption, efficiency, and performance. You will know exactly where most requests come from and which ones consume the most resources.

Combined with our Kinsta APM tool, Kinsta Analytics lets you optimize the performance of your WordPress sites.

What you may not know is that Kinsta Analytics data is also accessible via the Kinsta API. This lets you programmatically retrieve data and build hosting metrics, which you can then use to generate automated reports to share with your clients.

Let’s explore the Kinsta API endpoints.

The Analytics endpoint of the Kinsta API

With the Analytics endpoint of the Kinsta API, you can access raw data on your site’s resource usage and health.

  • Visits usage, server bandwidth usage, and CDN bandwidth usage: These metrics track your resource usage relative to your hosting plan during the current billing period.
  • Visits: Provides the total number of visits to a given environment within a specified time frame.
  • Disk space: Provides the total disk space used by a given environment over a specified time frame.
  • Server bandwidth: Provides the bandwidth consumed by a given environment over a specified time frame.
  • CDN bandwidth: Provides the CDN bandwidth consumed by a given environment over a specified time frame.
  • Top Countries: Provides a list of the main countries from which requests to the site originate within a specified time frame.
  • Top Cities: Provides a list of the main cities from which requests to the site originated during a specified time frame.
  • Top Client IPs: Provides a list of the main client IP addresses from which requests to the site originated during a specified time frame.
  • Visit Dispersion: Provides data on the distribution of visits across desktop, tablet, and mobile devices over a specified time frame.
  • Response Code Breakdown: Provides a breakdown of the HTTP status codes returned by the server within a specified time frame.

Below are some examples of how to use the analytics endpoint.

Visits

The following request provides the total number of visits to your site and the number of unique IP addresses that have accessed it in the last 30 days:

https://api.kinsta.com/v2/sites/environments/{KINSTA_ENV_ID}/analytics/visits?time_span=30_days&company_id={KINSTA_COMPANY_ID}

The response will be structured as follows:

{
	"analytics": {
		"analytics_response": {
			"key": "uniqueip",
			"data": [
				{
					"name": "uniqueip",
					"total": 1000,
					"dataset": [
						{
							"key": "2025-10-28T00:00:00.000Z",
							"value": "1000"
						},
						{
							"key": "2025-10-29T00:00:00.000Z",
							"value": "900"
						},
						{
							"key": "2025-10-30T00:00:00.000Z",
							"value": "820"
						},
						...
					]
				}
			]
		}
	}
}
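If you prefer to script this rather than paste the URL into a browser, the request can be assembled in a few lines of Python. This is only a sketch: it builds the URL and the bearer header without sending anything, and the placeholder credential values are assumptions (in practice they come from environment variables, as in the automation later in this article):

```python
import os
from urllib.parse import urlencode

# Placeholder credentials, read from environment variables when available
KINSTA_API_KEY = os.getenv("KINSTA_API_KEY", "demo-key")
KINSTA_ENV_ID = os.getenv("KINSTA_ENV_ID", "demo-env")
KINSTA_COMPANY_ID = os.getenv("KINSTA_COMPANY_ID", "demo-company")

# Build the visits request URL with its query string
base = f"https://api.kinsta.com/v2/sites/environments/{KINSTA_ENV_ID}/analytics/visits"
query = urlencode({"time_span": "30_days", "company_id": KINSTA_COMPANY_ID})
request_url = f"{base}?{query}"

# The Kinsta API expects the key as a bearer token, e.g. with the requests library:
# requests.get(request_url, headers={"Authorization": f"Bearer {KINSTA_API_KEY}"})
print(request_url)
```

Sending a GET request to this URL with the Authorization header returns the JSON structure shown above.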

Bandwidth

The following example shows how to query the Kinsta API to retrieve the server’s bandwidth usage over the last 30 days:

https://api.kinsta.com/v2/sites/environments/{KINSTA_ENV_ID}/analytics/bandwidth?time_span=30_days&company_id={KINSTA_COMPANY_ID}

The response from the Kinsta server provides the daily bandwidth usage for the past 30 days:

{
	"analytics": {
		"analytics_response": {
			"key": "bandwidth",
			"data": [
				{
					"name": "bandwidth",
					"total": 1000,
					"dataset": [
						{
							"key": "2026-03-11T00:00:00.000Z",
							"value": "37347250"
						},
						{
							"key": "2026-03-12T00:00:00.000Z",
							"value": "9276458"
						},
						...
					]
				}
			]
		}
	}
}

CDN bandwidth

In this other example, we query the Kinsta API to find out the CDN bandwidth usage for the past 7 days:

https://api.kinsta.com/v2/sites/environments/{KINSTA_ENV_ID}/analytics/cdn-bandwidth?time_span=7_days&company_id={KINSTA_COMPANY_ID}

The server will provide the following data:

{
	"analytics": {
		"analytics_response": {
			"key": "cdn-bandwidth",
			"data": [
				{
					"name": "cdn-bandwidth",
					"total": 1000,
					"dataset": [
						{
							"key": "2026-04-02T00:00:00.000Z",
							"value": "753447"
						},
						{
							"key": "2026-04-03T00:00:00.000Z",
							"value": "16911"
						},
						...
					]
				}
			]
		}
	}
}

You can try it yourself by entering your Kinsta API key (bearer token), environment ID, and company ID in the API playground.

Test the Kinsta API in the API playground.

Now that you know how to access your site’s analytics data on Kinsta, you can use it to automate your operations. This also includes automating the reporting system.

The following sections will show you how to automate your agency’s reporting system using the Kinsta API. We will build a Python script and leverage GitHub Actions to automate the build and run. This will transform the raw data returned by the API into tables and charts, and query Google AI to generate a final report.

It’s time to get your hands dirty.

Build an automated reporting system using the Kinsta API and Google AI

Our goal is to create an automated report that is generated at specific intervals. The system will query the Kinsta API to retrieve data on visits, server bandwidth, and CDN bandwidth. This data will then be used to create charts and tables in a PDF file. As part of this process, the data will be sent to the Gemini API to produce an analysis of the extracted data, which will then be included in the report.

Preview of the automated report generated with GitHub Actions and the Kinsta API.

Setting up the project on GitHub

On the GitHub homepage, click the green New button to create a new project. Once you have an empty project, go to Settings > Secrets and variables > Actions and add the secrets shown in the following image.

GitHub Actions repository secrets.

Storing your API keys and IDs in GitHub Secrets keeps them inaccessible to anyone and helps ensure that your code remains secure.

GEMINI_API_KEY

You can generate a Google AI API key in the Google AI Studio dashboard. Please refer to the Google AI documentation for more information.

KINSTA_API_KEY

Next, follow the instructions in our article to generate a Kinsta API key.

KINSTA_COMPANY_ID, KINSTA_ENV_ID, KINSTA_SITE_ID

You can find the Site ID, Environment ID, and Company ID under Sites > sitename > Info in your MyKinsta dashboard.

Site Information in MyKinsta.

Now let’s move on to the project files.

Required libraries and GitHub Actions configuration

In the root directory of your GitHub project, create a file named requirements.txt and add the following:

google-genai
requests
matplotlib
fpdf2

This file lists the components required for your project.

  • google-genai: This is Google’s library for interacting with Gemini models.
  • requests: A library for making HTTP requests. In this project, it will be used to send HTTP requests to the Kinsta API.
  • matplotlib: A Python library for creating graphs and visualizing data.
  • fpdf2: This is a library that lets you generate PDF files.

Next, create a file named .github/workflows/generate_report.yml with the following code:

name: Generate Kinsta Analytics Report

on:
  push:
    branches: [main]
  workflow_dispatch:

jobs:
  build-and-run:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt

      - name: Run Report Script
        env:
          KINSTA_API_KEY: ${{ secrets.KINSTA_API_KEY }}
          KINSTA_ENV_ID: ${{ secrets.KINSTA_ENV_ID }}
          KINSTA_SITE_ID: ${{ secrets.KINSTA_SITE_ID }}
          KINSTA_COMPANY_ID: ${{ secrets.KINSTA_COMPANY_ID }}
          GEMINI_API_KEY: ${{ secrets.GEMINI_API_KEY }}
        run: python main.py

      - name: Upload Report
        uses: actions/upload-artifact@v4
        with:
          name: Kinsta-Advanced-Report
          path: "*.pdf"

GitHub uses this file to automatically run your code via GitHub Actions. Let’s take a closer look:

name: Generate Kinsta Analytics Report

on:
  push:
    branches: [main]
  workflow_dispatch:
  • name: The name of your workflow as it appears in the Actions tab on GitHub.
  • on: Determines when to trigger the workflow.
  • push: The workflow runs every time you push a code change to the main branch.
  • workflow_dispatch: Lets you run the workflow manually.
jobs:
  build-and-run:
    runs-on: ubuntu-latest
  • jobs: The start of the tasks to be performed.
  • build-and-run: An arbitrary name that identifies a specific sequence of actions.
  • runs-on: Specifies the system on which the workflow should run.
  • ubuntu-latest: Sets the latest version of Ubuntu Linux.
steps:
      - name: Checkout Repository
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'
  • steps: The sequence of operations to be performed.
  • name: The name of the operation to be performed.
  • uses: The pre-configured GitHub module (Action).
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
  • python -m pip install --upgrade pip: Updates pip (the Python package manager) to the latest available version.
  • pip install -r requirements.txt: Reads the requirements.txt file and installs the packages listed in it.
      - name: Run Report Script
        env:
          KINSTA_API_KEY: ${{ secrets.KINSTA_API_KEY }}
          KINSTA_ENV_ID: ${{ secrets.KINSTA_ENV_ID }}
          KINSTA_SITE_ID: ${{ secrets.KINSTA_SITE_ID }}
          KINSTA_COMPANY_ID: ${{ secrets.KINSTA_COMPANY_ID }}
          GEMINI_API_KEY: ${{ secrets.GEMINI_API_KEY }}
        run: python main.py
  • env: Retrieves environment variable values from GitHub Secrets.
  • run: python main.py: Launches the Python interpreter and runs the main.py file.
      - name: Upload Report
        uses: actions/upload-artifact@v4
        with:
          name: Kinsta-Advanced-Report
          path: "*.pdf"
  • uses: actions/upload-artifact@v4: Uses the GitHub action to manage artifacts, a file or folder generated while the script is running.
  • with: Sets the configuration parameters.

The configuration of your automation project is complete. Now it’s time to create the Python scripts.

Querying the Kinsta API programmatically

Once you have completed the setup, navigate to the root directory of your GitHub project and create a new file named kinsta_utils.py with the following code:

import requests
import os

KINSTA_API_KEY = os.getenv("KINSTA_API_KEY")
KINSTA_SITE_ID = os.getenv("KINSTA_SITE_ID")
KINSTA_ENV_ID = os.getenv("KINSTA_ENV_ID")
KINSTA_COMPANY_ID = os.getenv("KINSTA_COMPANY_ID")
BASE_URL = f"https://api.kinsta.com/v2/sites/environments/{KINSTA_ENV_ID}/analytics"

def get_headers():
    return {"Authorization": f"Bearer {KINSTA_API_KEY}"}
  • The first two import statements load the library for making HTTP requests (requests) and the module for interacting with the operating system (os).
  • The next four lines (os.getenv) retrieve your credentials from GitHub Secrets.
  • BASE_URL defines the main endpoint of the Kinsta API used by the script.
  • The get_headers function generates the Authorization header, which will include the Kinsta API key.

Next, create a helper function that converts the raw data returned by the API into megabytes.

def format_bytes_to_mb(bytes_value):
    """Converts raw bytes from the API to human-readable megabytes."""

    try:
        # Binary standard conversion to MB
        # return round(int(bytes_value) / (1024 * 1024), 2)

        # Decimal standard (used in the MyKinsta dashboard)
        return round(int(bytes_value) / 1_000_000, 2)

    except (ValueError, TypeError):
        return 0
  • This code provides two options. The first uses the binary standard (1024 x 1024), and the second uses the decimal standard. Dividing by 1_000_000 ensures that the number in your PDF report matches the number that your clients see in MyKinsta Analytics.
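As a quick sanity check, you can run the helper locally on one of the raw byte values from the sample bandwidth response shown earlier:

```python
def format_bytes_to_mb(bytes_value):
    """Converts raw bytes from the API to human-readable megabytes."""
    try:
        # Decimal standard (used in the MyKinsta dashboard)
        return round(int(bytes_value) / 1_000_000, 2)
    except (ValueError, TypeError):
        return 0

# The API returns values as strings; the helper handles the conversion
print(format_bytes_to_mb("37347250"))  # 37.35
print(format_bytes_to_mb(None))        # 0 (invalid input falls back safely)
```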

The following function queries the Kinsta API and returns a set of raw data:

def fetch_kinsta_metric(endpoint, start_date, end_date):

    url = f"{BASE_URL}/{endpoint}"

    params = {
        "company_id": KINSTA_COMPANY_ID,
        "from": f"{start_date}T00:00:00.000Z",
        "to": f"{end_date}T23:59:59.000Z"
    }

    try:
        response = requests.get(url, headers=get_headers(), params=params)
        if response.status_code == 200:
            data_node = response.json()['analytics']['analytics_response']['data'][0]
            total = data_node.get('total', 0)
            dataset = data_node.get('dataset', [])[:7]
            return total, dataset

    except Exception as e:
        print(f"Error fetching {endpoint}: {e}")

    return 0, []
  • The fetch_kinsta_metric function takes three arguments: endpoint, start_date, and end_date. These are used to construct the request URL. The endpoint can be visits, bandwidth, or cdn-bandwidth.
  • The params dictionary stores the request parameters.
  • Kinsta’s response is a nested JSON object (data_node) that provides the aggregated value for the period (total) and a list of daily values (dataset).
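You can exercise this parsing logic without calling the API by feeding it a stub payload shaped like the sample responses shown earlier (the total and dates below are illustrative):

```python
# A stub payload mirroring the structure of Kinsta's analytics responses
sample_response = {
    "analytics": {
        "analytics_response": {
            "key": "bandwidth",
            "data": [{
                "name": "bandwidth",
                "total": 46623708,
                "dataset": [
                    {"key": "2026-03-11T00:00:00.000Z", "value": "37347250"},
                    {"key": "2026-03-12T00:00:00.000Z", "value": "9276458"},
                ],
            }],
        }
    }
}

# Same extraction path used by fetch_kinsta_metric
data_node = sample_response["analytics"]["analytics_response"]["data"][0]
total = data_node.get("total", 0)
dataset = data_node.get("dataset", [])[:7]  # keep at most 7 daily points
print(total, len(dataset))  # 46623708 2
```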

The final function in the kinsta_utils.py file retrieves the site name.

def fetch_site_name():
    url = f"https://api.kinsta.com/v2/sites/{KINSTA_SITE_ID}"
    
    try:
        response = requests.get(url, headers=get_headers())
        if response.status_code == 200:
            data = response.json()
            site_data = data.get('site', {})
            
            site_label = site_data.get('display_name', 'Unknown Site')
            
            env_label = "Unknown Env"
            envs = site_data.get('environments', [])
            for env in envs:
                if env.get('id') == KINSTA_ENV_ID:
                    env_label = env.get('display_name')
                    break
            
            return f"{site_label} ({env_label})"
        else:
            print(f"Kinsta API Error: {response.status_code} - {response.text}")
    except Exception as e:
        print(f"Error fetching site name: {e}")
        
    return "Unknown Site"

This code should be self-explanatory. Please refer to the API Reference for details on the sites endpoint.

Now all that’s left is to set up the workflow.

Automate the workflow with Python and Gemini

The last file you need to create is the engine of your application. Still in the root directory of your GitHub project, create a file main.py. To start, add the following code:

import os
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt
from google.genai import Client
from fpdf import FPDF, XPos, YPos
from datetime import datetime, timedelta
from kinsta_utils import fetch_kinsta_metric, format_bytes_to_mb, fetch_site_name

REPORT_LANG = "en" 
MODEL_ID = "gemini-2.5-flash" 
GEMINI_API_KEY = os.getenv("GEMINI_API_KEY")
client = Client(api_key=GEMINI_API_KEY)

today = datetime.now()
curr_end_dt = today - timedelta(days=1)
curr_start_dt = today - timedelta(days=7)
prev_end_dt = today - timedelta(days=8)
prev_start_dt = today - timedelta(days=14)

CURR_RANGE = f"{curr_start_dt.strftime('%b %d')} - {curr_end_dt.strftime('%b %d')}"
PREV_RANGE = f"{prev_start_dt.strftime('%b %d')} - {prev_end_dt.strftime('%b %d')}"

DATES = [
    prev_start_dt.strftime("%Y-%m-%d"), 
    prev_end_dt.strftime("%Y-%m-%d"), 
    curr_start_dt.strftime("%Y-%m-%d"), 
    curr_end_dt.strftime("%Y-%m-%d")
]

CURR_DAYS_LABELS = [(curr_start_dt + timedelta(days=i)).strftime("%d %a") for i in range(7)]
PREV_DAYS_LABELS = [(prev_start_dt + timedelta(days=i)).strftime("%d %a") for i in range(7)]
X_AXIS_LABELS = [(curr_start_dt + timedelta(days=i)).strftime("%d") for i in range(7)]

This is how the script is set up:

  • The import statements load the necessary libraries, and matplotlib.use('Agg') instructs Python to generate the plots and keep them in memory.
  • The next block sets the language (en) and the model (gemini-2.5-flash), then initializes the Google client.
  • Next, it defines time windows to compare values from the last seven days with those from the previous seven days.
  • Finally, it sets the labels for tables and graphs.
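To see what these windows look like, you can pin today to a fixed date (an assumption made purely for reproducibility) and print the resulting ranges:

```python
from datetime import datetime, timedelta

# Pin "today" to a fixed date so the windows are reproducible
today = datetime(2026, 4, 10)
curr_end_dt = today - timedelta(days=1)    # yesterday
curr_start_dt = today - timedelta(days=7)  # seven days ago
prev_end_dt = today - timedelta(days=8)
prev_start_dt = today - timedelta(days=14)

CURR_RANGE = f"{curr_start_dt.strftime('%b %d')} - {curr_end_dt.strftime('%b %d')}"
PREV_RANGE = f"{prev_start_dt.strftime('%b %d')} - {prev_end_dt.strftime('%b %d')}"
print(CURR_RANGE)  # Apr 03 - Apr 09
print(PREV_RANGE)  # Mar 27 - Apr 02
```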

The next step is to define a KinstaReport class for generating report pages using the FPDF library:

class KinstaReport(FPDF):
    def __init__(self, site_name="Unknown Site"):
        super().__init__()
        self.site_name = site_name

    def header(self):
        self.set_font("Helvetica", "B", 8)
        self.set_text_color(150)
        # Site name
        self.cell(100, 10, f"Site: {self.site_name}", align="L")
        # Date generated
        self.cell(0, 10, f"Kinsta Analytics Report | Generated: {datetime.now().strftime('%Y-%m-%d')}", 
                  align="R", new_x=XPos.LMARGIN, new_y=YPos.NEXT)

    def add_metric_page(self, title, chart_path, prev_vals, curr_vals, unit=""):
        self.add_page()
        # Page title
        self.set_font("Helvetica", "B", 24)
        self.set_text_color(83, 51, 237)
        self.cell(0, 15, title, align="C", new_x=XPos.LMARGIN, new_y=YPos.NEXT)
        
        # Subtitle
        self.set_font("Helvetica", "I", 10)
        self.set_text_color(120)
        self.cell(0, 5, f"Comparison: {CURR_RANGE} vs {PREV_RANGE}", align="C", new_x=XPos.LMARGIN, new_y=YPos.NEXT)
        
        self.image(chart_path, x=10, y=42, w=190)
        
        # Data tables
        self.set_y(150)
        self.set_font("Helvetica", "B", 10)
        self.set_fill_color(245, 245, 255)
        self.set_text_color(83, 51, 237)
        
        # Table header
        col1, col2 = 35, 60
        self.cell(col1, 10, " Day (Prev)", border=1, align='C', fill=True, new_x=XPos.RIGHT, new_y=YPos.TOP)
        self.cell(col2, 10, f"Value {unit}", border=1, align='C', fill=True, new_x=XPos.RIGHT, new_y=YPos.TOP)
        self.cell(col1, 10, " Day (Curr)", border=1, align='C', fill=True, new_x=XPos.RIGHT, new_y=YPos.TOP)
        self.cell(col2, 10, f"Value {unit}", border=1, align='C', fill=True, new_x=XPos.LMARGIN, new_y=YPos.NEXT)
        
        self.set_font("Helvetica", "", 10)
        self.set_text_color(50)
        for i in range(7):
            # Zebra striping
            fill = (i % 2 == 0)
            if fill: self.set_fill_color(250, 250, 250)
            else: self.set_fill_color(255, 255, 255)
            
            self.cell(col1, 9, f" {PREV_DAYS_LABELS[i]}", border=1, align='C', fill=fill, new_x=XPos.RIGHT, new_y=YPos.TOP)
            self.cell(col2, 9, f" {prev_vals[i]}", border=1, align='C', fill=fill, new_x=XPos.RIGHT, new_y=YPos.TOP)
            self.cell(col1, 9, f" {CURR_DAYS_LABELS[i]}", border=1, align='C', fill=fill, new_x=XPos.RIGHT, new_y=YPos.TOP)
            self.cell(col2, 9, f" {curr_vals[i]}", border=1, align='C', fill=fill, new_x=XPos.LMARGIN, new_y=YPos.NEXT)

We won’t go into too much detail about this code. For more information on the FPDF library, please refer to the online resources.

Next, define a function generate_chart. This function converts the raw data received from Kinsta into charts.

def generate_chart(labels, curr, prev, title, ylabel, filename, is_bar=False):
    plt.figure(figsize=(10, 5), dpi=100)
    ax = plt.gca()
    
    ax.spines['top'].set_visible(False)
    ax.spines['right'].set_visible(False)
    ax.spines['left'].set_color('#dddddd')
    ax.spines['bottom'].set_color('#dddddd')

    if is_bar:
        # Bar chart for bandwidth
        bars = plt.bar(labels, curr, color='#00c4b4', alpha=0.6, label='Current Period', width=0.6)
        # Add labels above the bars
        for bar in bars:
            height = bar.get_height()
            plt.text(bar.get_x() + bar.get_width()/2., height + 0.02, f'{height}', ha='center', va='bottom', fontsize=8, color='#00a194')
    else:
        # Line chart for visits
        plt.plot(labels, curr, color='#5333ed', marker='o', markersize=6, linewidth=3, label='Current', zorder=3)
        plt.plot(labels, prev, color='#a1a1a1', linestyle='--', marker='x', markersize=5, linewidth=1.5, label='Previous', alpha=0.6)
        
        plt.fill_between(labels, curr, color='#5333ed', alpha=0.1)
    
    plt.title(title, fontsize=14, pad=20, color='#333333', fontweight='bold')
    plt.ylabel(ylabel, color='#666666')
    plt.xlabel("Day of Month", color='#666666')
    plt.legend(frameon=False, loc='upper right')
    plt.grid(axis='y', linestyle='--', alpha=0.3)
    plt.tight_layout()
    plt.savefig(filename)
    plt.close()

This function uses the Matplotlib library to convert the data extracted from Kinsta into charts for inclusion in the PDF report. For more information on using the Matplotlib library, please refer to the online documentation.
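If you want to check the charting step in isolation, a stripped-down sketch of the same idea (Agg backend, two seven-day series, PNG output) looks like this; the data values are made up:

```python
import os
import matplotlib
matplotlib.use('Agg')  # render in memory, no display needed
import matplotlib.pyplot as plt

# Minimal stand-in data: seven daily values for two periods
labels = [str(d) for d in range(1, 8)]
curr = [120, 90, 150, 130, 170, 160, 140]
prev = [100, 110, 95, 120, 105, 130, 125]

plt.figure(figsize=(10, 5), dpi=100)
plt.plot(labels, curr, marker='o', label='Current')
plt.plot(labels, prev, linestyle='--', marker='x', label='Previous')
plt.legend(frameon=False)
plt.tight_layout()
plt.savefig("demo_chart.png")  # write the chart to disk for the PDF step
plt.close()

print(os.path.exists("demo_chart.png"))  # True
```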

Finally, add the function that combines all the parts we’ve described so far.

def main():
    site_display_name = fetch_site_name()

    metrics = {
        "visits": {"title": "Site Visits", "unit": ""},
        "bandwidth": {"title": "Server Bandwidth", "unit": "(MB)"},
        "cdn-bandwidth": {"title": "CDN Bandwidth", "unit": "(MB)"}
    }
    
    report_data = {}
    for key in metrics:
        _, data_curr = fetch_kinsta_metric(key, DATES[2], DATES[3])
        _, data_prev = fetch_kinsta_metric(key, DATES[0], DATES[1])
        
        curr_vals = []
        prev_vals = []
        for i in range(7):
            c = float(data_curr[i]['value']) if i < len(data_curr) else 0
            p = float(data_prev[i]['value']) if i < len(data_prev) else 0
            
            if "bandwidth" in key:
                curr_vals.append(format_bytes_to_mb(c))
                prev_vals.append(format_bytes_to_mb(p))
            else:
                curr_vals.append(int(c))
                prev_vals.append(int(p))
                
        report_data[key] = {"curr": curr_vals, "prev": prev_vals}

    pdf = KinstaReport(site_name=site_display_name)
    
    for key, data in metrics.items():
        chart_file = f"{key}_chart.png"
        generate_chart(X_AXIS_LABELS, report_data[key]["curr"], report_data[key]["prev"], 
                       f"{data['title']} Trends", "Units", chart_file, is_bar=("bandwidth" in key))
        pdf.add_metric_page(data["title"], chart_file, report_data[key]["prev"], report_data[key]["curr"], data["unit"])

    # Executive Summary
    pdf.add_page()
    pdf.set_font("Helvetica", "B", 20)
    pdf.set_text_color(83, 51, 237)
    pdf.cell(0, 15, "Executive Summary", align="C", new_x=XPos.LMARGIN, new_y=YPos.NEXT)
    
    curr_visits = sum(report_data['visits']['curr'])
    prev_visits = sum(report_data['visits']['prev'])
    curr_bw = sum(report_data['bandwidth']['curr'])
    prev_bw = sum(report_data['bandwidth']['prev'])

    try:
        summary_prompt = (
            f"Analyze Kinsta performance for site {site_display_name}. "
            f"Current Period ({CURR_RANGE}): {curr_visits} visits, {curr_bw:.2f}MB server bandwidth. "
            f"Previous Period ({PREV_RANGE}): {prev_visits} visits, {prev_bw:.2f}MB server bandwidth. "
            f"Compare these periods and identify trends. Language: {REPORT_LANG}. Max 4 sentences."
        )
        response = client.models.generate_content(model=MODEL_ID, contents=summary_prompt)
        summary = response.text
    except Exception as e:
        summary = f"Analytical insights unavailable. Error: {str(e)}"

    pdf.set_y(40)
    pdf.set_font("Helvetica", "", 12)
    pdf.set_text_color(0)
    pdf.multi_cell(0, 8, summary)
    
    report_filename = f"Kinsta_Report_{datetime.now().strftime('%Y-%m-%d')}.pdf"
    pdf.output(report_filename)
    print(f"Report generated: {report_filename}")

if __name__ == "__main__":
    main()

Here’s what this code does:

  • The for loop iterates through the metrics dictionary and queries the Kinsta API twice: once for the current week and once for the previous week.
  • If the data relates to bandwidth, the format_bytes_to_mb() function converts the raw data into MB.
  • The report_data dictionary stores the retrieved data.
  • The KinstaReport class then creates the PDF for the site.
  • The next for loop generates PNG images for the charts and creates a new page for each metric.
  • The next section generates the executive summary, calculates the total number of visits and total megabytes for the period, and sends a dynamic prompt to Gemini 2.5 Flash. Finally, the response is used to complete the last page of the PDF.
  • The script saves the report with a file name that includes the current date.
  • The final condition ensures the process runs only when the script is executed as the main program.

It’s time to build and run your application.

Retrieving the Artifact

You can now run your application. On your GitHub project page, click the Actions tab. Look for the name of your action in the menu on the left (in our example, this is Generate Kinsta Analytics Report, as specified in your generate_report.yml file).

The Actions tab shows a list of workflows.

Next, click the Run workflow menu on the right, then click the green Run workflow button (only the main branch is currently available).

Run GitHub workflow.

The next page shows the current workflow. Click on it to view the list of ongoing operations.

Use workflow from branch.

The Run Report Script section provides a list of the operations performed, while the Upload Report section provides the artifact download URL. Click this link to download your report in PDF format.

The Upload Report section provides the link to the PDF report.

You’ll find the same link in the Artifacts section, at the bottom of the workflow’s Summary page.

Run Report Script and Upload Report actions.

The images below show the complete report, including the Executive Summary generated by Google AI.

Site Visits and Server Bandwidth report pages.
CDN Bandwidth and Executive Summary report pages.

Next steps: How to improve scalability and automate delivery

This is just a taste of what the Kinsta API can do when combined with advanced automation tools like GitHub Actions. AI integration takes it further, transforming raw numbers into deep-dive reports ready to be shared with your clients.

You can further improve your reporting in several ways:

  • You can configure your application by adding a line to the YAML file (schedule: '0 9 * * 1') to generate the report every Monday morning at 9:00 AM.
  • You could integrate a library like smtplib or a service like SendGrid to send the report directly to your client.
  • If you are an agency with dozens or even hundreds of sites, you could implement a loop that iterates over a list of site IDs to generate all your reports in a single run.
  • You can further enrich your report’s content by using the Kinsta API to retrieve geographic data, HTTP code breakdowns, server logs, and any other data you want to include. By analyzing this data, the AI can identify attack attempts (4xx codes) or traffic spikes from unexpected regions.
  • You can fine-tune your prompt to get more detailed and comprehensive AI responses.
  • You can customize the PDF template with your agency’s and your client’s logos.
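For the multi-site scenario, the loop could be sketched as follows. The SITES list and the report_filename helper are hypothetical, introduced only for illustration; in practice the IDs could come from the Kinsta /sites endpoint or a configuration file:

```python
# Hypothetical list of (site_id, env_id) pairs managed by the agency
SITES = [
    ("site-id-1", "env-id-1"),
    ("site-id-2", "env-id-2"),
]

def report_filename(site_id, date_str):
    """Builds a distinct PDF name per site, mirroring the naming used in main.py."""
    return f"Kinsta_Report_{site_id}_{date_str}.pdf"

for site_id, env_id in SITES:
    # Here you would point the Kinsta API helpers at env_id, run the same
    # fetch/chart/PDF pipeline described above, then save each report:
    print(report_filename(site_id, "2026-04-10"))
```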

Automated reporting reduces your team’s workload, and the consistency and precision it provides strengthen your clients’ trust and loyalty.

Want to start automating your client reporting right away? Subscribe to the plan that best fits your needs and start building with the Kinsta API today.

The post Automate client reporting with Kinsta API and Google AI appeared first on Kinsta®.
