If you manage many WordPress websites, you're most likely always looking for ways to simplify and speed up your workflows.

Now, imagine this: with a single command in your terminal, you can trigger manual backups for all your sites, even if you're managing dozens of them. That's the power of combining shell scripts with the Kinsta API.

This guide teaches you how to use shell scripts to set up custom commands that make managing your sites more efficient.

Requirements

Before we start, here's what you need:

  1. A terminal: All modern operating systems come with terminal software, so you can start scripting right out of the box.
  2. An IDE or text editor: Use a tool you're comfortable with, whether it's VS Code, Sublime Text, or even a lightweight editor like Nano for quick terminal edits.
  3. A Kinsta API key: This is essential for interacting with the Kinsta API. To generate yours:
    • Log in to your MyKinsta dashboard.
    • Go to Your Name > Company Settings > API Keys.
    • Click Create API Key and save it securely.
  4. curl and jq: Essential for making API requests and handling JSON data. Verify they're installed, or install them (see the snippet after this list).
  5. Basic programming familiarity: You don't need to be an expert, but understanding programming fundamentals and shell scripting syntax will be helpful.
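On Debian or Ubuntu, for example, you could check for and install both tools like this (package names and the install command differ on other systems):

# Install curl and jq if they are missing (Debian/Ubuntu)
command -v curl >/dev/null || sudo apt-get install -y curl
command -v jq >/dev/null || sudo apt-get install -y jq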

Writing your first script

Creating your first shell script to interact with the Kinsta API is simpler than you might think. Let's start with a simple script that lists all the WordPress sites managed under your Kinsta account.

Step 1: Set up your environment

Begin by creating a folder for your project and a new script file. The .sh extension is used for shell scripts. For example, you can create a folder, navigate to it, and create and open a script file in VS Code using these commands:

mkdir my-first-shell-scripts
cd my-first-shell-scripts
touch script.sh
code script.sh

Step 2: Define your environment variables

To keep your API key secure, store it in a .env file instead of hardcoding it into the script. This allows you to add the .env file to .gitignore, preventing it from being pushed to version control.
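If the project doesn't have one yet, a minimal .gitignore entry is enough to keep the key out of your repository:

# .gitignore
.env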

In your .env file, add:

API_KEY=your_kinsta_api_key

Next, pull the API key from the .env file into your script by adding the following to the top of your script:

#!/bin/bash
source .env

The #!/bin/bash shebang ensures the script runs using Bash, while source .env imports the environment variables.
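As a small safeguard (not part of the original script), you can also verify that the key actually loaded before making any requests:

if [ -z "$API_KEY" ]; then
  echo "Error: API_KEY is not set. Check your .env file."
  exit 1
fi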

Step 3: Write the API request

First, store your company ID (found in MyKinsta under Company Settings > Billing Details) in a variable:

COMPANY_ID=""

Next, add the curl command to make a GET request to the /sites endpoint, passing the company ID as a query parameter. Use jq to format the output for readability:

curl -s -X GET \
  "https://api.kinsta.com/v2/sites?company=$COMPANY_ID" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" | jq

This request retrieves details about all sites associated with your company, including their IDs, names, and statuses.
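If you only need the site names, you can pipe the same response through a narrower jq filter, for example:

curl -s "https://api.kinsta.com/v2/sites?company=$COMPANY_ID" \
  -H "Authorization: Bearer $API_KEY" | jq -r '.company.sites[].name'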

Step 4: Make the script executable

Save the script and make it executable by running:

chmod +x script.sh

Step 5: Run the script

Execute the script to see a formatted list of your sites:

./script.sh

When you run the script, you'll get a response similar to this:

{
  "company": {
    "sites": [
      {
        "id": "a8f39e7e-d9cf-4bb4-9006-ddeda7d8b3af",
        "name": "bitbuckettest",
        "display_name": "bitbucket-test",
        "status": "live",
        "site_labels": []
      },
      {
        "id": "277b92f8-4014-45f7-a4d6-caba8f9f153f",
        "name": "duketest",
        "display_name": "zivas Signature",
        "status": "live",
        "site_labels": []
      }
    ]
  }
}

While this works, let's improve it by setting up a function to fetch and format the site details for easier readability.

Step 6: Refactor with a function

Replace the curl request with a reusable function that handles fetching and formatting the site list:

list_sites() {
  echo "Fetching all sites for company ID: $COMPANY_ID..."
  RESPONSE=$(curl -s -X GET "https://api.kinsta.com/v2/sites?company=$COMPANY_ID" \
    -H "Authorization: Bearer $API_KEY" \
    -H "Content-Type: application/json")

  echo "Company Sites:"
  echo "--------------"
  echo "$RESPONSE" | jq -r '.company.sites[] | "\(.display_name) (\(.name)) - Status: \(.status)"'
}

# Run the function
list_sites

When you execute the script again, you'll get nicely formatted output:

Fetching all sites for company ID: b383b4c-****-****-a47f-83999c5d2...
Company Sites:
--------------
bitbucket-test (bitbuckettest) - Status: live
zivas Signature (duketest) - Status: live

With this script, you've taken your first step toward using shell scripts and the Kinsta API to automate WordPress site management. In the next sections, we explore creating more complex scripts that interact with the API in powerful ways.

Advanced use case 1: Creating backups

Creating backups is a crucial aspect of website management. They allow you to restore your site in case of unforeseen issues. With the Kinsta API and shell scripts, this process can be automated, saving time and effort.

In this section, we create backups while respecting Kinsta's limit of five manual backups per environment. To handle this, we'll implement a process to:

  • Check the current number of manual backups.
  • Identify and delete the oldest backup (with user confirmation) if the limit is reached.
  • Proceed to create a new backup.

Let's get into the details.

The backup workflow

To create backups using the Kinsta API, you use the following endpoint:

POST /sites/environments/{env_id}/manual-backups

This requires:

  1. Environment ID: Identifies the environment (like staging or production) where the backup will be created.
  2. Backup tag: A label to identify the backup (optional).
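For reference, a standalone request to this endpoint looks roughly like this (the environment ID is a placeholder you would normally have to look up first):

curl -s -X POST "https://api.kinsta.com/v2/sites/environments/<env_id>/manual-backups" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"tag": "pre-deploy"}'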

Manually retrieving the environment ID and running a command like this for every backup can be cumbersome. Instead, we'll build a user-friendly script where you simply specify the site name, and the script will:

  1. Fetch the list of environments for the site.
  2. Prompt you to choose the environment to back up.
  3. Handle the backup creation process.

Reusable functions for clean code

To keep our script modular and reusable, we'll define functions for specific tasks. Let's go through the setup step by step.

1. Set up base variables

You can get rid of the first script you created or create a new script file for this. Start by declaring the base Kinsta API URL and your company ID in the script:

BASE_URL="https://api.kinsta.com/v2"
COMPANY_ID=""

These variables let you construct API endpoints dynamically throughout the script.

2. Fetch all sites

Define a function to fetch the list of all company sites. This allows you to retrieve details about each site later.

get_sites_list() {
  API_URL="$BASE_URL/sites?company=$COMPANY_ID"

  echo "Fetching all sites for company ID: $COMPANY_ID..."
  
  RESPONSE=$(curl -s -X GET "$API_URL" \
    -H "Authorization: Bearer $API_KEY" \
    -H "Content-Type: application/json")

  # Check for errors
  if [ -z "$RESPONSE" ]; then
    echo "Error: No response from the API."
    exit 1
  fi

  echo "$RESPONSE"
}

You'll notice this function returns an unformatted response from the API. To get a formatted response, you can add another function to handle that (though that isn't our concern in this section):

list_sites() {
  RESPONSE=$(get_sites_list)

  if [ -z "$RESPONSE" ]; then
    echo "Error: No response from the API while fetching sites."
    exit 1
  fi

  echo "Company Sites:"
  echo "--------------"
  # Clean the RESPONSE before passing it to jq
  CLEAN_RESPONSE=$(echo "$RESPONSE" | tr -d '\r' | sed 's/^[^{]*//')
  echo "$CLEAN_RESPONSE" | jq -r '.company.sites[] | "\(.display_name) (\(.name)) - Status: \(.status)"'
}

Calling the list_sites function displays your sites as shown earlier. The main goal, however, is to access each site and its ID, allowing you to retrieve detailed information about each one.

3. Fetch site details

To fetch details about a specific site, use the following function, which looks up the site ID based on the site name and then fetches additional details, like environments:

get_site_details_by_name() {
  SITE_NAME=$1
  if [ -z "$SITE_NAME" ]; then
    echo "Error: No site name provided. Usage: $0 details-name <site-name>"
    return 1
  fi

  RESPONSE=$(get_sites_list)

  echo "Searching for site with name: $SITE_NAME..."

  # Clean the RESPONSE before parsing, then find the site ID by name
  CLEAN_RESPONSE=$(echo "$RESPONSE" | tr -d '\r' | sed 's/^[^{]*//')
  SITE_ID=$(echo "$CLEAN_RESPONSE" | jq -r --arg SITE_NAME "$SITE_NAME" '.company.sites[] | select(.name == $SITE_NAME) | .id')

  if [ -z "$SITE_ID" ]; then
    echo "Error: Site \"$SITE_NAME\" not found."
    return 1
  fi

  # Fetch full site details, including environments
  curl -s -X GET "$BASE_URL/sites/$SITE_ID" \
    -H "Authorization: Bearer $API_KEY" \
    -H "Content-Type: application/json"
}

The function above filters the sites by name and then retrieves further details about the matching site using the /sites/{site_id} endpoint. Those details include the site's environments, which is what we need to trigger backups.
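To sanity-check the function, you could temporarily call it at the bottom of the script and inspect just the environments; this one-liner is merely an example and assumes one of your own site names:

# Temporary test call (remove once verified)
get_site_details_by_name "example-site" | tr -d '\r' | sed 's/^[^{]*//' | jq '.site.environments[] | {id, name}'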

Creating backups

Now that you've set up reusable functions to fetch site details and list environments, you can focus on automating the process of creating backups. The goal is to run a simple command with just the site name and then interactively select the environment to back up.

Start by creating a function (we're naming it trigger_manual_backup). Inside the function, define two variables: the first to accept the site name as input and the second to set a default tag (default-backup) for the backup. This default tag is applied unless you choose to specify a custom tag later.

trigger_manual_backup() {
  SITE_NAME=$1
  DEFAULT_TAG="default-backup"

  # Ensure a site name is provided
  if [ -z "$SITE_NAME" ]; then
    echo "Error: Site name is required."
    echo "Usage: $0 trigger-backup <site-name>"
    return 1
  fi

  # Add the code here

}

This SITE_NAME is the identifier for the site you want to manage. You also set up a condition so the script exits with an error message if the identifier isn't provided. This ensures the script doesn't proceed without the necessary input, preventing potential API errors.

Next, use the reusable get_site_details_by_name function to fetch detailed information about the site, including its environments. The response is then cleaned to remove any unexpected formatting issues that could arise during processing.

SITE_RESPONSE=$(get_site_details_by_name "$SITE_NAME")

if [ $? -ne 0 ]; then
  echo "Error: Failed to fetch site details for site \"$SITE_NAME\"."
  return 1
fi

CLEAN_RESPONSE=$(echo "$SITE_RESPONSE" | tr -d '\r' | sed 's/^[^{]*//')

Once we have the site details, the script below extracts all available environments and displays them in a readable format. This helps you visualize which environments are linked to the site.

The script then prompts you to select an environment by its name. This interactive step keeps the process user-friendly by removing the need to remember or type environment IDs.

ENVIRONMENTS=$(echo "$CLEAN_RESPONSE" | jq -r '.site.environments[] | "\(.name): \(.id)"')

echo "Available Environments for \"$SITE_NAME\":"
echo "$ENVIRONMENTS"

read -p "Enter the environment name to back up (e.g., staging, live): " ENV_NAME

The selected environment name is then used to look up its corresponding environment ID in the site details. This ID is required for the API request that creates a backup.

ENV_ID=$(echo "$CLEAN_RESPONSE" | jq -r --arg ENV_NAME "$ENV_NAME" '.site.environments[] | select(.name == $ENV_NAME) | .id')

if [ -z "$ENV_ID" ]; then
  echo "Error: Environment \"$ENV_NAME\" not found for site \"$SITE_NAME\"."
  return 1
fi

echo "Found environment ID: $ENV_ID for environment name: $ENV_NAME"

In the code above, a condition ensures the script exits with an error message if the provided environment name doesn't match any environment.

Now that you have the environment ID, you can check the current number of manual backups for the selected environment. Kinsta's limit of five manual backups per environment makes this step crucial to avoid errors.

Let's start by fetching the list of backups using the /backups API endpoint.

API_URL="$BASE_URL/websites/environments/$ENV_ID/backups"
BACKUPS_RESPONSE=$(curl -s -X GET "$API_URL" 
  -H "Authorization: Bearer $API_KEY" 
  -H "Content material-Sort: utility/json")

CLEAN_RESPONSE=$(echo "$BACKUPS_RESPONSE" | tr -d 'r' | sed 's/^[^{]*//')
MANUAL_BACKUPS=$(echo "$CLEAN_RESPONSE" | jq '[.environment.backups[] | choose(.kind == "guide")]')
BACKUP_COUNT=$(echo "$MANUAL_BACKUPS" | jq 'duration')

The script above filters for manual backups and counts them. If the count has reached the limit, we need to manage the existing backups:

  if [ "$BACKUP_COUNT" -ge 5 ]; then
    echo "Guide backup prohibit reached (5 backups)."
    
    # To find the oldest backup
    OLDEST_BACKUP=$(echo "$MANUAL_BACKUPS" | jq -r 'sort_by(.created_at) | .[0]')
    OLDEST_BACKUP_NAME=$(echo "$OLDEST_BACKUP" | jq -r '.observe')
    OLDEST_BACKUP_ID=$(echo "$OLDEST_BACKUP" | jq -r '.identification')

    echo "The oldest guide backup is "$OLDEST_BACKUP_NAME"."
    learn -p "Do you need to delete this backup to create a brand new one? (sure/no): " CONFIRM

    if [ "$CONFIRM" != "yes" ]; then
      echo "Aborting backup introduction."
      go back 1
    fi

    # Delete the oldest backup
    DELETE_URL="$BASE_URL/websites/environments/backups/$OLDEST_BACKUP_ID"
    DELETE_RESPONSE=$(curl -s -X DELETE "$DELETE_URL" 
      -H "Authorization: Bearer $API_KEY" 
      -H "Content material-Sort: utility/json")

    echo "Delete Reaction:"
    echo "$DELETE_RESPONSE" | jq -r '[
      "Operation ID: (.operation_id)",
      "Message: (.message)",
      "Status: (.status)"
    ] | sign up for("n")'
  fi

The condition above identifies the oldest backup by sorting the list on the created_at timestamp. It then prompts you to confirm whether you'd like to delete it.

If you agree, the script deletes the oldest backup using its ID, freeing up space for the new one. This ensures backups can always be created without manually managing the limit.

Now that there's space, let's proceed with the code that triggers the backup for the environment. Feel free to skip this part, but for a better experience, it prompts you to specify a custom tag, defaulting to "default-backup" if none is provided.

learn -p "Input a backup tag (or press Input to make use of "$DEFAULT_TAG"): " BACKUP_TAG

if [ -z "$BACKUP_TAG" ]; then
  BACKUP_TAG="$DEFAULT_TAG"
fi

echo "The usage of backup tag: $BACKUP_TAG"

Finally, the script below is where the backup action happens. It sends a POST request to the /manual-backups endpoint with the selected environment ID and backup tag. If the request is successful, the API returns a response confirming the backup creation.

API_URL="$BASE_URL/websites/environments/$ENV_ID/manual-backups"
RESPONSE=$(curl -s -X POST "$API_URL" 
  -H "Authorization: Bearer $API_KEY" 
  -H "Content material-Sort: utility/json" 
  -d "{"tag": "$BACKUP_TAG"}")

if [ -z "$RESPONSE" ]; then
  echo "Error: No reaction from the API whilst triggering the guide backup."
  go back 1
fi

echo "Backup Cause Reaction:"
echo "$RESPONSE" | jq -r '[
  "Operation ID: (.operation_id)",
  "Message: (.message)",
  "Status: (.status)"
] | sign up for("n")'

That's it! The response from the request above is formatted to display the operation ID, message, and status for clarity. If you call the function and run the script, you'll see output similar to this:

Available Environments for "example-site":
staging: 12345
live: 67890
Enter the environment name to back up (e.g., staging, live): live
Found environment ID: 67890 for environment name: live
Manual backup limit reached (5 backups).
The oldest manual backup is "staging-backup-2023-12-31".
Do you want to delete this backup to create a new one? (yes/no): yes
Oldest backup deleted.
Enter a backup tag (or press Enter to use "default-backup"): weekly-live-backup
Using backup tag: weekly-live-backup
Triggering manual backup for environment ID: 67890 with tag: weekly-live-backup...
Backup Trigger Response:
Operation ID: backups:add-manual-abc123
Message: Adding a manual backup to environment in progress.
Status: 202

Creating commands for your script

Commands simplify how your script is used. Instead of editing the script or commenting out code manually, users can run it with a specific command like:

./script.sh list-sites
./script.sh backup <site-name>

At the end of your script (outside all the functions), include a conditional block that checks the arguments passed to the script:

if [ "$1" == "list-sites" ]; then
  list_sites
elif [ "$1" == "backup" ]; then
  SITE_NAME="$2"
  if [ -z "$SITE_NAME" ]; then
    echo "Utilization: $0 trigger-backup "
    go out 1
  fi
  trigger_manual_backup "$SITE_NAME"
else
  echo "Utilization: $0 trigger-backup "
  go out 1
fi

The $1 variable represents the first argument passed to the script (e.g., in ./script.sh list-sites, $1 is list-sites). The script uses conditional checks to match $1 against specific commands like list-sites or backup. If the command is backup, the script also expects a second argument ($2), the site name. If no valid command is provided, the script falls back to displaying usage instructions.
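The if/elif chain works fine for two commands. If you expect to add more, a case statement scales a little more cleanly; this is an alternative sketch, not part of the original script:

case "$1" in
  list-sites)
    list_sites
    ;;
  backup)
    [ -z "$2" ] && { echo "Usage: $0 backup <site-name>"; exit 1; }
    trigger_manual_backup "$2"
    ;;
  *)
    echo "Usage: $0 {list-sites|backup <site-name>}"
    exit 1
    ;;
esac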

You can now trigger a manual backup for a specific site by running the command:

./script.sh backup <site-name>

Advanced use case 2: Updating plugins across multiple sites

Managing WordPress plugins across multiple sites can be tedious, especially when updates are available. Kinsta does a great job handling this via the MyKinsta dashboard, through the bulk action feature we introduced last year.

But if you don't like working with user interfaces, the Kinsta API provides another opportunity: a shell script that automates identifying outdated plugins and updating them across multiple sites or specific environments.

Breaking down the workflow

1. Identify sites with outdated plugins: The script iterates through all sites and environments, looking for the specified plugin with an update available. The following endpoint is used to fetch the list of plugins for a specific site environment:

GET /sites/environments/{env_id}/plugins

From the response, we filter for plugins where "update": "available".
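Assuming the plugins response has already been fetched and cleaned into a variable, a jq filter along these lines extracts the matching outdated plugin (the field paths mirror the script shown below):

echo "$CLEAN_PLUGINS_RESPONSE" | jq -r --arg PLUGIN_NAME "$PLUGIN_NAME" \
  '.environment.container_info.wp_plugins.data[]
   | select(.name == $PLUGIN_NAME and .update == "available")
   | "\(.name): \(.version) -> \(.update_version)"'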

2. Prompt the user for update options: The script displays the sites and environments with the outdated plugin, allowing the user to select specific instances or update all of them.

3. Trigger plugin updates: To update the plugin in a specific environment, the script uses this endpoint:

PUT /sites/environments/{env_id}/plugins

The plugin name and its updated version are passed in the request body.
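Such a request might look roughly like the sketch below; the exact payload field names are an assumption here, so check the Kinsta API reference or the full script on GitHub before relying on them:

curl -s -X PUT "$BASE_URL/sites/environments/$ENV_ID/plugins" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"name\": \"$PLUGIN_NAME\", \"update_version\": \"$UPDATE_VERSION\"}"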

The script

Since the script is long, the full function is hosted on GitHub for easy access. Here, we'll explain the core logic used to identify outdated plugins across multiple sites and environments.

The script starts by accepting the plugin name from the command. This name specifies the plugin you want to update.

PLUGIN_NAME=$1

if [ -z "$PLUGIN_NAME" ]; then
  echo "Error: Plugin name is required."
  echo "Usage: $0 update-plugin <plugin-name>"
  return 1
fi

The script then uses the reusable get_sites_list function (explained earlier) to fetch all sites in the company:

echo "Fetching all websites within the corporate..."

# Fetch all websites within the corporate
SITES_RESPONSE=$(get_sites_list)
if [ $? -ne 0 ]; then
  echo "Error: Did not fetch websites."
  go back 1
fi

# Blank the reaction
CLEAN_SITES_RESPONSE=$(echo "$SITES_RESPONSE" | tr -d 'r' | sed 's/^[^{]*//')

Next comes the heart of the script: looping through the list of sites to check for outdated plugins. The CLEAN_SITES_RESPONSE, a JSON object containing all sites, is passed to a while loop that processes each site one at a time.
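One common way to feed such a loop is process substitution, with jq emitting one compact JSON object per site; the exact wiring is in the full script on GitHub, but a sketch looks like this:

while IFS= read -r SITE; do
  # ...per-site processing shown below...
done < <(echo "$CLEAN_SITES_RESPONSE" | jq -c '.company.sites[]')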

It starts by extracting some important information, like the site ID, name, and display name, into variables:

while IFS= read -r SITE; do
  SITE_ID=$(echo "$SITE" | jq -r '.id')
  SITE_NAME=$(echo "$SITE" | jq -r '.name')
  SITE_DISPLAY_NAME=$(echo "$SITE" | jq -r '.display_name')

  echo "Checking environments for site \"$SITE_DISPLAY_NAME\"..."

The site name is then used with the get_site_details_by_name function defined earlier to fetch detailed information about the site, including all its environments.

SITE_DETAILS=$(get_site_details_by_name "$SITE_NAME")
CLEAN_SITE_DETAILS=$(echo "$SITE_DETAILS" | tr -d '\r' | sed 's/^[^{]*//')

ENVIRONMENTS=$(echo "$CLEAN_SITE_DETAILS" | jq -r '.site.environments[] | "\(.id):\(.name):\(.display_name)"')

The environments are then looped through to extract details of each environment, such as the ID, name, and display name:

while IFS= read -r ENV; do
  ENV_ID=$(echo "$ENV" | cut -d: -f1)
  ENV_NAME=$(echo "$ENV" | cut -d: -f2)
  ENV_DISPLAY_NAME=$(echo "$ENV" | cut -d: -f3)

  echo "Checking plugins for environment \"$ENV_DISPLAY_NAME\"..."

For each environment, the script then fetches its list of plugins using the Kinsta API.

PLUGINS_RESPONSE=$(curl -s -X GET "$BASE_URL/sites/environments/$ENV_ID/plugins" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json")

CLEAN_PLUGINS_RESPONSE=$(echo "$PLUGINS_RESPONSE" | tr -d '\r' | sed 's/^[^{]*//')

Next, the script checks whether the specified plugin exists in the environment and has an available update:

OUTDATED_PLUGIN=$(echo "$CLEAN_PLUGINS_RESPONSE" | jq -r --arg PLUGIN_NAME "$PLUGIN_NAME" '.environment.container_info.wp_plugins.data[] | select(.name == $PLUGIN_NAME and .update == "available")')

If an outdated plugin is found, the script logs its details and adds them to the SITES_WITH_OUTDATED_PLUGIN array:

if [ ! -z "$OUTDATED_PLUGIN" ]; then
  CURRENT_VERSION=$(echo "$OUTDATED_PLUGIN" | jq -r '.version')
  UPDATE_VERSION=$(echo "$OUTDATED_PLUGIN" | jq -r '.update_version')

  echo "Outdated plugin \"$PLUGIN_NAME\" found in \"$SITE_DISPLAY_NAME\" (Environment: $ENV_DISPLAY_NAME)"
  echo "  Current Version: $CURRENT_VERSION"
  echo "  Update Version: $UPDATE_VERSION"

  SITES_WITH_OUTDATED_PLUGIN+=("$SITE_DISPLAY_NAME:$ENV_DISPLAY_NAME:$ENV_ID:$UPDATE_VERSION")
fi

This is what the logged details of outdated plugins would look like:

Outdated plugin "example-plugin" found in "Site ABC" (Environment: Production)
  Current Version: 1.0.0
  Update Version: 1.2.0
Outdated plugin "example-plugin" found in "Site XYZ" (Environment: Staging)
  Current Version: 1.3.0
  Update Version: 1.4.0

From here, we perform the plugin update for each selected environment using that endpoint. The full script is in this GitHub repository.

Summary

This article guided you through creating a shell script to interact with the Kinsta API.

Take some time to explore the Kinsta API further; you'll discover additional features you can automate to handle tasks tailored to your specific needs. You might also consider integrating the Kinsta API with other APIs to enhance decision-making and efficiency.

Finally, regularly check the MyKinsta dashboard for new features designed to make website management even more user-friendly through its intuitive interface.

The post Managing your WordPress sites with shell scripts and Kinsta API appeared first on Kinsta®.
