How to create a GCP SQL instance using your command line

In continuation of my series on using GCP from your command line, I will show you how to quickly set up a SQL instance from your preferred OS command line, whether you are using Windows, Linux, or macOS. We will then authorize a GCP compute instance to access it, create a Cloud Storage bucket, store an image in it, and add a link to that image in the GCP compute instance. This is one of the qwiklabs from the ongoing Google, Andela, and Pluralsight Scholarship partnership for Africa. This lab [] in particular.

I prefer to work from the terminal as it trains my memory: it really helps me remember the commands, what they do, and how to use them. It is also much faster in my experience. In the Linux world where I live, we have a common saying that when all else fails the terminal is always there, and when working with servers a GUI is usually not an option.


Prerequisites:

  1. Cloud SDK installed
  2. Admin access on your machine
  3. Google Cloud Platform account
  4. Internet connection

Let us begin by stating the steps we will use to achieve our task. These will be:

  1. Create the SQL instance.
  2. Create your database.
  3. Create the user.
  4. Create a GCP compute instance and authorize it to connect to the SQL instance.
  5. Connect to the SQL instance.
  6. Set up the Google Cloud Storage bucket and update the compute instance to host an image from it.

Create the SQL instance:

gcloud sql instances create blog-db --root-password="admin" --database-version=MYSQL_5_7 --zone=us-central1-a

For the MySQL version I used the one from the tutorial, 5.7, which was the latest version as of the time of writing this article; that version has more data storage space and more features, see the release notes:

  • Second Generation (recommended)
    ○ Up to 7x throughput and 20x storage capacity of First Generation
    ○ Up to 208 GB of RAM and 10 TB of data storage
    ○ MySQL 5.6 or 5.7
    ○ InnoDB only
  • First Generation
    ○ Up to 16 GB of RAM and 500 GB of data storage
    ○ MySQL 5.5
    ○ IPv6 connectivity and On Demand activation policy

The other available database versions include:

  • PostgreSQL 9.6
  • PostgreSQL 11

And the following beta versions of SQL Server:

  • SQL Server 2017 Standard
  • SQL Server 2017 Enterprise
  • SQL Server 2017 Express
  • SQL Server 2017 Web
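If you wanted one of the other database versions instead, the same create command takes a different --database-version value. A minimal sketch, assuming a hypothetical instance name "blog-pg" (not part of this lab):

```shell
# Hypothetical example: create a PostgreSQL 11 instance instead of MySQL 5.7.
# The instance name "blog-pg" is illustrative; the zone matches the lab's.
gcloud sql instances create blog-pg \
  --database-version=POSTGRES_11 \
  --root-password="admin" \
  --zone=us-central1-a
```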

Create the database (though we will not be using it per se):

gcloud sql databases create host-db --instance="blog-db"

Create the user:

gcloud sql users create blogdbuser --instance=blog-db --host='%' --password='password'
  • The above command and its arguments are self-explanatory except for the "%": per the "help" documents of gcloud, it is the wildcard for the host IP address, meaning this user can connect from any host.
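To confirm the user was created, you can list the users on the instance. This is a verification step I have added; it is not part of the lab:

```shell
# List users on the blog-db instance; "blogdbuser" should appear with host "%"
gcloud sql users list --instance=blog-db
```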

Create a GCP compute instance and authorize it:

  • Create a file of type ".sh"; in Linux the extension doesn't really matter, it just tells us what the file does. Call the file anything you like (I will call mine ""), and add these lines to it:
#! /bin/bash
apt-get update
apt-get install apache2 php php-mysql -y
service apache2 start
  • Create, for Linux users:
gcloud compute instances create hostvm --machine-type=g1-small --zone us-central1-a --tags="http-server" --metadata-from-file=startup-script='/path/to/'
  • Create, for Windows users:
gcloud compute instances create hostvm --machine-type=g1-small --zone us-central1-a --tags="http-server" --metadata-from-file=startup-script="C:\Path\to\"
  • Authorize, here we get the IP address and then authorize:
gcloud compute instances list
gcloud sql instances patch blog-db --authorized-networks=""
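To double-check that the network was authorized, you can describe the instance and look for the authorized networks in its settings. This is a verification step I have added, not part of the original lab:

```shell
# Show the blog-db instance details; the patched IP should appear
# under settings.ipConfiguration.authorizedNetworks
gcloud sql instances describe blog-db
```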

I used the IP address seen above; yours will be different. Now only this GCP compute instance will be able to connect to the SQL instance we created. Let's connect to it:

  • SSH into the compute instance:
gcloud compute ssh hostvm --zone us-central1-a
  • Change into the folder of interest:
cd /var/www/html
  • Edit the file using the "nano" text editor:
sudo nano index.php
  • Use the command below to get the IP address of the SQL instance and use that in the variable "$dbserver":
gcloud sql instances list
  • Paste the lines below into this file:
<?php
$dbuser = "blogdbuser";
$dbserver = "";
$dbpassword = "password";
$conn = new mysqli($dbserver, $dbuser, $dbpassword);
if (mysqli_connect_error()) {
    echo ("Database connection error: " . mysqli_connect_error());
} else {
    echo ("Database connection succeeded");
}
?>

Save this file by hitting this key combination on your keyboard (nano uses the Ctrl key on all systems, including macOS):

Ctrl + x
y
Enter

Now we restart the server:

sudo service apache2 restart

Now open your browser of choice and enter the external IP address of the compute instance in the format "" and you should see the text "Database connection succeeded".
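You can also check from any terminal with curl instead of a browser. The IP address below is a hypothetical placeholder; use your VM's external IP from "gcloud compute instances list":

```shell
# Fetch index.php from the compute instance's external IP (placeholder address)
curl http://203.0.113.10/index.php
```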

Set up the Google Cloud Storage bucket and update the compute instance to host an image from it. The steps are:

  • Create the bucket
  • Upload an image
  • SSH into your GCP compute instance, modify the "index.php" file to host the image, and restart the server once more.

Create the bucket from your terminal window (bucket names are globally unique; here the project ID is used as the bucket name):

gsutil mb -l us gs://project_id

Upload an image:

  • Windows users:
gsutil cp "C:\Path\to\image" gs://project_id
  • Linux users:
gsutil cp "/path/to/image" gs://project_id
  • Make the image publicly accessible:
gsutil acl ch -u AllUsers:R gs://project_id/*

This reads “ch” — change the “acl” — access control list to allow all users read access to items in that bucket.
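To verify the change, you can print the ACL of an uploaded object. This is a check I have added; the object name below is a hypothetical placeholder:

```shell
# Show the ACL for the uploaded image; AllUsers should have READ access
gsutil acl get gs://project_id/image.png
```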

Now add the image URL to the index.php file on the server and restart it. I will leave this to you. Tip: you need to SSH into the server like we did previously and add the line:

<img src='https://storage.googleapis.com/<project_id>/<image_file_name>' />

Note that bucket URLs are in the format "https://storage.googleapis.com/<project_id>/<storage_item_name>". Now check out your server in any browser using the external IP with "/index.php" added at the end of the URL, like we did previously. Please leave a clap if you found this tutorial useful.
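That URL format can be sketched in shell with variable substitution. The project ID and file name below are hypothetical placeholders; substitute your own:

```shell
# Build a public Cloud Storage URL from a bucket (project ID) and object name
PROJECT_ID="my-project"   # hypothetical bucket name
IMAGE="image.png"         # hypothetical object name
URL="https://storage.googleapis.com/${PROJECT_ID}/${IMAGE}"
echo "$URL"
```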

DevOps | FullStack developer | Python::Flask | GCP Cloud Certified | AWS & AZURE Cloud Savvy | Linux Sysadmin | Google IT Support