Auto deploy static generated websites on shared hosting

Posted by Harald Nezbeda on Mon 11 June 2018

For some time I had a question in mind: Is it possible to do an auto deploy with Git on a shared host?

When it comes to auto deployment there are various ways to do it, depending on the available tooling. A common approach is a commit hook in the version control system that triggers a script which fetches the new code and builds the static website on the server. Another widely used technique is to SSH into the server after all test pipelines succeed and push the update from there. But what if you don’t have access to the server, and the only way to bring your website online is by uploading it over FTP (this is fairly common on shared hosting services)? Well, it’s time to get creative.
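For reference, on a server you can SSH into, the commit-hook approach boils down to a small post-receive hook that checks out the pushed code and rebuilds the site. A minimal sketch of that idea (the paths and the build command are assumptions, not something from this setup):

#!/usr/bin/env python3
# Hypothetical post-receive hook for a server you control; on shared
# hosting without shell access this is exactly what is NOT available.
import subprocess

GIT_DIR = "/home/deploy/site.git"   # assumption: bare repository on the server
WORK_TREE = "/home/deploy/site"     # assumption: checkout / build directory

# Check out the freshly pushed master branch into the work tree
subprocess.run(
    ["git", "--git-dir", GIT_DIR, "--work-tree", WORK_TREE,
     "checkout", "-f", "master"],
    check=True,
)

# Rebuild the static site (the build command is an assumption as well)
subprocess.run(["make", "publish"], cwd=WORK_TREE, check=True)

On shared hosting without shell access none of this is possible, which is the limitation the rest of this post works around.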

I still have an account on a shared host; it works well and covers my needs for the small things I build every now and then. This blog is hosted there, and I wanted to make the deployment faster and easier. So here is what I currently use:

Since I do the deployment via FTP, the first thing that came to mind was to automate this process on my local machine. This means writing a script that uploads the new build whenever I trigger it. I decided to do this in Python and started with the built-in FTP library, ftplib, but that requires too much work (you need to list all your files, create the folders on the server, upload the files, avoid duplicates, and so on). Luckily there is a library called pyftpsync that can handle this task.
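Just to illustrate why I dropped the plain ftplib route, here is roughly what the manual version would look like; this is only a sketch, without duplicate detection or error handling:

import os
from ftplib import FTP, error_perm

def upload_tree(ftp, local_dir, remote_dir):
    """Recursively mirror local_dir to remote_dir -- everything by hand."""
    try:
        ftp.mkd(remote_dir)
    except error_perm:
        pass  # directory probably exists already
    for name in os.listdir(local_dir):
        local_path = os.path.join(local_dir, name)
        remote_path = remote_dir + "/" + name
        if os.path.isdir(local_path):
            upload_tree(ftp, local_path, remote_path)
        else:
            with open(local_path, "rb") as fh:
                ftp.storbinary("STOR " + remote_path, fh)

# ftp = FTP(host=..., user=..., passwd=...)
# upload_tree(ftp, "build/prod", "/public")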

Here is the sync script I came up with, using pyftpsync:

import os
import time
from ftplib import FTP
from ftpsync.targets import FsTarget
from ftpsync.ftp_target import FtpTarget
from ftpsync.synchronizers import UploadSynchronizer

current_path = os.path.dirname(os.path.abspath(__file__))
local = FsTarget(os.path.join(current_path, 'build', 'prod'))

# Credentials
FTP_ADDRESS = os.environ['FTP_ADDRESS']
FTP_USER = os.environ['FTP_USER']
FTP_PASSWORD = os.environ['FTP_PASSWORD']
FTP_TMP_UPLOAD_PATH = '/tmp_%s' % time.strftime("%Y%m%d-%H%M%S")

# Create temp path for upload
ftp = FTP(host=FTP_ADDRESS, user=FTP_USER, passwd=FTP_PASSWORD)
ftp.mkd(FTP_TMP_UPLOAD_PATH)

# Sync build to temp path
remote = FtpTarget(FTP_TMP_UPLOAD_PATH, FTP_ADDRESS, username=FTP_USER, password=FTP_PASSWORD)
opts = {"force": False, "delete_unmatched": True, "verbose": 6}
s = UploadSynchronizer(local, remote, opts)
s.run()

# Rename old build
ftp.rename('public', 'public_%s' % time.strftime("%Y%m%d-%H%M%S"))

# Make new build public
ftp.rename(FTP_TMP_UPLOAD_PATH, 'public')
ftp.quit()

In the credentials section you can either hard-code your own credentials or set the environment variables on your system. The script still uses ftplib to rename the folders, but only because I wanted to keep the old builds around (feel free to delete the old folder if you don’t need a rollback option).
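If a deployment turns out broken, rolling back is then just another pair of renames. A quick sketch over the same kind of ftplib connection; the timestamped folder name is only an example of what the deploy script above leaves behind:

import os
from ftplib import FTP

ftp = FTP(host=os.environ['FTP_ADDRESS'],
          user=os.environ['FTP_USER'],
          passwd=os.environ['FTP_PASSWORD'])

# Put the broken build aside and promote a previous one again.
# 'public_20180611-093000' stands in for whichever old build you want back.
ftp.rename('public', 'public_broken')
ftp.rename('public_20180611-093000', 'public')
ftp.quit()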

I decided to go a step further and run that piece of code on the Git hosting side, so the build is created and uploaded automatically on each push. Bitbucket offers pipelines for tests and deployment, and the free tier of 50 minutes per month is enough for my use case.

First you must define the environment variables (remember the credentials section above). This is done by going to the project settings and then to Pipelines → Environment variables:

Bitbucket pipeline environments

The project now needs a pipeline configuration. Add a bitbucket-pipelines.yml file to the root of the project. Here is my configuration:

image: python:3.6.3

pipelines:
  branches:
    master:
      - step:
          caches:
            - pip
          script:
            - pip install -r requirements.txt
            - cd website
            - pelican content -o build/prod -s publishconf.py -t theme/clean-blog/
            - cp dotfiles/.htaccess build/prod/.htaccess
            - python ftp_deploy.py
          deployment: production

The pipeline uses the Python 3.6.3 image from Docker Hub and runs only for the master branch. For each push to master it installs the requirements (or uses the pip cache), does the Pelican production build, copies the .htaccess with some Apache configuration into the build, and runs the Python script above to sync the files. The whole process takes a little over 2 minutes:

Bitbucket pipeline runner

This idea can also be applied to other projects where you only need to add files via FTP, but I would say it should be used only for small websites. By using a pipeline you have no downtime, and you can start a deployment from any device you can push to Git from. The workflow is similar for GitLab CI, CircleCI, Travis CI or other continuous integration services, in case you want to give it a try :)