Serverless Blog with CI/CD

Updated 12/31/2019 - Changes to the build environment version and to the buildspec.yml file

Notes

Before we get into anything, I thought I would point out that I’m doing all of this on WSL Ubuntu 18.04 LTS. Your commands may vary a little if you are running something else. I’ve tested the steps below on CentOS 8 and had no issues; I just had to change apt-get to dnf, and everything else worked fine.

Shell setup

The first thing I do is make sure that my system is up to date and that I have:

  • Git
  • tmux
  • zsh
  • oh-my-zsh
sudo apt-get update
sudo apt-get upgrade
sudo apt-get install git tmux zsh -y
sh -c "$(curl -fsSL https://raw.githubusercontent.com/robbyrussell/oh-my-zsh/master/tools/install.sh)"

Once that’s complete, and I’ve copied my config files from my private git repo, we can install Hexo.

Hexo Install

Now that we have our shell all set up, we need to get Hexo set up on our machine.

We are going to install Node via NVM, and then Hexo.

  1. Install NVM
    curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.35.1/install.sh | bash
  2. Install Node LTS
    nvm install --lts
  3. Install Hexo
    npm install -g hexo-cli
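
To confirm everything landed, a quick version check helps (note that hexo version prints the CLI version; inside a site folder it also lists the local Hexo install):

  node --version
  npm --version
  hexo version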

Once complete, we need to set up our initial blog file structure.

hexo init docs
cd docs
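
For reference, hexo init lays down a scaffold that looks roughly like this (details vary slightly by Hexo version):

  docs/
  ├── _config.yml     # site configuration, edited in the next step
  ├── package.json
  ├── scaffolds/      # templates used by hexo new
  ├── source/
  │   └── _posts/     # published posts live here
  └── themes/         # the default landscape theme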

Edit the _config.yml file and update the site information.

title: Serverless Blog
subtitle:
description: This is a serverless blog!
keywords:
author: RLGeeX
language: en
timezone: Etc/UTC

url: http://docs.rlgeex.com
root: /

Start the built-in Hexo server to verify that everything is working.

hexo server

You should get a localhost link and a page that looks like the one below.

Shell Server

Docs Example

There are a number of themes out there, and customizing one for your own needs is pretty straightforward.

I personally create new pages as drafts using hexo new draft Title, and then publish them once I’m finished with the document using hexo publish draft Title. Publishing a document moves it from the _drafts folder to the _posts folder, making it public when your site is published.
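
A minimal sketch of that workflow (the post title here is just an example):

  hexo new draft "My First Post"      # creates source/_drafts/My-First-Post.md
  hexo server --draft                 # the --draft flag also renders files in _drafts
  hexo publish draft "My First Post"  # moves it to source/_posts

In the next step we’ll go over both CodeCommit and GitHub as the repository for your new serverless blog.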

Code Repository

Now that we have a working blog locally, our next step is to get it into a code repository. I’ve created sections for both CodeCommit and GitHub. I personally use CodeCommit to keep all of my stuff in my personal AWS account, but I also understand that a large number of users will be on GitHub.

CodeCommit

  1. Create a new code repo for our blog; I’m calling mine docs.

Code Commit 1

  2. cd docs

  3. git init

  4. git remote add origin ssh://git-codecommit.us-west-2.amazonaws.com/v1/repos/docs

  5. vi .gitignore. My .gitignore is below; it has been working for me, but you may want to make some changes as you extend your blog.

    .DS_Store
    Thumbs.db
    db.json
    *.log
    node_modules/
    public/
    .deploy*/
  6. vi buildspec.yml. We will need this buildspec.yml file later when we set up CodeBuild. Here is a copy of what I’m currently using; replace the S3 path with what you plan on using for your bucket and website. (A quick local dry-run of these commands is sketched after the steps below.)

    version: 0.2

    phases:
      install:
        runtime-versions:
          nodejs: 12
        commands:
          - npm install -g hexo-cli hexo-generator-json-content
          - npm install
      build:
        commands:
          - hexo generate
      post_build:
        commands:
          - cd public/ && aws s3 sync . s3://docs.rlgeex.com --delete --size-only
  7. git add .

  8. git commit -m "first commit"

  9. git push -u origin master

Populated Code Commit
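
Before relying on the pipeline, you can sanity-check the buildspec commands locally, as mentioned in step 6. This assumes your local AWS credentials can reach the bucket we create in the Bucket Hosting section; --dryrun only prints what the sync would change:

  hexo generate
  cd public/ && aws s3 sync . s3://docs.rlgeex.com --delete --size-only --dryrun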

GitHub

  1. Create a new code repo for our blog; I’m calling mine docs.

Github Creation

  2. cd docs

  3. git init

  4. git remote add origin git@github.com:johnafogarty4/docs.git

  5. vi .gitignore. My .gitignore is below; it has been working for me, but you may want to make some changes as you extend your blog.

    .DS_Store
    Thumbs.db
    db.json
    *.log
    node_modules/
    public/
    .deploy*/
  6. vi buildspec.yml. Create the same buildspec.yml shown in the CodeCommit section above; CodeBuild reads it from the repo regardless of which source provider you use.

  7. git add .

  8. git commit -m "first commit"

  9. git push -u origin master

Populated Github

Bucket Hosting

  1. Navigate to S3 in the AWS console

S3

  2. Click on Create Bucket

S3 Create Bucket

  3. Set up your Bucket Name and Region, then click on Next

S3 Bucket Name and Region

  4. Since this bucket is just hosting a website, there’s no need to configure any of these options unless you want to. Click Next

S3 Bucket Options

  5. Since this is going to host a public website, uncheck Block all public access and click Next

Set permissions

  6. Confirm all of your settings, and click Create Bucket

S3 Create bucket

  7. Click on your bucket, go to Permissions and then Bucket Policy, and add the policy below, replacing the resource with your bucket. Click Save

    {
      "Id": "Policy1522074684919",
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "Stmt1522074683215",
          "Action": [
            "s3:GetObject"
          ],
          "Effect": "Allow",
          "Resource": "arn:aws:s3:::docs.rlgeex.com/*",
          "Principal": "*"
        }
      ]
    }

    S3 Bucket Policy

  8. Click on the Properties tab, and then Static website hosting. Choose Use this bucket to host a website and fill in the Index document, clicking Save when done.

S3 Website Bucket
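
If you prefer the command line, a roughly equivalent bucket setup with the AWS CLI would look like the sketch below; the bucket name matches the one above, the region is assumed to be us-west-2 to match the rest of this post, and bucket-policy.json is assumed to hold the policy from step 7:

  aws s3api create-bucket --bucket docs.rlgeex.com --region us-west-2 \
      --create-bucket-configuration LocationConstraint=us-west-2
  aws s3api put-public-access-block --bucket docs.rlgeex.com \
      --public-access-block-configuration BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false
  aws s3api put-bucket-policy --bucket docs.rlgeex.com --policy file://bucket-policy.json
  aws s3 website s3://docs.rlgeex.com/ --index-document index.html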

ACM Certificate

  9. Open up the Certificate Manager from the AWS Console.

Certificate Manager

  10. Make sure you are in the N. Virginia (us-east-1) region.

us-east-1

  11. Click on Request a certificate

Certificates Request

  12. Leave Request a public certificate selected and click Request a certificate

Public certificate request

  13. Add the Domain name you created your bucket for, and click Next

Domain Name

  14. I prefer DNS validation since it’s a quick record or two and you’re done, but select the validation method you prefer. Click Next

Certificate select validation method

  15. Add any tags you would like; if there are none or you have finished, just click Review

Certificates Tags

  16. Verify everything looks correct, and click Confirm and request

Certificates Confirm and request

  17. Expand the Domain that you are creating your certificate for, and note the CNAME that you need to create. If you are using Route 53 like I am, you can simply click the Create record in Route 53 button; otherwise, create your record manually. Click Continue

Certificates DNS Validation

  18. Once your validation record exists, your certificate will be validated, and the status will change to Issued

Certificates Issued
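
If you’d rather script this, the same request can be made with the AWS CLI; the certificate ARN placeholder below comes from the first command’s output, and us-east-1 is required because CloudFront only uses certificates from that region:

  aws acm request-certificate --domain-name docs.rlgeex.com \
      --validation-method DNS --region us-east-1
  aws acm describe-certificate --certificate-arn <certificate-arn> \
      --region us-east-1 --query 'Certificate.Status'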

CloudFront CDN

  19. Navigate to CloudFront in the AWS console.

Cloudfront

  20. Click on Create Distribution

Cloudfront Create Distribution

  21. Click on Get Started under Web

Cloudfront Get Started

  22. Fill in Origin Domain Name with the static bucket we configured.
  23. Select Redirect HTTP to HTTPS in the Viewer Protocol Policy section.
  24. Set Object Caching to Customize, Maximum TTL to 3600, and Default TTL to 600.

Cloudfront Origin Settings

  25. Under Distribution Settings select the Price Class that you would like to use.
  26. Set the Alternate Domain Names (CNAMEs) to the DNS name we created.
  27. Set the SSL Certificate to Custom SSL Certificate and choose the one issued in Step #18.
  28. Set the Security Policy as high as you are comfortable with.

Cloudfront Distribution Settings

  29. Click Create Distribution

Cloudfront Create Distribution

  30. Wait for your Distribution to be Deployed

Cloudfront Deployed
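
Deployment can take a while. One way to watch the status without refreshing the console is the CLI (the --query expression is just an illustration):

  aws cloudfront list-distributions \
      --query 'DistributionList.Items[].{Id:Id,Domain:DomainName,Status:Status}'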

Route 53

  31. Navigate to Route 53 in the AWS Console.

Route 53

  32. Click on Hosted Zones

Route 53 Dashboard

  33. Click on the domain name in use.

Route 53 Zone list

  34. Click on Create Record Set

Route 53 Create Record Set

  35. Enter the Name we set up in CloudFront, select Alias: Yes, choose the correct CloudFront Alias Target, and click Create

Route 53 Alias
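
For reference, the same alias record can be created from the CLI. The zone ID and distribution domain below are placeholders you would fill in, while Z2FDTNDATAQYW2 is the fixed hosted zone ID that CloudFront alias targets always use:

  aws route53 change-resource-record-sets --hosted-zone-id <your-zone-id> \
      --change-batch '{
        "Changes": [{
          "Action": "UPSERT",
          "ResourceRecordSet": {
            "Name": "docs.rlgeex.com",
            "Type": "A",
            "AliasTarget": {
              "HostedZoneId": "Z2FDTNDATAQYW2",
              "DNSName": "<your-distribution>.cloudfront.net",
              "EvaluateTargetHealth": false
            }
          }
        }]
      }'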

CodePipeline

Now that we have our blog in a repo and the rest of the underlying infrastructure is set up, it’s time to automate the build and publish of our new site whenever code is checked into the master branch. To accomplish that, we’re going to use AWS CodePipeline and AWS CodeBuild.

  36. Navigate to CodePipeline in the AWS Console.

Pipeline

  37. Click Create Pipeline.

Create Pipeline

  38. Enter a name for your Pipeline, and allow a new service role to be created.

Name Pipeline

  39. Select your source provider.

GitHub

  1. You have chosen GitHub

    Github choice

  2. Click Connect to GitHub. This will open a separate window where you sign into your GitHub account. Once signed in, you must grant repo access to AWS CodePipeline. This is the communication link between your GitHub repo and CodePipeline.

    Github provider

  3. Enter your repository name and choose the branch that you want to trigger your build, in this case master.

    Github Repo

CodeCommit

  1. You have chosen CodeCommit

    Codecommit Choice

  2. Enter your repository name and choose the branch that you want to trigger your build, in this case master.

    CodeCommit Choices

  40. Click Next.

  41. For the Build provider we are going to choose AWS CodeBuild.

Build Provider

  42. Select Create a new build project.

Build Project

  43. Enter a name for your Build project.

Codebuild Project Configuration

  44. For the Environment image we will use a Managed image, with Ubuntu as the operating system and Standard as the runtime, creating a new service role for this purpose.

CodeBuild Environment AMI

  45. Leave Build specification as the buildspec.yml option.

Codebuild Buildspec

  46. Leave CloudWatch logs checked with the defaults, unless you have some other method set up for logging.

Codebuild Logs

  47. Click on Continue to CodePipeline

Continue to CodePipeline

  48. Click on Next

CodePipeline Build stage Next

  49. We won’t be using a deployment provider, so click Skip Deploy Stage.

No Deployment

  50. Click Skip

CodeBuild Skip Deployment confirm

  51. Before clicking on Create Pipeline, we have to update permissions so that CodeBuild can upload to S3. Open the IAM console in another tab.

IAM

  52. Click on Roles

IAM Roles

  53. Find the Role that was created for CodeBuild and click on it.

IAM CodeBuild Role

  54. Copy the Role ARN and save it

IAM CodeBuild ARN

  55. Click Attach Policies

IAM Attach Policies

  56. Click Create Policy

IAM Attach Create Policy

  57. Click the JSON tab and paste the section below between the Statement [ ], being careful to update the resource ARNs, and click on Review Policy

IAM Attach Review Policy

{
  "Effect": "Allow",
  "Resource": [
    "arn:aws:s3:::docs.rlgeex.com"
  ],
  "Action": [
    "s3:Get*",
    "s3:Put*",
    "s3:List*",
    "s3:Delete*",
    "s3:Restore*"
  ]
},
{
  "Effect": "Allow",
  "Resource": [
    "arn:aws:codecommit:us-west-2:368934795268:docs",
    "arn:aws:codecommit:us-west-2:368934795268:docs/*"
  ],
  "Action": [
    "codecommit:GitPull"
  ]
},
{
  "Effect": "Allow",
  "Resource": [
    "arn:aws:s3:::codepipeline-us-west-2-*"
  ],
  "Action": [
    "s3:PutObject",
    "s3:GetObject",
    "s3:GetObjectVersion",
    "s3:GetBucketAcl",
    "s3:GetBucketLocation"
  ]
}
  58. Give your policy a meaningful name, and click Create Policy

    IAM Attach Name and Create Policy

  59. Click on Roles again and select your codebuild-docs role.

IAM Check Role

  60. If the policy that you just created is not attached, click on Attach Policies, select your policy, and click on Attach Policy

IAM Attach New Policy

  61. You should now have two policies attached to your role.

IAM Verify role policy count
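
You can also verify the attachment from the CLI, assuming your role name matches the one CodeBuild generated above:

  aws iam list-attached-role-policies --role-name codebuild-docs-service-role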

  62. Navigate to S3 in the AWS console

S3

  63. Click the bucket we created earlier

S3 Bucket List

  64. Click on the Permissions tab and then Bucket Policy, and add the policy below, replacing the resource with your bucket and the Principal with the Role ARN you saved in Step #54. Click Save

S3 Bucket Policy Update

  65. The complete policy will look like:
    {
      "Version": "2012-10-17",
      "Id": "Policy1522074684919",
      "Statement": [
        {
          "Sid": "Stmt1522074683215",
          "Effect": "Allow",
          "Principal": "*",
          "Action": "s3:GetObject",
          "Resource": "arn:aws:s3:::docs.rlgeex.com/*"
        },
        {
          "Sid": "Stmt1568736826502",
          "Effect": "Allow",
          "Principal": {
            "AWS": "arn:aws:iam::368934795268:role/service-role/codebuild-docs-service-role"
          },
          "Action": [
            "s3:Get*",
            "s3:Put*",
            "s3:List*",
            "s3:Delete*",
            "s3:Restore*"
          ],
          "Resource": "arn:aws:s3:::docs.rlgeex.com/*"
        }
      ]
    }
  66. Review your pipeline, and if you are happy with it, click Create Pipeline.

Create Pipeline

If everything was created correctly, your first build will pass and upload to your bucket, and you can access the site via its DNS name once the CloudFront distribution finishes deploying.

Success
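
You can also poll the pipeline from the CLI while it runs; the pipeline name here is assumed to match the one you created above:

  aws codepipeline get-pipeline-state --name docs \
      --query 'stageStates[].{stage:stageName,status:latestExecution.status}'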

In the very near future I will write another post detailing how to create all of the AWS items with both Terraform and CloudFormation. Once it’s published, I’ll be sure to link it here.