Automating Website Development with Amazon Web Services
This blog is long overdue. I have wanted to build my own website and host it for a very long time. The problem was that I did not know how to code JavaScript … and I really didn’t have the time to become a full-stack developer. I wanted to go deeper on the data science side of things and learn as much as I could about the cloud. But that nagging desire to create and maintain my own site was still there.
Then I learned about Jekyll.
I studied a great course by James Williamson on Lynda.com that went through the steps of building a Jekyll website from scratch. That’s when I knew I could achieve my goal without spending a ton of hours learning JavaScript. I actually ended up buying a Jekyll template built by Adventure Themes for 10 bucks. Then I turned my attention to automating updates to the website, using GitHub for version control and Amazon Web Services (AWS) for hosting.
High Level Approach
–> Set up S3 bucket
–> Get a domain name from Route 53 (or whatever DNS provider you use)
–> Set up CloudFront
–> Connect CodePipeline to GitHub
–> Connect CodeBuild to CodePipeline
The basis of these steps came from two main sources: 1) a course from acloud.guru titled Serverless Portfolio with React, and 2) this great blog by Alex Bilbie.
The Particulars
S3 Bucket
The first thing I did was set up an S3 bucket using the name of the website domain I bought in step two above. Predictably, I named it ‘confessionsofadatascientist.com’.
Make sure you make the bucket ‘public’, or no one will be able to see it. You will also want to select the static website hosting option.
I set index.html as the index document and was ready for the next step.
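If you prefer the command line, here is a rough sketch of those two steps with the AWS CLI. It assumes the bucket name above, an index.html index document, and that you are comfortable with a fully public read policy; adjust the names to your own setup.

```sh
# Turn on static website hosting for the bucket, with index.html as the index document.
aws s3 website s3://confessionsofadatascientist.com/ \
  --index-document index.html

# Attach a bucket policy that lets anyone read the site's objects.
cat > bucket-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::confessionsofadatascientist.com/*"
    }
  ]
}
EOF

aws s3api put-bucket-policy \
  --bucket confessionsofadatascientist.com \
  --policy file://bucket-policy.json
```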
Route 53
Let’s take a step back for one minute. I learned that I can use Route 53 to manage my domain registrations much like you can with GoDaddy or any of the other domain registrars out there. I like the fact that I don’t have to leave the AWS environment to get this taken care of, but you may prefer to use another domain manager.
CloudFront
After setting up my S3 bucket, I needed to set up my CloudFront distribution. CloudFront makes your website load quickly by caching it at edge locations: once one person in a remote location accesses your website, it stays available from cache for others in the same location for as long as your ‘TTL’, or time to live, allows (basically, how long you want the website to remain cached).
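For reference, here is a rough sketch of what creating such a distribution looks like with the AWS CLI. The origin domain name (including the us-east-1 region), the TTL values, and the file name are illustrative assumptions rather than my exact settings.

```sh
# Minimal distribution config pointing at the S3 *website* endpoint (see the note below
# about using the website endpoint rather than the plain bucket link).
cat > distribution-config.json <<'EOF'
{
  "CallerReference": "confessions-site-2018",
  "Comment": "Static website distribution",
  "Enabled": true,
  "DefaultRootObject": "index.html",
  "Origins": {
    "Quantity": 1,
    "Items": [
      {
        "Id": "s3-website-origin",
        "DomainName": "confessionsofadatascientist.com.s3-website-us-east-1.amazonaws.com",
        "CustomOriginConfig": {
          "HTTPPort": 80,
          "HTTPSPort": 443,
          "OriginProtocolPolicy": "http-only"
        }
      }
    ]
  },
  "DefaultCacheBehavior": {
    "TargetOriginId": "s3-website-origin",
    "ViewerProtocolPolicy": "redirect-to-https",
    "ForwardedValues": { "QueryString": false, "Cookies": { "Forward": "none" } },
    "MinTTL": 0,
    "DefaultTTL": 86400,
    "MaxTTL": 31536000
  }
}
EOF

aws cloudfront create-distribution --distribution-config file://distribution-config.json
```

The DefaultTTL of 86400 seconds (24 hours) is CloudFront’s default; lowering it makes edits show up faster at the cost of more trips back to the origin.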
The key thing to know here (as I specify in my daily blog) is that you need to grab the static website ‘endpoint’ address found under Properties > Static website hosting. It is really easy to mix this link up with the regular link for the S3 bucket, but the distribution won’t work as it should with the plain bucket link. The website endpoint looks something like confessionsofadatascientist.com.s3-website-us-east-1.amazonaws.com (the region depends on where you created the bucket), while the plain bucket link looks like confessionsofadatascientist.com.s3.amazonaws.com.
You put this endpoint in the ‘origin domain name and path’ field after you have created a CloudFront distribution. Then you need to go back to Route 53 and add the CloudFront distribution’s domain name as an alias in your record set. If you need more help with Route 53 – please take a look at this blog.
What will this do for you? Route 53 sends the user to the CloudFront distribution, which in turn points back to the S3 bucket where your static website files live.
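For the record set piece, here is a hedged sketch of that alias record via the AWS CLI. Your own hosted zone ID and the dxxxxxxxxxxxxx.cloudfront.net name are placeholders; the Z2FDTNDATAQYW2 value, however, is the fixed hosted zone ID AWS uses for all CloudFront distributions.

```sh
# Point the apex of the domain at the CloudFront distribution with an alias A record.
cat > alias-record.json <<'EOF'
{
  "Changes": [
    {
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "confessionsofadatascientist.com.",
        "Type": "A",
        "AliasTarget": {
          "HostedZoneId": "Z2FDTNDATAQYW2",
          "DNSName": "dxxxxxxxxxxxxx.cloudfront.net.",
          "EvaluateTargetHealth": false
        }
      }
    }
  ]
}
EOF

aws route53 change-resource-record-sets \
  --hosted-zone-id YOUR_HOSTED_ZONE_ID \
  --change-batch file://alias-record.json
```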
CodePipeline
Before you know it, you’re ready to connect your local development environment (in my case – my handy dandy Mac) to your GitHub repository. What’s nice about this is you can use git commands to add/commit/push your changes to GitHub, which will then push to your website once you’ve connected CodePipeline and CodeBuild.
As depicted above, CodePipeline monitors GitHub as the source of the application code and synchronizes the code in AWS with your GitHub account. This allows for continuous deployment, but it is CodeBuild that pushes those changes to the S3 bucket where your static website files are stored.
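Day to day, that means publishing an update is just a normal git workflow; the push is what triggers the pipeline. The commit message and branch name below are only examples.

```sh
# Make your changes to the site locally, then:
git add .
git commit -m "Add new blog post"
git push origin master   # the push to GitHub kicks off CodePipeline
```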
The settings I used for codepipeline are as follows:
Action category: Source
Source provider: GitHub
Connect to GitHub: Sign in and connect to the appropriate repository
Output artifacts: MyApp
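Those console settings map to a source stage in the pipeline’s JSON definition (the kind of thing you would see from aws codepipeline get-pipeline). This is only a rough sketch, with placeholder owner, repo, and token values:

```json
{
  "name": "Source",
  "actions": [
    {
      "name": "Source",
      "actionTypeId": {
        "category": "Source",
        "owner": "ThirdParty",
        "provider": "GitHub",
        "version": "1"
      },
      "configuration": {
        "Owner": "your-github-username",
        "Repo": "your-repo-name",
        "Branch": "master",
        "OAuthToken": "****"
      },
      "outputArtifacts": [
        { "name": "MyApp" }
      ],
      "runOrder": 1
    }
  ]
}
```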
CodeBuild
Finally, CodeBuild will build your code and deploy it. You could use CodeDeploy, but for a simple static website that is not necessary. There is one trick that took me a while to figure out because I’m new to this area: you need to put a ‘buildspec.yml’ file in the root of your repository.
The commands in this file will run when CodeBuild kicks off – and it is this step where AWS will sync your S3 bucket’s contents with your source files (in my case, from GitHub). If you don’t add this file, you’ll never get the sync to work.
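For reference, here is a minimal sketch of what such a buildspec.yml can look like for a Jekyll site. It assumes a Bundler-based project and the bucket name from earlier; your Gemfile, output folder, and bucket may differ.

```yaml
version: 0.2

phases:
  install:
    commands:
      # Install the site's Ruby dependencies (Jekyll itself comes from the Gemfile).
      - gem install bundler
      - bundle install
  build:
    commands:
      # Build the static site into the _site/ folder.
      - bundle exec jekyll build
  post_build:
    commands:
      # Sync the built site to the S3 bucket, removing files that no longer exist.
      - aws s3 sync _site/ s3://confessionsofadatascientist.com --delete
```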
To finish this off – I’ll share my current settings for CodeBuild:
Project name: confessions-website
Current source: GitHub repository name
Current image: aws/codebuild/ruby:2.3.1
Current build specification: using buildspec.yml in the source code root directory
Artifact type: Amazon S3
Name: confessions-website
Namespace type: none
Bucket name: confessionsofadatascientist.com
Type: no cache
Role name: created a new service role for CodeBuild
VPC: no VPC
When you create the role, there is an important addition you will want to make to the role’s JSON policy. Review the Alex Bilbie blog (linked above) for the particulars.
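In broad strokes, the addition gives the CodeBuild service role permission to read and write the website bucket so the aws s3 sync in the buildspec can actually update the site. This is a hedged sketch of what such a statement can look like; the exact actions and resources in the Alex Bilbie post may differ, and the role and policy names here are placeholders.

```sh
# Attach an inline policy to the CodeBuild service role so it can sync the bucket.
cat > codebuild-s3-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": [
        "arn:aws:s3:::confessionsofadatascientist.com",
        "arn:aws:s3:::confessionsofadatascientist.com/*"
      ]
    }
  ]
}
EOF

aws iam put-role-policy \
  --role-name YOUR_CODEBUILD_ROLE_NAME \
  --policy-name codebuild-s3-sync \
  --policy-document file://codebuild-s3-policy.json
```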
Conclusion
Hopefully, this blog is helpful and informative. I know there are probably a lot of things I could have done differently or more efficiently, but I find I learn best by wading into the deep waters and giving it a go. Please hit me up on LinkedIn if you’d like to message me.