S3 Hosting

While preparing for my exams I had purchased this domain – dangilpin.click – to use during the labs for S3, Route 53, CloudFront, etc. At the time I also had three existing web sites running on traditional hosts. I kept this domain to repurpose as a portfolio site, and have since transferred the other three domains to AWS. I expected some content would have to go since I would no longer have PHP. But as it turns out, Google Maps, Google Analytics, and my contact forms all use JavaScript and remote servers to function – so they work without modification!

The only thing I had to leave behind was a PHP/MySQL time-tracking app that I didn’t use anymore anyway (iPhone apps do it better now). I will save that as a good serverless project!

Now I have four sites served serverless-style via AWS! I was spending on the order of $200 per year in hosting fees; now I will spend only about $30!

There are a few things to do yet:

  • get the www requests to go to the apex/naked domain name
  • get SSL certs for https requests
  • add dynamic content via Lambda and API Gateway
  • replicate WordPress-like functionality using S3
  • maybe an IoT project or two
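For the first item, the usual approach is a second S3 bucket named for the www subdomain whose only job is to redirect to the apex. A sketch of that bucket’s static website configuration (applied to a hypothetical bucket named www.dangilpin.click, with a Route 53 alias record pointing www at its website endpoint):

```json
{
  "RedirectAllRequestsTo": {
    "HostName": "dangilpin.click"
  }
}
```

Once the SSL cert item is done, a `"Protocol": "https"` key can be added so the redirect lands on the secure site.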

Glacier as a Data “Junk Drawer”

I have a laptop with a small SSD that is filling up. I already have network and local backups, and those drives are filling up, too. Faced with buying a new internal SSD, and with quite a few AWS credits to use, I decided to try moving some of this data to S3 and Glacier. Glacier is really cheap – but only for long-term archives that will not be accessed much.

I used S3 bucket lifecycle policies to migrate new files to Glacier. One bucket has no expiration (keep forever), and two more buckets have expirations of one year and three years in their policies. Data moved to those buckets will be “self-cleaning” if I don’t intervene before the expiration. I put files there that I want to get off my local HD, but am afraid to delete just yet.
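For reference, here is roughly what the one-year bucket’s lifecycle configuration looks like (a sketch – the 30-day wait before transitioning to Glacier is an assumption, not something described above; tune it to taste):

```json
{
  "Rules": [
    {
      "ID": "to-glacier-expire-1yr",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
```

The three-year bucket is the same shape with `"Days": 1095` in the expiration, and the keep-forever bucket simply omits the `Expiration` block.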

I created a new IAM user with S3-only administrative privileges. Most of what I moved to Glacier consists of media – personal photos and videos that I don’t need to have around, but can’t delete either. I exported the photos from albums into folders with descriptive names and dates so I could retrieve them easily, then zipped the folders individually. Using Chrome and the AWS console to “drag-and-drop” a smallish batch of zip files at a time (5GB or so) seemed to work best over my relatively slow home connection. I did have trouble with AWS logging me out of the console when I was not using the root account – so keeping the batches small minimized recovery from incomplete uploads.
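The policy behind that user amounts to full S3 access and nothing else – something like this sketch (attaching AWS’s managed AmazonS3FullAccess policy is the easy way to get the same effect):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": "*"
    }
  ]
}
```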

It took a bit of time and effort, but I now have 100GB free on my internal SSD and have cleared a lot off of the network drives. I am going to stop there and see how the billing looks after a month or so. I anticipate it costing about 40 cents per month per 100GB, plus any data transfer and request fees. That is pretty cheap for this use case. But if I ever get closer to a TB or more, some of the “unlimited” cloud backup services will probably be more cost effective, at the cost of some durability.
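The arithmetic behind that estimate is trivial – sketched below, assuming Glacier storage at roughly $0.004 per GB-month (an assumed historical rate; check current regional pricing, and remember request and retrieval fees are extra):

```python
# Back-of-envelope Glacier storage cost.
# ASSUMPTION: ~$0.004 per GB-month; actual pricing varies by region and over time.
GLACIER_PRICE_PER_GB_MONTH = 0.004

def monthly_storage_cost(gb_stored: float) -> float:
    """Estimated monthly Glacier storage cost in USD (storage only;
    excludes data transfer, request, and retrieval fees)."""
    return gb_stored * GLACIER_PRICE_PER_GB_MONTH

print(monthly_storage_cost(100))   # ~$0.40 for 100 GB
print(monthly_storage_cost(1000))  # ~$4.00 for 1 TB
```

At 1 TB the ~$4/month is still modest, but that is the point where flat-rate backup services start to look competitive.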

Very nice to have my SSD back – and almost for “free”!