Katie Fenn

Building a static website with React

5th February 2017

I have been using React to build sites for nearly two years. It's a valuable tool for making cutting-edge software with vibrant user experiences, and I have grown comfortable with it. It's not normally the tool you would choose for a small development blog, but how would you go about using it if you did?

React is famous for introducing universally rendered JS components - HTML content rendered symmetrically on the server and in the browser. Rendering components only in the browser can be slow, and doing part of it on the server is a useful shortcut.

// renderToString lives in the react-dom/server module
import React from 'react';
import ReactDOM from 'react-dom/server';

let output = ReactDOM.renderToString(
  <div>
    <h1>Hello World</h1>
  </div>
);

This makes React a useful HTML templating tool that we can use in the same role as Handlebars, Twig or Haml. React is remarkably relaxed about how and where you use it, and I want to take advantage of this as the basis of my super-modern static site generator.

I also have an ulterior motive. Much has been said about using tools like React to make sophisticated "single page apps". While it is true that React is a superb tool for this, many developers don't work on SPAs, and they want to know what the latest tools can do for them too.

Gulp

Before we get started with React, we need a static site generator to build our site around. The only one I've used before is Jekyll, which is a great tool for beginners, but its prescriptive design doesn't give me any room to use my own tools. Gulp, on the other hand, is designed for building bespoke pipelines, and its library of plugins lets me replicate common static site features such as Markdown parsing.

gulp-markdown serves exactly this purpose. It transforms Markdown files into HTML, and is the first step in turning my articles into web pages.

I use gulp-rename to strip date prefixes from the folders that contain my articles. I also use it to rename the main article files to index.html, so that each article loads with a friendly URL like www.katiefenn.co.uk/article-name/.
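The renaming step amounts to a small string transform. Here is a sketch of the logic (the function name and exact pattern are illustrative, not gulp-rename's API):

```javascript
// Strip a "YYYY-MM-DD-" date prefix from an article directory name
// and point the result at a friendly index.html path.
function friendlyPath(dirname) {
  const slug = dirname.replace(/^\d{4}-\d{2}-\d{2}-/, "");
  return slug + "/index.html";
}

friendlyPath("2017-02-05-static-site-react");
// "static-site-react/index.html"
```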

gulp-sort orders my articles so that they appear in my index files in reverse date order, using the date prefixes on the article directory names.
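Because the prefixes are ISO dates, reverse date order is simply reverse lexicographic order on the names. A sketch of the comparator (gulp-sort actually compares vinyl files; plain path strings stand in for them here):

```javascript
// Newest-first ordering via the "YYYY-MM-DD" directory prefixes:
// ISO dates sort correctly as plain strings.
function byDateDescending(a, b) {
  return a < b ? 1 : a > b ? -1 : 0;
}

const articles = [
  "2015-12-27-2015-in-review",
  "2017-02-05-building-a-static-website-with-react",
  "2015-07-26-a-new-website"
];
articles.sort(byDateDescending);
```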

gulp-filter is useful for making Gulp ignore other files stored with my articles, such as images and notes that I don't want to publish.

gulp-replace is perfect for adding permalinks to article headings.
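As a sketch of the kind of substitution involved (the pattern and anchor markup are illustrative, not the site's actual ones), gulp-replace can rewrite headings like this:

```javascript
// Wrap each <h2> heading in a permalink anchor derived from its text.
function addPermalinks(html) {
  return html.replace(/<h2>([^<]+)<\/h2>/g, (match, text) => {
    const slug = text.toLowerCase().replace(/[^a-z0-9]+/g, "-");
    return `<h2 id="${slug}"><a href="#${slug}">${text}</a></h2>`;
  });
}
```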

These plugins are almost enough to make a basic static site work; all that's left is to inject the processed content into an HTML template to make it look nice. gulp-handlebars would normally be a good way of doing this, but this is where I start doing things differently.

Custom render-to-html plugin

As far as I know, there are no Gulp plugins to render piped HTML content into React components (know better? Let me know!). The custom plugin I wrote for doing this, gulp-render-to-string, takes HTML piped into it, creates a new instance of a React element with props from its arguments, and renders it to a string.

// Inject article content into site templates
// (server-rendered React components)
.pipe(jsxTemplate(PageIndex, {
  pageCount: Math.ceil(articleCount / ARTICLES_PER_PAGE)
}))
.pipe(jsxTemplate(Page))

Looking inside the source of the plugin, we can see that it encapsulates ReactDOM.renderToString:

// Inside the plugin's stream transform, the piped file's contents
// become the component's children:
let contents = file.contents.toString("utf8");
let output = ReactDOM.renderToString(
  <Component {...props}>
    { contents }
  </Component>
);

file.contents = new Buffer(output);

There's just one part of the puzzle missing for making a working blog: paginated index pages.

Custom pagination plugin

Navigation of articles on my site has always been through collating articles into numbered pages. It's a common method of navigating blog-like sites, and I am somewhat surprised that a Gulp plugin does not exist to do this.

I created another custom Gulp plugin, gulp-paginate, which holds files piped into it until the number of articles held reaches a configurable articles-per-page setting. It then creates a new file with the contents of the held files appended together.
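Stripped of the vinyl stream plumbing, the grouping logic gulp-paginate performs can be sketched as a pure function:

```javascript
// Group articles into pages of a fixed size, concatenating the
// contents of each group into a single page.
function paginate(articles, articlesPerPage) {
  const pages = [];
  for (let i = 0; i < articles.length; i += articlesPerPage) {
    pages.push(articles.slice(i, i + articlesPerPage).join("\n"));
  }
  return pages;
}
```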

CSS Modules and PostCSS

CSS Modules is one of the most exciting new developments in the world of CSS. It is a suite of tools that protect styles from being freely overwritten. This helps reduce the problem of unexpected side-effects when making changes to a site's styles. It also complements building a site with React, by placing a focus on reusable user interface components instead of reusable styles.

CSS Modules is supported by both PostCSS and WebPack. I decided not to use WebPack because my site has few client-side assets that need bundling. PostCSS is ideal in this case: it can pre-process modular stylesheets for a static website and is easier to set up than WebPack.

When PostCSS processes my stylesheets, it transforms class names to make them unique. It's similar to the BEM naming convention in that it adds a suffix to classes to reduce the likelihood of styles being overwritten. In order to use the new class names in my React components, PostCSS creates a JSON file that maps the old classes to the new:

import React from 'react';
const styles = require('./styles.css.json');

export default function(props) {
  return (
    <article className={ styles.article } />
  );
};

The resulting HTML looks like this:

<article class="_article_1jkjk_1">
  <h1>Building a static website with React</h1>
  ...
</article>
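Behind the scenes, the styles.css.json mapping for this component is a flat object from authored class names to generated ones, something like:

```
{
  "article": "_article_1jkjk_1"
}
```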

Gulp then concatenates stylesheets together into a single file to be served on the web.

Automating builds and deployment with Travis CI

One of the pain points of making a static website in the past was the manual process of generating and deploying the site whenever content changed. Travis CI once again proves to be an invaluable frontend development tool by doing this for me.

My introduction to CI was a service that ran unit tests on my code whenever I pushed to GitHub. The sheer flexibility of modern CI services (such as Travis) allows them to do much more than this. Whenever I push a change to my site's repository on GitHub, Travis builds my site and deploys the built code to GitHub Pages. The built code is never committed to the source repository.
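The Travis configuration for this kind of workflow is short. A sketch of the relevant parts of a .travis.yml, assuming the build script writes the site into a dist directory (your script names and paths will differ):

```
language: node_js
node_js:
  - "6"
script:
  - npm run build              # assumed: runs the Gulp build into ./dist
deploy:
  provider: pages
  skip_cleanup: true
  github_token: $GITHUB_TOKEN  # generated on GitHub, set in Travis settings
  local_dir: dist
  on:
    branch: master
```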

Domenic Denicola's article explains the entire process for setting this up.

Future improvements

There are some things I hope to work on to improve the site in the future.

In Summary

Frontend tooling is steaming ahead at a breathtaking pace. It's easy to become disillusioned with the latest tools, and there have been times when I felt permanently left behind by the community.

As tools like React have matured, the community has started filling in the blanks to make them more flexible for use outside of single page applications. Although there is still work to be done, I found React and Gulp to be a great substitute for Jekyll. The process of writing custom Gulp plugins was not as accessible as I'd hoped, and I would encourage the community to develop a stripped-down alternative API for creating plugins.

The best thing about the project was taking ownership of every aspect of the site. I loved using Ghost and Jekyll in the past, but with so much happening "under the hood" there is a reduced incentive for customisation. The recent trend for creating numerous small modules does a lot to break down the walls of what would otherwise be an opaque box.

I hope this trend continues, especially if it is matched by the community making tooling intuitive and accessible.

The complete source code for my site is available on GitHub.

2015 in Review

27th December 2015

I've never written an annual review before, but because this year has been a turnaround year for me, and because I've been inspired by Chad's writing, now seems like a good time to start.

Cool Things

In February, First Play Sheffield raised £500 for Special Effect by taking part in GameBlast, a twenty-four hour videogame marathon. I almost completed a full run of Ocarina of Time.

My album of the year and film of the year are both space-related, being The Race for Space and The Martian. Both are love letters to the golden age of space exploration and capture all of the drama and euphoria.

I started diving lessons at Ponds Forge. It was a hobby that I'd dipped my toe into when I was young, and I have always wanted to take it seriously. Diving is unique; the excitement and focus shakes a difficult week of work out of you, and it's the best thing that has happened to me in a long time.

The Lead Experiment

I've been trying out the lead developer role in 2014 and 2015. I've helmed two projects, and I'm very proud to have guided them to completion and to have worked with such talented, hard-working people.

I've enjoyed many of the challenges that the lead position brings. We replaced a clumsy desktop / mobile site with a single responsive site, simplifying the technology while introducing a crisp, modern design. We also introduced Sass, Grunt, Patternlab and SVG graphics. The key to the success of the project was a close relationship between designers and developers, involving both in planning meetings and pairing up to make sure the design worked as well as it looked.

Staying on top of such large changes with a geographically divided team was hard work, and although planning everything was an interesting experience, I regretted spending such a long time away from programming. I was very unhappy with the direction my career was taking, and losing out in several interviews for development jobs I wanted very much because of a lack of recent experience hurt.

Back to the Old Ways

I left behind many good friends at Inviqa and took a full-time JavaScript role at Sky Betting and Gaming. I've since been working on a cutting-edge React app, with lots of new ideas such as Flux, functional programming and immutable data structures. The last six months have made me realise how rusty I'd become, and how much I enjoyed writing code again.

I've also had two pull requests merged into Ghost, and I'm working on another for Chrome DevTools. The warmth these communities have shown me has convinced me to increase participation in community development in 2016. This will be at the expense of solo efforts like Parker, but I think I will be a lot happier being part of something bigger.

Part of Something Big

The highlight of 2015 has been visiting and speaking at some amazing events (in rough chronological order):

  • Leeds JS*
  • Sheffield JS*
  • Scotland JS*
  • Up Front
  • Manchester JS*
  • Manchester Girl Geeks Barcamp **
  • McrFRED*
  • Reject JS **
  • JS Conf EU
  • CSS Conf EU
  • Frontend London*

* Speaking ** Speaking (lightning talk)

The community continues to be the most fulfilling thing about being a developer. There are many cool people out there who are doing some incredible work. It makes me proud to have spoken at some of those events.

My hope for 2016 is to write more articles, write more talks, and write more pull requests.

Merry Christmas and Happy New Year.

A New website

26th July 2015

Since last year, I have been taking a keen interest in Ghost, the new blogging platform built on Node.js. I've been fortunate to contribute to the project, and I'm planning on continuing to donate some of my time and experience.

Migrating my website to Ghost is a great way of getting to know it a bit better, as well as providing an opportunity to write extensions and supporting tools.

At the same time, an audit of my old website showed it was significantly behind the times. An early attempt at building a high-dpi design did not go as well as planned. Here are the figures:

Total requests: 18
Total requests size: 1.6MB
Time to load (fiber): 582ms

Ouch. Not good.

Since then, I've learned how great SVGs are to work with, so I decided a ground-up redesign with Adobe Illustrator was necessary.

SVGs

Smaller filesizes and great looking images on high-dpi displays make SVGs a good choice for developers.

More than this, I found Adobe Illustrator a surprisingly good tool for creating graphics for the web. Vector graphics packages are good at creating the clean lines and bold shapes that are common on the web. Being able to adjust vector shapes is really useful when tweaking and optimising designs, which is an important part of designing great web sites.

Data URI Images

Embedding images into stylesheets reduces the number of HTTP requests, meaning faster loading sites. The smaller size of SVG images means embedding them inside stylesheets places a smaller burden on their overall filesize.
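For example, a small SVG can be embedded in a stylesheet directly (an illustrative fragment; real icons would be larger and fully URL-encoded):

```
.bullet {
  background-image: url("data:image/svg+xml;utf8,<svg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 10 10'><circle cx='5' cy='5' r='4' fill='%23333'/></svg>");
}
```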

I've also chosen to make the main heading an SVG image instead of adding an extra request to a web font for just a single element.

Responsive Web Design

I've paid extra-special attention to how the design responds to mobile devices. On medium devices, the decoration becomes static to make better use of the space. On mobiles, the decorative elements are hidden and the content and typography take precedence.

I've been using Instapaper to save articles for later reading, and love the focused reading experience it gives. It was this experience I wanted to give all readers of my site on mobiles.

Ghost

Ghost makes a change from my previous platform of choice. I liked the simplicity of Jekyll, particularly its support for generating pages from Markdown source files.

My choice of Ghost is primarily made to support my contributions to the platform, but its support for Markdown authoring leaves the door open for importing content from files or services in the future.

Platform

On top of all of this, I made a significant effort to completely re-platform my site.

I decided to use Ansible to provision a Digital Ocean droplet. Using automated provisioning gives a useful interface for making incremental changes to your setup, which is especially useful when you have a lot to learn about devops like I have.
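To give a flavour of what a playbook step looks like (an illustrative fragment, not my actual playbooks; yum and service are Ansible's standard modules):

```
# Provision a CentOS droplet with nginx (illustrative fragment)
- hosts: webservers
  become: true
  tasks:
    - name: Install nginx
      yum:
        name: nginx
        state: present
    - name: Start nginx and enable it at boot
      service:
        name: nginx
        state: started
        enabled: true
```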

I'm using the combination of CentOS and nginx for the base OS and web server. I've been working with both for four years, and both have proven to be dependable and easy to work with for those with limited time to spend.

Performance

Here's what impact the improvements have made to the performance:

Total requests: 8
Total requests size: 109KB
Time to load (fiber): 445ms

Summary

I'm pleased with the improvement in performance, and happy to finally put the site to work. The re-platforming took longer than expected but paid off with a set of Ansible playbooks I can re-use and build on. Deciding when a project is done is always a problem, and this site is not done. But it is ready, and I am pleased to show not only a website that looks good but also works well.

A-Level Computing students using BBC Micros to learn computer programming

20th February 2013

An interesting report on BBC News on A-Level Computing students using BBC Micros to learn computer programming.

I think BBC Micros made excellent computers to learn to program on because learning to program was always a central function of the product. The relative simplicity of contemporary computer game software and a well-documented BASIC interpreter were also important features.

It is these qualities that remind me of Google Android, Apple's iOS and the Web. These platforms seem to carry on the tradition of the bedroom coder: going beyond owning a product to learn how it works and to write your own software. Devices like these are often fondly remembered by their owners and leave an important legacy. The BBC Micro was that device for me.

G4 Cube Webserver

11th July 2011

I have been making websites for a long time. I have never hosted any of them, until this year.

My employer’s hardware cupboard is an Aladdin’s cave of Apple hardware. When it was last cleaned out, I was lucky enough to adopt an Apple Power Mac G4 Cube. It would’ve been beautiful new, but the reality distortion field has no kind effect on ageing hardware. The transparent, acrylic case was cracked, but it still started up.

I’d had the best intentions of operating my own web server for years. What had stopped me was the lack of support for static IP addresses in consumer broadband packages and the lack of appropriate hardware. Now that O2 support static IP addresses in their consumer broadband packages, I had a chance. The Cube’s passively-cooled hardware, famously dispersing its heat by the convection of air through a vent that runs from the bottom of the system to the top, immediately appealed to me.

The Cube was old, though such machines can thrive as webservers. HTTP servers have been written for many platforms, including many aging systems and formats. The Cube wasn’t going to be the oldest webserver in the world by a long way. The C64 webserver is an example, although probably not the most archaic.

Its age did create a few problems. It came with a vintage install of Mac OS X 10.3 Panther. Panther felt surprisingly zippy and responsive when it booted, although it was too old to run MacPorts, Mac OS X's popular open-source ports project. To install MacPorts, I would need OS X 10.4 Tiger as a minimum.

A copy of Tiger wasn't hard to find. Once installed, the 2005-vintage operating system was thankfully responsive and stable. After installing MacPorts, it was simply a case of invoking the installation of PHP5, MySQL and Apache through the command-line interface.

It was here that the Cube really began to show its age. In total, it took the Cube the better part of twelve hours to compile and install the PHP/MySQL/Apache stack. I'm also feeling pinched for space – my Cube has a noisy 20GB hard disk drive, and the system files and applications account for around 10GB alone.

That's plenty for my needs, though. Now I have a web server of my own that I can configure and operate to my heart's content. I get good uptime and administrator access, and there are no competing virtually-hosted sites draining performance.

If you've got a dusty piece of computer hardware in the back of a cupboard, I highly recommend trying to make your own webserver. You can find the guide that I used here. MacPorts makes it easy. I daresay Apt is very friendly too. You'll get so much value out of computers that up until yesterday did nothing.
