How to pass variables to a Docker container when building a Node app

Environment variables are declared with the ENV instruction and are referenced in the Dockerfile as either $VARIABLE_NAME or ${VARIABLE_NAME}.

Passing variables at build-time

The ENV instruction sets an environment variable to a given value. Environment variables set using ENV persist when a container is run from the resulting image. For example:

FROM node:9

ENV PORT 3000
ENV NODE_ENV development

The Dockerfile also allows you to specify arguments at build-time. The ARG instruction defines a variable that users can pass to the builder:

FROM node:9

ARG PORT
ARG NODE_ENV

When building a Docker image from the command line, you can set those values using --build-arg:

 docker build --tag webapp --build-arg PORT=3000 --build-arg NODE_ENV=development .

Executing commands using the shell

And here is the secret ingredient. Because the $NODE_ENV variable is set at build-time, you can use the shell to decide which npm script to run:

FROM node:9 

ARG PORT 
ARG NODE_ENV 

ENV PORT $PORT 
ENV NODE_ENV $NODE_ENV

RUN mkdir -p /usr/app
WORKDIR /usr/app
ADD . .

RUN npm install
RUN /bin/bash -c '[[ "${NODE_ENV}" == "production" ]] && npm run build:prod || npm run build:dev'

EXPOSE $PORT

CMD ["npm", "run", "start"]

Finally, you expose the port number and start the HTTP server.
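The article doesn't show the Node side of this, but the values end up in process.env inside the running container. Here is a minimal sketch of what the entry point behind npm run start could look like (the server.js file name and the JSON response are assumptions, not part of the original setup):

// server.js (hypothetical entry point for "npm run start")
const http = require('http');

// ENV values baked into the image (or overridden with `docker run -e`)
// are exposed to Node through process.env
const port = process.env.PORT || 3000;
const env = process.env.NODE_ENV || 'development';

const server = http.createServer((req, res) => {
    res.writeHead(200, {'Content-Type': 'application/json'});
    res.end(JSON.stringify({status: 'ok', env}));
});

server.listen(port, () => {
    console.log(`Server listening on port ${port} (${env})`);
});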

That’s it! Thanks for reading and happy Dockering :)

How to create a Data Container Component in React

One pattern I’ve used quite a lot while working with React at the BBC and Discovery Channel is the Data Container pattern. It became popular in the last couple of years thanks to libraries like Redux and React Komposer.


The idea is simple. When you build UI components in React you feed data into them via containers. Inside those containers you may need to access different data sources, filter data, handle errors, etc. So data containers help you build data-driven components and separate them into two categories: Data components and Presentational components.

  • A Presentational component is mainly concerned with the view; it doesn’t specify how the data is loaded or mutated, and it receives data and callbacks exclusively via props.
  • A Data component talks to the data sources and provides the data and behaviour to the Presentational component. It’s usually generated using a higher-order function, such as connect() or createContainer().

There are two ways to implement this pattern, using either inheritance or composition:

  1. Inheritance: a React component class extends a Data Container component class.
  2. Composition: a React component is injected into the Data Container (React Komposer uses this approach).

I recommend composition over inheritance as a design principle because it gives you more flexibility.

Code Example

Let’s say you want to display a list of notifications and you have two components:
NotificationsContainer and NotificationsList

First you need to fetch the data and add it to the NotificationsContainer:

import React, {createElement} from 'react';
import PropTypes from 'prop-types';
import https from 'https';
import DataStore from '/path/to/DataStore';

export default function createContainer(SubComponent, subComponentProps) {

    class DataContainer extends React.Component {

        constructor(props) {
            super(props);

            this.name = props.name;
            this.dataSourceUrl = props.dataSourceUrl;
            
            this.state = {
                data: null,
                error: null
            };
        }

        componentDidMount() {
            this.setInitialData();
        }

        setInitialData() {
            if (DataStore.hasData(this.name)) {
                this.setState({
                    data: DataStore.getData(this.name)
                });
            } else {
                this.fetchData();
            }
        }

        fetchData() {
            https.get(this.dataSourceUrl, res => {
                let chunkedData = '';

                res.on('data', data => {
                    chunkedData += data;
                });

                res.on('end', () => {
                    // the endpoint is expected to return JSON, e.g. {items: [...]}
                    this.setState({
                        data: JSON.parse(chunkedData)
                    });
                });
            }).on('error', error => {
                // request errors (e.g. connection failures) are emitted on the
                // request object, not on the response
                this.setState({error});
            });
        }

        render() {
            return createElement(
                SubComponent,
                Object.assign({}, subComponentProps, this.state)
            );
        }
    }

    DataContainer.propTypes = {
        name: PropTypes.string,
        dataSourceUrl: PropTypes.string
    };

    return DataContainer;
}
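
The container above imports DataStore from an unspecified path and only relies on its hasData() and getData() methods. The article doesn’t show that module, but a minimal sketch, assuming a simple in-memory cache keyed by container name, could look like this:

// DataStore.js (sketch only, not the article's actual implementation)
const store = new Map();

export default {
    hasData: (name) => store.has(name),
    getData: (name) => store.get(name)
};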

Then you need to create a NotificationsList component that receives the data as a prop:

import React from 'react';
import PropTypes from 'prop-types';
import NotificationListItem from './NotificationListItem';

class NotificationsList extends React.Component {

    constructor(props) {
        super(props);
    }

    render() {
        const listItems = (this.props.data && this.props.data.items) || [];

        return (
            <ul>
                {listItems.map((item, index) => {
                    return <NotificationListItem key={index} item={item} index={index} />;
                })}
            </ul>
        );
    }
}

NotificationsList.propTypes = {
    data: PropTypes.object,
    error: PropTypes.object
};

export default NotificationsList;
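
The list renders a NotificationListItem component that isn’t defined anywhere in the article. A minimal sketch could look like the following (the ./NotificationListItem path and the item.message field are assumptions about the file layout and payload shape):

import React from 'react';
import PropTypes from 'prop-types';

// Renders a single notification (sketch only)
export default function NotificationListItem({item}) {
    return <li>{item.message}</li>;
}

NotificationListItem.propTypes = {
    item: PropTypes.object
};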

And, finally, you need to create and render the data container:

import React from 'react';
import NotificationsList from './NotificationsList';
import createContainer from './createContainer';

// Create the container once, outside render, so a new component type isn't
// created (and the list remounted) on every render
const NotificationsContainer = createContainer(
    NotificationsList, {
        propName: 'propValue'
    }
);

export default class HomePage extends React.Component {

    render() {
        return (
            <NotificationsContainer
                dataSourceUrl="/api/notifications/list"
                name="notifications" />
        );
    }
}

If you are looking for something a bit more advanced, similar to what I was using at the BBC, then check out this nice little project called Second. Or, if you are building a more complex app and need to manage state or map components to multiple containers, then you should consider using Redux. Here’s a great presentation about React/Redux.

For those using React 16.3, keep an eye on the following projects: react-waterfall and unistore. They are data stores built on top of the new Context API.

If you don’t want to miss any of my articles, follow me on Twitter: @fedecarg

Website performance monitoring tool

Performance monitoring systems let you track changes to your front-end code base over time, catching regressions and measuring the ongoing effect of any optimisation work. Easy-to-use dashboards are a must when it comes to monitoring the state of your web apps. Companies like Calibre or SpeedCurve offer this as a professional service, but not everyone can afford them.

Meet SpeedTracker

SpeedTracker is an open-source (MIT licensed), self-hosted website performance monitoring tool developed by Eduardo Bouças. It runs on top of WebPageTest, periodically tests your website’s performance, and visualises how the various metrics evolve over time.

SpeedTracker provides clean charts and graphs that can help you identify possible problem areas.


Check out the demo here: https://bbc.github.io/iplayer-web-speedtracker/

WebPageTest is an incredibly useful resource for any web developer, but the information it provides becomes much more powerful when collected regularly rather than at isolated events. Web application monitoring is not just about detecting downtime; it also gives you insight into performance trends during peak load times, as well as by time of day and day of the week.


For me, the best thing about SpeedTracker is that it runs on your own GitHub repository: data from WebPageTest is pushed to the repository and can be served from GitHub Pages, public or private, with HTTPS baked in for free.

SpeedTracker also allows you to define performance budgets for any metric you want to monitor and to receive alerts when a budget is overrun, either by e-mail or with a message on Slack.

For instructions on how to install this tool, visit the following GitHub repo: https://github.com/speedtracker/speedtracker

 

Node.js: How to mock the imports of an ES6 module

The package mock-require is useful if you want to mock require statements in Node.js. It has a simple API that allows you to mock anything, from a single exported function to a standard library. Here’s an example:

app/config.js

function init() {
    // ...
}

module.exports = { init };

app/services/content.js

import config from '../../config.js';

function load() {
    // ...
}

module.exports = { load };

test/services/content_spec.js

import {assert} from 'chai';
import sinon from 'sinon';
import mockRequire from 'mock-require';

describe('My module', () => {

    let module; // module under test
    let configMock;

    beforeEach(() => {
        configMock = {
            init: sinon.stub().returns("foo")
        };

        // mock es6 import (tip: use the same import path)
        mockRequire("../../config.js", configMock);

        // require es6 module
        module = require("../../app/services/content.js");
    });

    afterEach(() => {
        // remove all registered mocks
        mockRequire.stopAll();
    });

    describe('Initialisation', () => {

        it('should have a load function', () => {
            assert.isFunction(module.load);
        });

    });

});

API Development Tips

Organisations that are paying attention already know they need to have an open web API, and many already have one under development or in the wild. Make sure you haven’t been caught out by the pitfalls of many early API releases.

Multiple points of failure

  • Back-end systems: db servers/caches, hardware failures, etc.
  • Interconnections: router failures, bad cables, etc.
  • External Dependencies: fail whales, random cloud latency, etc.

The 5 tips

  1. Test it all
    • Unit tests are not enough; they are just the beginning.
    • Test what users experience. Perform end-to-end black box tests.
    • Replay your access logs. Very accurate.
    • Validate return payloads. A stack trace is not valid XML.
  2. Plan for future versions
    • Versions are not sexy/semantic (but do it anyway).
    • Announce versions often.
  3. Embrace standards
    • APIs are better when predictable.
    • Standard approaches mean tools.
    • Avoid uncomfortable migrations. No one wants an OAuthpocalypse.
  4. Monitor everything & be honest
    • Trends are your friend.
    • Users are not your early-warning ops team.
    • Be open and honest, or your users will tweet that your API sucks!
  5. Fail well
    • Well-formed errors win friends and make users more tolerant of failure (see the example after this list).
    • Make monitoring easy.
    • Don’t punish everyone. Determine who gets hurt most by failures.

Watch the video here: Understanding API Activity by Clay Loveless

The Little Manual of API Design

This manual gathers together the key insights into API design that were discovered through many years of software development on the Qt application development framework at Trolltech (now part of Nokia). When designing and implementing a library, you should also keep other factors in mind, such as efficiency and ease of implementation, in addition to pure API considerations. And although the focus is on public APIs, there is no harm in applying the principles described here when writing application code or internal library code.

The Little Manual of API Design (PDF)