Documentation

Here you can find all the information you’ll need to create and configure your web application.

This document is also available as a standalone markdown manual.

We prefer yarn because it is faster and more deterministic than npm. Install it if you need to:

npm i -g yarn

Install the command line interface and the project generator templates:

yarn global add makestatic \
  generator-makestatic

If you don’t already have yeoman you’ll want it to scaffold a project:

yarn global add yo

Run yo passing it the generator name makestatic and the name of your project (in this case webapp):

yo makestatic webapp

Complete the prompts with your project information and your new application will be created in the webapp directory.

The makestatic cli will read the project configuration from app.js in the current working directory. When an environment is specified the configuration in app.<environment>.js is merged with app.js.
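
For example, an option shared across environments can live in app.js while an environment file overrides it for a specific build; the fields below are documented options but the values are purely illustrative:

    // app.js – shared configuration
    module.exports = {
      input: 'src',
      output: 'public'
    }

    // app.production.js – merged over app.js for --env production
    module.exports = {
      manifest: true
    }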

To compile a project change into the project directory and run makestatic:

cd webapp
makestatic

Your compiled web application is now in the public directory. For an optimized build use the production configuration:

makestatic --env production

To watch the source files with browsersync enabled use the --watch option:

makestatic -w

Once you have installed the makestatic package globally you can get more information on the command line interface with man makestatic or via the program help: makestatic -h.

The configuration object is a webpack configuration object with some additional fields, so it accepts all webpack options with one notable exception: the output key should point to a directory.
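
For example, a minimal configuration illustrating that exception:

    module.exports = {
      // unlike a stock webpack configuration, output is
      // simply a directory path
      output: 'public'
    }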

These are the fields specific to makestatic.

input
String source directory.
output
String build directory.
ignore
Array list of regexp or glob patterns to ignore.
matchers
Object map of file extension patterns.
test
Object map of loader test regexp patterns.
markup
Object|Function loader options for HTML templates.
styles
Object|Function loader options for CSS processing.
script
Object|Function loader options for Javascript transpiling.
lifecycle
Object plugin configuration for the optimization lifecycle.
server
Object options for browsersync.
static
Boolean disable browsersync snippet.
manifest
Boolean generate a manifest.json file.
gzip
Boolean generate static gzip files.
deploy
Object deployment configuration.
clean
Array list of glob patterns used when cleaning the build.
url
String indicate the deployment URL.

When using the command line interface some configuration options may be overridden; see makestatic -h for more detail.

The input option points to the directory where source files are read from. These files are inserted into the webpack pipeline and processed using the module rules.

The output option points to the directory where compiled files are written. This option takes precedence over the webpack configuration as the plugin needs to explicitly set the output object for predictable file names.

The ignore option is an array that contains regular expressions (or glob patterns) indicating files that should not be included in processing. Use this when you have files in the input directory you want to completely exclude from being processed.

The default configuration will ignore all files beginning with an underscore which is a useful convention to exclude files without modifying the default ignore patterns. See excluding files for more information.
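
For example, a sketch that keeps the underscore convention and additionally excludes design sources (the .psd pattern is purely illustrative):

    module.exports = {
      ignore: [
        // the default underscore convention
        '**/_*',
        // also exclude photoshop sources
        /\.psd$/
      ]
    }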

The matchers option is a map of file extensions to patterns that is used to change the file extension for source files. When a pattern matches a file in the input directory its file extension is changed to the corresponding key.

For example the default configuration renames files with a .sss extension to .css and files with a .sgr extension to .html.

If you have modified the module loader rules to use different processors you will need to override the matchers.
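
As a hypothetical sketch, if you swapped the template loader for one that handles .pug sources you would remap the html matcher to suit (this assumes you have configured a corresponding loader in the module rules):

    module.exports = {
      matchers: {
        // rename .pug sources to .html in the output
        html: /\.pug$/
      }
    }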

The test option is a map of keys to regexp patterns used when the webpack plugin creates the default loaders. It is provided as a convenience if you want to use different file extensions or a different folder name for vendor static assets.

    module.exports = {
      // pattern for vendor assets
      vendor: /vendor\//,
      // pattern for sugarss files
      css: /\.sss$/,
      // pattern for javascript sources
      js: /\.jsx?$/,
      // pattern for reshapeml files
      html: /\.sgr$/,
      // pattern for static assets
      static: /\.(?!(jsx?|sss|sgr)).*$/
    }

The markup option configures the loader options for HTML templates; by default it is passed to the reshape-loader.

The styles option configures the loader options for CSS processing; by default it is passed to the postcss-loader.

The script option configures the loader options for Javascript transpiling; by default it is passed to the babel-loader.
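
For example, this website's own configuration passes babel presets using the script option:

    module.exports = {
      // options passed to babel-loader
      script: {
        presets: ['env']
      }
    }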

The lifecycle option is a map that is used to configure the plugins used during processing and the optimization phase. See optimization for more details.

The server object allows you to customize the browsersync options.

The static option disables the browsersync snippet so that the server acts as a simple static file server. This is useful when you want to inspect the network requests without the browsersync activity.
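
A sketch combining both options; port and notify are standard browsersync options (the default server options are listed later in this document):

    module.exports = {
      // browsersync options
      server: {
        port: 1111,
        notify: false
      },
      // serve files without the browsersync snippet
      static: true
    }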

When the manifest option is enabled a file named manifest.json is written to the output directory containing information about the generated output files.

If the gzip option is set static .gz files are created for each output file.

Most servers nowadays will generate gzip files in memory; however there may still be occasions when it is useful to have static gzip files, for example nginx with gzip_static switched on.
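
Both options are simple booleans:

    module.exports = {
      // write manifest.json describing the output files
      manifest: true,
      // emit static .gz files alongside each output file
      gzip: true
    }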

The deploy option is an object that configures deployment providers per environment; see deployment for more details.

The clean option gives you fine-grained control over which files are removed when the clean plugin executes. By default it will remove the entire output directory, but you can set this to an array of glob patterns to indicate which files should be removed. Usually you should not need to set this option, however sometimes you may want to also remove files outside of the output directory, or you may be working with large files; see the handling large files section for more information.

The url option is used to allow plugins to build absolute URL references to deployed pages. For example the sitemap plugin needs absolute URLs when building sitemap files.

There are often files in the source directory that you want to exclude from processing. You can either modify the ignore array or use the convention of prefixing a file name with an underscore.

In larger applications you may want to separate your templates into partials so that you can re-use them from different pages. By default all template files would be compiled to corresponding HTML files, but you may wish to ignore the partial files; to do so prefix the file name with an underscore.

The same convention can be used for stylesheet source files when you want to group styles into separate files and @import them from a primary stylesheet.
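
A sketch of that convention; the partial names are illustrative:

    // src/css/style.sss is compiled to css/style.css,
    // the underscore partials are not compiled individually
    @import '_variables.sss'
    @import '_layout.sss'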

Webpack requires at least one entry point, so if none is provided the following default is created:

    module.exports = {
      entry: {
        'js/main.js': ['./js/index.js']
      }
    }

This will treat js/index.js (relative to the input directory) as your entry point and write the compiled output to js/main.js (relative to the output directory).

You can change this to suit your project layout; just remember that the values must be an array and the source path should begin with ./ otherwise you will likely encounter webpack errors.
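
For example, to use a different source file and bundle name (the paths shown are illustrative):

    module.exports = {
      entry: {
        // output file (relative to output) : sources (relative to input)
        'js/app.js': ['./js/app.js']
      }
    }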

By default the plugin that handles compiling using webpack will ignore all javascript files other than the entry point. This is by design so that you don’t have to prefix your javascript modules with an underscore to exclude them from being compiled individually.

However there are occasions when you want to include some javascript files but bypass compilation, typically when you want to drop in a third-party library and load it using a script tag. To do this place them in a vendor folder and they will be treated as static assets, which means they won't be compiled by webpack but will be copied to the output directory and compressed during the optimization process.
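
For example, a library dropped in at vendor/analytics.js (a hypothetical file) is copied verbatim and can be referenced from a template with a plain script tag:

    script(src='/vendor/analytics.js')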

If you want to add plugins to the webpack build you can assign them to the plugins array in the configuration file.

    class WebpackPlugin {
      apply (compiler) {
        // configure webpack plugin functions
      }
    }

    module.exports = {
      plugins: [new WebpackPlugin()]
    }

For full control of the webpack module rules you can assign loaders to the module configuration.

This is particularly useful if you want to use a different CSS preprocessor or HTML templating library.

    module.exports = {
      module: {
        rules: [
          {
            test: /.*/,
            use: [
              {loader: 'sources-loader'}
            ]
          }
        ]
      }
    }

Templates for HTML documents are configured to use reshape with sugarml by default; sugarml is a terse language with a syntax similar to pug.

We chose reshape because it is easily extensible using plugins and generates an intermediary AST for processing.

Because reshape has been designed with an HTML parser as the default you can also include normal HTML documents in your templates, which can be useful.

To save you some effort we recommend you use the standard html plugins:

    module.exports = {
      markup: () => {
        // standard html plugin configuration
        const std = require('makestatic-html-standard')
        // helper to generate a page identifier for CSS scoping
        const id = require('makestatic-page-id')
        return std(
          {
            markdown: {
              plugins: [/* configure markdown plugins */]
            },
            // local variables you want to expose to templates
            locals: (ctx, options) => {
              return {
                pageId: id(ctx, options)
              }
            }
          }
        )
      }
    }

Here we present a brief description of the language features, see the sugarml documentation for more information.

Typically you would want an HTML5 document type:

    doctype html

Elements are nested using whitespace indentation:

    ul
      li Item 1
      li Item 2

Attributes are declared in parentheses:

    a(href="https://makestatic.ws" title="Makestatic")

Use the dot syntax to add class names to an element:

    p.footnote.muted

Use a hash symbol to specify an id attribute:

    div#features

You can chain identifiers and classes:

    div#main.features

Use the pipe character to intermingle text and inline elements or to break long text onto multiple lines:

    p
      | Paragraph with some
      strong bold text
      | followed by more text.

If you prefer not to use the pipe character for long sections of text add a trailing period to the tag name:

    script.
      const hash = document.location.hash
      if (hash) {
        // do something
      }

You can include other templates, markdown documents and HTML files:

    include(src='_partial.sgr')
    include(src='_document.md')
    include(src='_page.html')

You can share common page elements using layouts and blocks. An example layout.sgr file:

    doctype html
    html(lang='en')
      head
        block(name='meta')
          meta(charset='utf-8')
          link(rel='canonical' href='https://example.com')
        block(name='title')
          title Example Title
        block(name='stylesheets')
          link(rel='stylesheet' href='/css/style.css')
      body
        block(name='header')
          include(src='_header.sgr')
        block(name='content')
        block(name='footer')
          include(src='_footer.sgr')

Then in your template pages you can extend the layout. An example index.sgr file:

    extends(src='layout.sgr')
      block(name='title')
        title Page Title
      block(name='content')
        h2 Heading
        p Page content.

Markdown documents can be included and you can inline markdown in your templates. To include a markdown document use the normal include syntax with a file that has a .md extension:

    include(src='document.md')

The content of the markdown document is converted to HTML using markdown-it and then parsed into the reshape AST.

You can write inline markdown content by adding an mdi attribute to the element:

    p(mdi) Markdown is [commonmark](http://commonmark.org/) compliant

If you want the markdown to contain an outer p element you can use the md attribute:

    section(md) Markdown is [commonmark](http://commonmark.org/) compliant

The optimize preset is enabled for the stage and production environments by default.

It will compress all HTML, CSS and Javascript files using html minifier, clean css and uglify js. It will also optimize png, jpg, gif and svg images using imagemin. Note that html minifier will compress inline CSS and Javascript so you will get an extremely optimized build.

See the production configuration for an example of configuring the optimize lifecycle phase.

The options you will most likely want to modify are shown below; see the optimize preset for all available options.

    const optimize = require('makestatic-preset-optimize')
    module.exports = {
      lifecycle: optimize({
        html:   {/* options for html-minifier */},
        css:    {/* options for clean-css */},
        js:     {/* options for uglify-js */},
        image:  {/* options for imagemin */}
      })
    }

To create a more secure application it is recommended you configure a content security policy.

If you are loading resources from third-party servers such as CDNs or analytics we recommend enabling the SRI plugin.

To enable a content security policy for your application create a csp.txt file containing the default policy and enable the CSP parser plugin.

You can use the csp generator to scaffold a default csp.txt file for an existing application:

yo makestatic:csp src

This will write a csp.txt file to the src directory. Content security policy text files are marked as transient and will never be written to disc.

Note that you don’t have to quote content security policy keywords in the text file; the CSP parser will do that for you.
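
A hypothetical csp.txt sketch written in the unquoted style; consult the CSP parser documentation for the exact file format:

    default-src none
    script-src self
    style-src self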

Once you have a default policy file you should enable the CSP parser for the parse phase. You will need to have enabled parsing of HTML files, so we recommend using it in conjunction with the parse preset.

    const parse = require('makestatic-preset-parse')
    module.exports = {
      lifecycle: {
        parse: parse().concat(require('makestatic-parse-csp'))
      }
    }

The next step is to use either the CSP plugin or the CSP SHA plugin to create meta elements containing the content security policy and generate policies for inline elements.

If you want to use the nonce algorithm for inline elements use the CSP plugin otherwise if you prefer using SHA checksums choose the CSP SHA plugin.

When using the CSP plugin you should enable it for the transform phase:

    const parse = require('makestatic-preset-parse')
    module.exports = {
      lifecycle: {
        parse: parse().concat(require('makestatic-parse-csp')),
        transform: [
          {
            plugin: require('makestatic-csp'),
            options: {
              styles: true,
              scripts: true
            }
          }
        ]
      }
    }

This will modify the default policy to include nonce values for all inline styles and scripts and add nonce attributes to each inline element. It will also generate a <meta http-equiv="Content-Security-Policy"> element and prepend it to the head of each HTML document.

If you prefer to use the SHA algorithm you can enable the CSP SHA plugin for the emit phase:

    const parse = require('makestatic-preset-parse')
    module.exports = {
      lifecycle: {
        parse: parse().concat(require('makestatic-parse-csp')),
        emit: [
          {
            plugin: require('makestatic-csp-sha'),
            options: {
              styles: true,
              scripts: true
            }
          }
        ]
      }
    }

This plugin must be configured for the emit phase because the generated SHA checksums must operate on the final content of the inline elements; it is essential that any optimize plugins have already run before the checksums are generated.

You can configure different CSP policies for different pages by using the rules option, see the plugin documentation for more details.

The SRI plugin adds the crossorigin and integrity attributes to stylesheets and scripts loaded from third-party domains.

This provides additional checks by the browser that the resources have not been tampered with. See the SRI Specification for more information and SRI plugin documentation to learn how to configure the plugin.

This section covers deploying the application to a configured deployment provider.

The general syntax to configure a deployment is to add a deploy object with an environment and a deployment provider. For example, to configure the stage environment to deploy to amazon s3:

    module.exports = {
      deploy: {
        stage: {
          s3: {
            domain: 'website.com',
            credentials: {
              profile: 'default'
            }
          }
        }
      }
    }

Deployment provider modules are resolved relative to your project so you need to install the providers you wish to use.

npm i makestatic-deploy-s3 --save

You can then clean, build and deploy the application with:

makestatic --env stage --provider s3

If you have an existing build and don’t want to build before deployment you can perform a deployment by itself using the --deploy flag:

makestatic --env stage --provider s3 -d

We recommend deploying your application to amazon s3 as it is extremely reliable and fast. In combination with a cloudfront distribution and object compression, page speeds are amazing and it is easy to configure an SSL certificate at no additional cost.

To configure a deployment for s3 all you need to do is specify a domain and provide authentication credentials. The domain will become the name of the bucket and the credentials profile is used to authenticate using credentials stored in ~/.aws/credentials.

    module.exports = {
      deploy: {
        production: {
          s3: {
            domain: 'example.com',
            credentials: {
              profile: 'example'
            }
          }
        }
      }
    }

Before taking a close look at all the configuration options it is worth knowing what happens when you perform a deployment.

The deployment provider uses a diff of the local and remote files to determine the actions that need to be taken, which means that repeat deployments will only upload the files that are new or have changed, and will delete remote files that no longer exist locally.

This section describes each of the configuration properties.

The domain field is required and becomes the bucket where your files will be uploaded.

The credentials object must include the name of a profile that is used for authentication. The actual credentials are stored in the INI formatted file located at ~/.aws/credentials, for example given the following entry:

    [example]
    aws_access_key_id = xxxxxxxxxx
    aws_secret_access_key = xxxxxxxxxx

You can use the example credentials by setting the profile:

    module.exports = {
      deploy: {
        production: {
          s3: {
            domain: 'example.com',
            credentials: {
              profile: 'example'
            }
          }
        }
      }
    }

The index field specifies the name of the file to use for directory requests. The default value is index.html.

The error field specifies the name of an error document to use, typically for handling 404 requests; the default value is undefined. You may wish to set this to 404.html for pretty page not found errors, however be aware that if you are using the prefix option you should also include the prefix here, for example production/404.html.

The region field specifies the region to use when creating bucket(s). The default value is us-east-1.
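
A sketch combining these fields; the region value is illustrative:

    module.exports = {
      deploy: {
        production: {
          s3: {
            domain: 'example.com',
            credentials: {
              profile: 'example'
            },
            index: 'index.html',
            error: '404.html',
            region: 'eu-west-1'
          }
        }
      }
    }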

The prefix option sets a key prefix for bucket objects and is particularly useful as it provides a mechanism for versioning and environment segregation. We recommend using this to prefix per environment and then configuring cloudfront with the origin path option.

So if we want to separate our stage and production environments, the app.stage.js configuration would contain:

    module.exports = {
      deploy: {
        stage: {
          s3: {
            domain: 'example.com',
            credentials: {
              profile: 'example'
            },
            prefix: 'stage',
            error: 'stage/404.html'
          }
        }
      }
    }

And the app.production.js file would use a different prefix:

    module.exports = {
      deploy: {
        production: {
          s3: {
            domain: 'example.com',
            credentials: {
              profile: 'example'
            },
            prefix: 'production',
            error: 'production/404.html'
          }
        }
      }
    }

Notice that we use the same domain in this instance because you would use two cloudfront distributions with origin paths set to /production and /stage, and then configure the DNS for example.com and stage.example.com to point to the cloudfront distributions.

If you are not using cloudfront you may find it easier to just use different domains without the prefix option and configure CNAME records for example.com and stage.example.com.

You can use the cors array option to configure cross origin resource sharing rules. See the cors documentation for more information.
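
As an assumption, the sketch below presumes the rules are passed through in the shape expected by the s3 SDK; verify against the cors documentation before using it:

    module.exports = {
      deploy: {
        production: {
          s3: {
            domain: 'example.com',
            credentials: {
              profile: 'example'
            },
            // assumed to follow the s3 SDK CORS rule shape
            cors: [
              {
                AllowedMethods: ['GET'],
                AllowedOrigins: ['*'],
                MaxAgeSeconds: 3600
              }
            ]
          }
        }
      }
    }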

The policy option is a boolean that determines whether the primary bucket (domain) is given a bucket policy that allows public read access. The default value is true. If you are using a cloudfront distribution you may want to disable this to prevent people from accessing the bucket contents using the s3 endpoint. This is particularly useful if you have a strict requirement to enforce access via SSL.

Often you may want to serve content from multiple domains in which case it is convenient to redirect all requests from one (or more) domains to another. You can use the redirects array to list the domains that you want to redirect to the primary domain:

    module.exports = {
      deploy: {
        production: {
          s3: {
            domain: 'example.com',
            credentials: {
              profile: 'example'
            },
            redirects: [
              'www.example.com'
            ]
          }
        }
      }
    }

When redirects are specified a bucket is created for each domain with a website configuration that redirects all requests.

The publish boolean, when enabled, will only upload the website content. The default value is false. Once you have performed an initial deployment that has created and configured the bucket, set this to true and your deployment will be faster!

The params object allows you to set additional parameters when files are uploaded to the bucket. This is particularly useful for setting cache control behaviour. Note that when parameters are specified here they apply to all objects uploaded to the bucket.

The default value configures CacheControl for one year. See the s3 sdk documentation for more detail.
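
For example, to replace the default with an explicit one year max-age (the header value is standard HTTP; the exact default set by the provider may differ):

    module.exports = {
      deploy: {
        production: {
          s3: {
            domain: 'example.com',
            credentials: {
              profile: 'example'
            },
            params: {
              // applied to every object uploaded to the bucket
              CacheControl: 'max-age=31536000'
            }
          }
        }
      }
    }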

The s3 deployment provider supports automatic invalidation of cloudfront distributions. After the files have been uploaded, if a cloudfront distribution is configured the deployment provider will create a cloudfront invalidation using only the files that have changed. The distribution field must be specified and is the cloudfront distribution identifier, and the invalidate boolean must be set in order to invalidate the distribution.

You can explicitly set an array of invalidation paths using the paths list if you like, but it is recommended that you let the deployment provider send only the file paths that have changed. You could use this to invalidate the entire distribution if you wanted.

Note that the authenticated user must have access to the cloudfront API in order to create invalidations.

Be aware that Amazon charges for more than a thousand invalidations per month – you have been warned.

    module.exports = {
      deploy: {
        production: {
          s3: {
            domain: 'example.com',
            credentials: {
              profile: 'example'
            },
            cloudfront: {
              distribution: 'XXXXXXXXXX',
              invalidate: true,
              // optionally specify the invalidation paths (not recommended)
              // paths: ['/*']
            }
          }
        }
      }
    }

Including the distribution identifier in the application configuration leaks some metadata, but we do not consider this problematic because an attacker would require your authentication credentials in order to modify the distribution. If your credentials are compromised you have bigger problems.

However, for those who would prefer not to leak this information, you can put the distribution identifier in the credentials profile and then reference it using the key field. For example, add the distribution identifier to ~/.aws/credentials:

    [example]
    aws_access_key_id = XXXXXXXXXX
    aws_secret_access_key = XXXXXXXXXX
    cloudfront_distribution_production = XXXXXXXXXX

And then reference it in the cloudfront configuration:

    module.exports = {
      deploy: {
        production: {
          s3: {
            domain: 'example.com',
            credentials: {
              profile: 'example'
            },
            cloudfront: {
              key: 'cloudfront_distribution_production',
              invalidate: true
            }
          }
        }
      }
    }

Here we describe in depth how to deploy to github pages.

The questions that you need to answer before configuring a github pages deployment are:

Are you deploying user (or organization) pages or project pages?
Are the build files committed to the working tree or ignored?
Should the site be served from the master branch, the gh-pages branch or a /docs directory?

The answers to these questions will determine how you configure the deployment and any steps you may need to take in the github interface.

If you wish to deploy to the user or organization flavour, the important thing to remember is that the files must be served from the master branch. If the deployment is for the project style of github pages then the files can be served from the master or gh-pages branches or from a /docs directory – in which case you need to configure this for the deployment.

If you commit the build files to your working tree it will speed up deployment, as git will only push files that have changed; however in some situations this is not ideal. For example, if your production deployment goes to a different provider and you want to deploy to github pages for the stage environment, it may be preferable to ignore the build files from the working tree.

A small but important detail: if you are deploying to user or organization pages the site is served from {user-or-org}.github.io, which makes it safe to use absolute file paths like /style.css in your web site. If you are deploying to project pages the site will be served from {user-or-org}.github.io/{project}, in which case you should not use absolute path references or you will find that files are not found when the site is deployed.

It is recommended that you set up a remote specifically for deployment, which will make it easier to switch to another repository later. If you use the default remote name of deploy there is no additional configuration required.

git remote add deploy \
  git@github.com:makestatic/makestatic.github.io.git

Be sure to change the remote repository URL to match your setup.

Before going into detail on the deployment logic here are some configuration recipes for the available deployment styles.

There are some repositories that are configured to verify these deployments; you may want to take a look at them.

Note that using user or org for the deployment type is effectively the same.

If your build files are checked in to the working tree:

    module.exports = {
      deploy: {
        stage: {
          pages: {
            type: 'org'
          }
        }
      }
    }

If they are being ignored, set the ignored flag:

    module.exports = {
      deploy: {
        stage: {
          pages: {
            type: 'org',
            ignored: true
          }
        }
      }
    }

If you are deploying using project style pages set the type to project and configure the remote branch to push to:

    module.exports = {
      deploy: {
        stage: {
          pages: {
            type: 'project',
            branch: 'gh-pages'
          }
        }
      }
    }

In this case the remote branch can be either gh-pages or master but must match what you have configured in the repository settings when you enabled github project pages.

If you want to use the /docs directory to serve the web site from, you should configure the build to write to the docs directory:

    module.exports = {
      output: 'docs',
      deploy: {
        stage: {
          pages: {
            type: 'project'
          }
        }
      }
    }

Each deployment must have a unique tag. By default this will be created using a convention but you can override this logic if necessary.

The convention is to load package.json from the current working directory and prefix the package version with deploy-v, so that if your application has a 1.0.0 version the deployment tag is set to deploy-v1.0.0.

If you want to implement your own logic for generating a unique deployment tag you can assign a tag function to the provider:

    module.exports = {
      deploy: {
        stage: {
          pages: {
            tag: () => {
              // generate and return a unique tag name for the deployment
            }
          }
        }
      }
    }

Before deployment the provider will check that your local repository is safe to work with, which means several requirements must be met before a deployment using git can proceed.

The deployment process is the same for all types of pages deployment with slight variations in how files are pushed. This section gives an overview of the common steps.

Before pushing files the deployment provider prepares the local repository.

Then the push to the remote executes, varying slightly based on the deployment configuration.

A number of follow-up steps are then taken to finalize the deployment.

At this point the deployment is complete and you have a tag that can be used to revert to an earlier deployment if necessary.

Once you have a remote set up you can configure the deployment; in this case we assume configuration for the stage environment. In your app.stage.js file set the name of the remote to use, indicate which flavour of github pages you are using by setting the type, and whether the build directory is being ignored:

    module.exports = {
      deploy: {
        stage: {
          pages: {
            type: 'org',
            ignored: true
          }
        }
      }
    }

The type may be either org or user (which are equivalent) or project. The ignored property indicates whether the build files are being ignored from the working directory which impacts the logic used for deployment.

Because in this configuration we have set the type to org, configuring a remote branch would have no effect; the deployment must go to the master branch as that is where your web site will be served from.

This type of configuration potentially requires you to modify the github settings for the repository. Because the files are being ignored and they are in a sub-folder (the public directory by default) we need to push them to the root of the master branch. The deployment logic handles this scenario but there is one caveat: when the build files are being ignored and master already exists, we need to delete the remote master branch before pushing the updated build files. In order to do this the master branch must not be configured as the default branch for the repository.

If you are creating a new repository and perform a deployment for this configuration then there is nothing you need to do, as the deployment will push a branch other than master first, which then becomes the default branch and allows master to be deleted when pushing new deployments. However, if you already have a master branch and it is the default branch, you need to change the default branch: go to the repository Settings > Branches and select a branch other than master as the default. You will then be able to use this style of deployment configuration.

The processing lifecycle is split into parts, each comprising one or more phases; each phase executes plugins sequentially.

It is designed so that source files can be preprocessed with webpack but can also be used to process an existing web application.

Source files are loaded and if necessary compiled using webpack.

clean
Remove build files.
build
Preprocess source files.
sources
Read source files.
pack
Compile source files.

Transform and optimize the compiled output files.

parse
Parse to abstract syntax trees.
graph
Create resource graph.
transform
Transform abstract syntax trees.
verify
Verify document trees.
resolve
Resolve output paths and file content.
validate
Validate output files.
optimize
Optimize output files.
emit
Create auxiliary files.
manifest
Generate file manifest.

Write files to disc and perform a deployment if necessary.

write
Write files to disc.
audit
Audit files that have been written to disc.
deploy
Deploy files to a configured provider.

This section provides some more detail on each of the lifecycle phases.

The clean phase removes the output directory to make sure your build is pristine.

The build phase can be used to run plugins or commands that pre-process the source files in the input directory. This website relies on markdown documents some of which exist outside of the input directory – the build phase is used to gather and process those documents.

The sources phase inspects the input file list removing files that are ignored and buffering file content when necessary.

The pack phase is responsible for bundling or packaging your assets.

The parse phase takes the HTML, CSS and Javascript output files and parses them to abstract syntax trees so that plugins can modify the files during the transform phase.

The graph phase builds a graph of the resources referenced in the output files.

During the transform phase plugins can modify the contents of the abstract syntax trees to rewrite or inject content. For example the inline-css plugin finds external stylesheets referenced in HTML documents and converts them to inline styles.

The verify phase performs verification of the document trees. Use this phase to check for duplicate element identifiers or verify that named anchors exist in a target document.

The resolve phase sets the output path and seals the file content.

The validate phase is used to validate the output file content. Typically used to validate HTML documents but you can configure any validation plugins you need.

Note this differs from the verify phase which operates on the abstract syntax trees not the resolved file content.

The optimize phase will compress assets. Typically this phase is enabled for the production environment.

The emit phase is used to generate additional files that should be written to the output directory. If the gzip plugin is enabled static gzip files are created during this phase.

The manifest phase generates checksums for each output file and writes the data to a manifest.json file which is added to the pipeline. This is useful if you need to check the integrity of the output files.

The write phase writes the output file assets to disc.

Note that when compiling with webpack this phase does not execute as webpack is responsible for writing files to disc.

Use the audit phase when you need to verify the integrity of files written to disc.

If you want to perform a deployment, the deploy phase will execute the deployment provider for the current environment.

Each lifecycle phase will execute a list of plugins in series. By default these are configured for you, but if you want to create your own plugins this section describes how to create and configure them.

To create a new plugin you can use the plugin generator:

yo makestatic:plugin transform-links

This will scaffold a plugin in the transform-links directory.

The plugin class is instantiated with the processing context, which can be used to inspect the processing options and all the processing state information. It is also passed an options object containing the options assigned to the plugin when it is configured.

The plugin may declare before, sources and after methods. The before function is called before iterating the list of files being processed, sources is called with each matched file, and after is called once all files have been processed.

An example plugin class to illustrate:

    class Plugin {
      constructor (context, options = {}) {
        this.context = context
        this.options = options
      }

      before (context, options = {}) {
        return context
      }

      sources (file, context, options = {}) {
        return context
      }

      after (context, options = {}) {
        return context
      }
    }

If a plugin function needs to operate asynchronously it should return a Promise.
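
A minimal sketch of an asynchronous hook, assuming the same class shape shown above; the deferred resolution is purely illustrative:

    class AsyncPlugin {
      constructor (context, options = {}) {
        this.options = options
      }

      sources (file, context, options = {}) {
        // resolve the context once the asynchronous work completes
        return new Promise((resolve) => {
          setImmediate(() => resolve(context))
        })
      }
    }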

Note that the file argument passed to sources is a complex object with many useful properties and functions, see filewrap for more detail.

Plugins can log messages using a simple API exposed by the context.

    const log = context.log
    log.info('[plugin] %s', files.length)

The following log levels are available:

debug
Log debug messages.
info
Log informational messages.
warn
Log warnings.
error
Log errors.

Plugins are configured by assigning a list of plugin configurations to a lifecycle phase.

    module.exports = {
      lifecycle: {
        transform: [/* configure transform plugins */]
      }
    }

The plugin constructor function is required, all other fields are optional.

plugin
Function plugin constructor function.
test
RegExp|String filter pattern for file sources to process.
options
Object map of plugin configuration options.
exclude
Array list of exclude patterns.

To configure a plugin you assign it to a lifecycle phase:

    module.exports = {
      lifecycle: {
        transform: [Plugin]
      }
    }

In this instance the plugin has no options and will match all files. You can choose which files should be passed to the sources function by configuring a test regexp (or glob) pattern:

    module.exports = {
      lifecycle: {
        transform: [
          {
            test: /\.(sgr|html)$/,
            plugin: Plugin
          }
        ]
      }
    }

Note that the test pattern is always applied to the raw source file name not the output file.

If you want to pass options to the plugin, configure them:

    module.exports = {
      lifecycle: {
        transform: [
          {
            test: /\.(sgr|html)$/,
            plugin: Plugin,
            options: {
              // configure plugin options
            }
          }
        ]
      }
    }

Note that you can pass options directly in the plugin configuration without the options object; they are passed through to the plugin options provided they do not collide with the main plugin configuration fields plugin, test, exclude or options.

    module.exports = {
      lifecycle: {
        transform: [
          {
            test: /\.(sgr|html)$/,
            plugin: Plugin,
            // same as {options: {force: true}}
            force: true
          }
        ]
      }
    }

Sometimes you may want to exclude certain files from being passed to sources even though they match the test pattern:

    module.exports = {
      lifecycle: {
        transform: [
          {
            test: /\.(sgr|html)$/,
            plugin: Plugin,
            exclude: [
              // ignore google site verification files
              /google[a-z0-9]+\.html$/i
            ]
          }
        ]
      }
    }

Plugins may want to add or remove files from the processing pipeline; to do so you can use the context.list API.

The getFile(), add(), remove() and get() methods should be sufficient for most requirements, see the core library for the complete API.

To create a new file call the getFile() function passing the file name or options – if you pass a string path it is wrapped in a File object.

    context.list.getFile(/* file path or options */)

See the filewrap documentation for more information on options available when creating files.

To get a reference to a file that already exists in the pipeline call the get() function.

Use case: the plugin that converts external stylesheets to inline styles needs to get a handle on the referenced stylesheet so it can inject the styles into the HTML page.

    context.list.get(/* relative file path */)

To add a file to the pipeline call the add() function.

Use case: the plugin that generates a manifest of output files calls the add() function to inject an additional file that needs to be written to the output directory.

    context.list.add(/* file handle */)

To remove a file from the pipeline call the remove() function.

Use case: the plugin that reads the source file list and performs a filter using the ignore configuration calls the remove() function when a pattern matches so that the file is no longer part of the pipeline.

    context.list.remove(/* file handle */)
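
A sketch tying the list API together inside a plugin's after hook; whether getFile accepts content options here is covered by the filewrap documentation:

    class InjectFile {
      constructor (context, options = {}) {
        this.options = options
      }

      after (context) {
        // create a new file handle from a relative path
        const file = context.list.getFile('notes.txt')
        // inject it into the pipeline so it is written with the output
        context.list.add(file)
        return context
      }
    }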

This section describes how you can configure your project for faster builds when you are working with large binary files.

If your project requires videos it is recommended that you upload them to a service like vimeo or youtube and load them from third-party servers, but there are still occasions when you may need to include large files in your build directory. Your project may offer a software release for download or some other type of large binary file. In this instance it does not make sense for webpack to load the file from the source directory and write it to the output directory every time a build is performed.

The solution is to place the file(s) in the output directory and configure the clean option to only remove certain files.

    module.exports = {
      clean: [
        'public/**/*.html',
        'public/css',
        'public/js',
        'public/img'
      ]
    }

You would then add the large files to your source code management system and ignore all other files from the output directory.

By default loaders are resolved from node_modules relative to the current working directory.

You can configure the paths used to resolve loaders using the modules array of the resolveLoader option, but you should be careful: if you use long paths webpack starts to get very, very slow, which appears to be related to the string splitting in the enhanced-resolve module.

It is recommended that you use the default path for resolving loaders for performance reasons.
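
Because the configuration is a webpack configuration object, the standard webpack resolveLoader option applies; a sketch restating the default:

    module.exports = {
      resolveLoader: {
        // keep these paths short, long paths slow down enhanced-resolve
        modules: ['node_modules']
      }
    }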

The default options are shown below:

    module.exports = {
      // generate sourcemaps
      devtool: 'sourcemap',

      // input directory
      input: 'src',

      // output directory
      output: 'public',

      // file extension rename matchers
      matchers: {
        html: /\.sgr$/,
        css: /\.sss$/
      },

      // files to ignore from the sources
      ignore: [
        '**/layout.sgr',
        '**/_*',
        '**/.*'
      ],

      // browser sync defaults
      server: {
        port: 1111,
        logLevel: 'silent',
        notify: false
      }
    }

Take a look at the configuration used for this website; it really is quite simple. First the main app.js configuration:

    const parse = require('makestatic-preset-parse')
    const robots = require('makestatic-parse-robots')
    const csp = require('makestatic-parse-csp')

    module.exports = {

      // indicate the deployment URL
      url: 'https://makestatic.ws',

      // change these to your preference
      input: 'src',
      output: 'public',

      // configure browsersync options
      server: {
        ghostMode: false
      },

      // configure css processing
      styles: () => {
        const std = require('makestatic-css-standard')
        const conf = std()
        // you can add postcss plugins to `conf.plugins` here
        return conf
      },

      // configure template processing
      markup: () => {
        const std = require('makestatic-html-standard')
        const id = require('makestatic-page-id')
        return std(
          {
            markdown: {
              plugins: [require('markdown-it-deflist')]
            },
            locals: (ctx, options) => {
              return {
                pageId: id(ctx, options),
                pkg: require('./package.json')
              }
            }
          }
        )
      },

      // configure javascript transpiling
      script: {
        presets: ['env']
      },

      // configure development mode lifecycle
      lifecycle: {
        build: [
          require('makestatic-build-version'),
          {
            plugin: require('makestatic-build-exec'),
            watch: false,
            commands: ['mk docs api']
          }
        ],
        parse: parse({js: false}).concat(robots, csp),
        graph: require('makestatic-graph-resources'),
        transform: [
          {
            plugin: require('makestatic-sitemap'),
            formats: ['html'],
            template: 'sitemap/index.html'
          },
          {
            plugin: require('makestatic-csp'),
            styles: true
          },
          require('makestatic-auto-title'),
          require('makestatic-dom-version'),
          {
            plugin: require('makestatic-permalink'),
            from: 3
          },
          {
            plugin: require('makestatic-inline-css'),
            remove: true
          }
        ],
        verify: [
          require('makestatic-verify-id'),
          require('makestatic-verify-anchor')
        ]
      }
    }
Then the app.stage.js configuration, which extends the production configuration:

    const conf = require('./app.production')

    // subdomain for stage deployment
    conf.url = 'https://stage.makestatic.ws'

    // run as static web server
    // disables browsersync network requests
    conf.static = true

    // disable manifest generation
    conf.manifest = false

    // no need to audit in stage
    conf.lifecycle.audit = null

    conf.deploy = {
      stage: {
        s3: {
          domain: 'makestatic.ws',
          credentials: {
            profile: 'makestatic'
          },
          prefix: 'stage',
          params: {
            CacheControl: 'no-store, no-cache, must-revalidate'
          },
          region: 'ap-southeast-1',
          error: 'stage/404.html',
          redirects: [
            'www.makestatic.ws'
          ],
          publish: true,
          cloudfront: {
            key: 'cloudfront_distribution_stage',
            invalidate: true
          }
        }
      }
    }

    module.exports = conf
And the app.production.js configuration:

    const optimize = require('makestatic-preset-optimize')

    module.exports = {
      // disable source maps
      devtool: false,

      // generate manifest file
      manifest: true,

      // always clean files in production
      clean: true,

      // configure optimization lifecycle
      lifecycle: {
        transform: [
          {
            plugin: require('makestatic-inline-data'),
            // css is inlined so no need to process css files
            test: /\.(html|sgr)$/,
            remove: true,
            rules: [
              {
                test: /logo(-header)?\.png$/
              }
            ]
          },
          // NOTE: must generate sitemap before pruning
          // NOTE: unused css rules
          {
            plugin: require('makestatic-sitemap'),
            formats: ['html'],
            template: 'sitemap/index.html',
            robots: true
          },
          {
            plugin: require('makestatic-csp'),
            styles: true
          },
          require('makestatic-auto-title'),
          require('makestatic-dom-version'),
          {
            plugin: require('makestatic-permalink'),
            from: 3
          },
          {
            plugin: require('makestatic-inline-css'),
            remove: true,
            prune: true
          }
        ],
        verify: [
          require('makestatic-verify-id'),
          require('makestatic-verify-anchor')
          // require('makestatic-verify-link')
        ],
        emit: [
          {
            plugin: require('makestatic-fingerprint'),
            rules: [/main\.js$/]
          },
          {
            plugin: require('makestatic-sitemap'),
            formats: ['xml'],
            robots: true
          }
        ],
        optimize: optimize(),
        audit: [
          require('makestatic-validate-html'),
          require('makestatic-audit-files'),
          require('makestatic-archive-zip')
        ]
      },

      deploy: {
        production: {
          s3: {
            domain: 'makestatic.ws',
            credentials: {
              profile: 'makestatic'
            },
            prefix: 'production',
            region: 'ap-southeast-1',
            error: 'production/404.html',
            redirects: [
              'www.makestatic.ws'
            ],
            publish: true,
            cloudfront: {
              key: 'cloudfront_distribution_production',
              invalidate: true
            }
          }
        }
      }
    }