Build a kick-ass PHP Microsite in under 4h

Welcome to my PHP Microsite Boilerplate!

Jens Kuerschner
14 min read · Jan 3, 2022

Around 1.5 years ago, I built a boilerplate that enables developers to create microsites with PHP from scratch in a very short time, while making no compromises regarding performance (Lighthouse score of 100, baby 😎), security, and SEO (see the original story here).

Since then, a lot of updates and upgrades have gone into the code base. Perhaps the most interesting one has been the integration with the headless open-source CMS Directus (for a story about that, click here).

Now, I officially want to introduce you to version 2 of this boilerplate (or almost framework)!

Besides a lot of bug fixes and further updates, it includes even more fancy things, like build scripts for CSS and JavaScript optimization, and more.
Based on the original philosophy, it is still made for those who are not super senior frontend developers, while being as flexible as possible. This means you can use a ton of features, but you do not need to. If you are not familiar with DevOps, CI/CD, Node.js, etc. — simply do not use those parts! You can still work with the boilerplate and create an extremely stable site.

You should check the GitHub page for more details.

This article is all about demonstrating how it works and how it is set up (step-by-step). And I do so, by building the respective demo page from scratch, while blogging about it.

1. Getting the code

Navigate to the official GitHub repository at:

Download or clone the code to your local system.

I highly recommend then creating your own repository for it. At least, that is what I do in this example. 😊

2. The base — web server setup

The boilerplate is prepared to run the PHP page on either Apache or nginx web servers. And even within Apache, there are sometimes multiple options, depending on whether you are restricted with regards to the available modules.

Theoretically, you can even use the docker-compose.yml to run it as a Docker container. But mind that this is only recommended for local testing!

A side-note regarding nginx:
This is somewhat experimental, because nginx setups are often highly custom. There are so many potential setups that it is more or less impossible to reflect them all here. The provided conf files give you an ideal configuration within a specific scope. So, if you start using nginx from scratch, you can definitely use the config in production. However, if you already have another config in place, you should spend some time merging them in a clever way!
One more thing: Since server blocks in nginx are not as flexible as with Apache when it comes to runtime, the boilerplate also includes a bash script. You can run it on boot to configure nginx with the issued files. This, for example, is necessary if you want to use the code with a Microsoft Azure Web App under PHP 8 — they only support nginx as a web server and offer the option to run a deployment script on web app startup.

But now, let’s get into the config.

  1. Open .htaccess for Apache or nginx.conf (under nginx_conf) for nginx.
  2. Adjust anything with “YOURDOMAIN” throughout the file.
  3. For nginx, check the server blocks to see if the port and parameter configuration fulfills your needs.
  4. Check the “Security Headers” part and adjust it to your needs. You can find more detailed information (and later a test and validation option) online.
    In case you do not have the headers module available under Apache, remove this block and use the php_security_headers.php instead (adjust the parameters there and uncomment it in the index.php).
  5. For caching within Apache, there is also an alternative if you cannot use mod_headers.
  6. Read through the whole config. There are parts about restricting access to specific files and directories. This is for hotlinking protection as well as general protection of critical areas. Adjust this to your needs.
  7. Finally, double-check the logic which forces www or not.
    Within the .htaccess, there are simply two blocks prepared — use one or the other.
    With nginx, you need to check the very first server block. It redirects non-www to www. Reverse this, if needed.
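To illustrate what such a www-forcing block looks like under Apache, here is a minimal sketch (assuming mod_rewrite is available; the actual blocks shipped in the .htaccess may differ in detail):

```apacheconf
# Hypothetical sketch: redirect non-www requests to the www host.
# Replace YOURDOMAIN with your actual domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^YOURDOMAIN\.com$ [NC]
RewriteRule ^(.*)$ https://www.YOURDOMAIN.com/$1 [R=301,L]
```

For the reverse behavior (www to non-www), you would invert the condition and target accordingly.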

That’s it for now. The web server should be all set up.

Of course, bringing it to the actual server is another step. I cover this in chapter 7, but bear in mind that this differs greatly depending on your available setup.

3. Taking care of the config

You can use the version_nr variable to control CSS and JS browser caching, since the code will add it to the respective URLs. But for this initial build, keep it as it is.
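The idea behind this is classic cache busting: the version number ends up as a query string on the asset URLs, so bumping it forces browsers to re-fetch the files. A minimal sketch of the mechanism (variable and path names are illustrative, not necessarily the boilerplate's exact ones):

```php
<?php
// Illustrative cache busting: changing $version_nr changes the URL,
// which invalidates the browser's cached copy of the asset.
$version_nr = '1.0.1';
echo '<link rel="stylesheet" href="/assets/css/styles.min.css?v=' . $version_nr . '">';
```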

First, you should take care of the languages which you want to use.

The boilerplate already includes 3. Adjust them accordingly.

If you want to build your page without any multi-language support, remove two of them and only set up the language of your choice (it is still required to set up a language — and it will have at least some SEO impact).

In my case, I stick to the 3 languages to stay close to the boilerplate, while building a demo page for it.
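As a rough sketch of what a language setup in the config.php could look like (the variable names here are hypothetical; check the actual file and its comments for the real structure):

```php
<?php
// Hypothetical shape of the language configuration.
// One entry per language; remove entries for fewer languages.
$languages = array('en', 'de', 'fr');
$default_language = 'en'; // used when no better match is found
```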

Next, the deployment script: I skip this, since it will be part of step 7.

If you want to include the Google Tag Manager (GTM), you can now include the ID. I strongly recommend not including any analytics or marketing scripts directly, but using the GTM for them! You should also think about including a cookie banner script via the GTM — I recommend CookieHub for that; they even provide a library and guidance that can be directly imported into GTM.

In my example, I do not use any analytics and/or tracking, to save myself the cookie banner and because I have no clue why this would be important for my demo project.

The Directus part is only relevant if you want to make use of the Directus CMS. I love it, but I will not use it here, so I skip the variables. See the respective sub-chapter under #5 for more information on that part.

The page URL is essential. I set the domain, which I use for my demo page:

However, for local development, I temporarily use only “/” instead.
Read the repo readme for instructions on how to use the included docker-compose script to develop and test the page locally before deploying it to any external system.

This leaves me with the PWA and meta settings. Since the latter will be part of the next chapter, I only provide a webapp name, leave the status at “true”, and set my theme color (in my case, I leave it at the default green). Since the comments (always read the comments!!) guide me to the manifest.json, I also check this and update the names and colors there.
Setting $the_webapp_status to true tells the browser that the website is PWA-ready. It will then offer the user the option to install the website as an application on the respective device.
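The PWA behavior is driven by the manifest.json. A minimal sketch of the fields mentioned above (names and colors are placeholders; the boilerplate's file contains more entries, which you should keep):

```json
{
  "name": "My Demo Microsite",
  "short_name": "Demo",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#00aa55"
}
```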

With regards to the theme color, you should also check the browserconfig.xml — but I recommend doing this together with setting up further meta details.

4. Preparing the meta information and assets

We are still in the config.php and are now getting to the meta information. Should be a no-brainer. I adjust it to my needs.

This brings me to the second important “meta information” part: the general_meta.php within the “templates” directory.
This file gets loaded on every page within the head section and holds general meta information.

You will see that many things get filled with some variables, which are usually defined via the page/routing and config.php setup. So, no worries — you won’t see the same meta information on every page.

Still, there are some things you should take care of here.

First, double-check the Open Graph and Twitter card section.
The Open Graph part is defined as “website”. This should fit most projects, but maybe not yours.
The Twitter card part uses “summary_large_image” as default card scheme. Again, should fit most projects, but maybe not yours.
Both refer to the social_media.png image within the images directory. It is time to adjust this image to your needs. You could even think of using a variable here and maybe link it to a new page parameter. Or you could use different images for Twitter and Open Graph. Whatever you are up for, remember to also double-check the .htaccess or nginx.conf to whitelist those files for hotlinking!
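For orientation, this is roughly the kind of markup the general_meta.php renders (the variable names here are illustrative; the real file fills the tags from the routing and config.php setup):

```php
<!-- Sketch only - see general_meta.php for the actual template. -->
<meta property="og:type" content="website">
<meta property="og:title" content="<?php echo $page_title; ?>">
<meta property="og:image" content="<?php echo $page_url; ?>assets/images/social_media.png">
<meta name="twitter:card" content="summary_large_image">
```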

Second, use a favicon generator tool to create favicons for your page.
I recommend going to the favicons directory (under assets) first and creating your icon using the favicon_template.png file. Then, go to the tool and make sure to tick all checkboxes which offer you to create even edge-case icons. Get the package and replace the dummy favicons in the folder. This way, all pre-configured icons in the code should automatically match an icon in the assets.
Also remember to check the color of the Safari mask-icon and to copy the .ico favicon into the root folder as well!

To finish the part where you set up the assets, you should also already prepare the font files. (We will keep them within the application and not use any webfont CDN, because it gives us more control, and using a CDN brings up some GDPR issues which are not worth the tiny benefits.)
I recommend using Google web fonts and downloading the files for self-hosting.
Place the files in the fonts folder. At this point, you could also already have a look at the header.php in the templates directory. Not much to do there, but you should update the section where we are pre-loading the fonts.
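The pre-loading section in the header.php looks roughly like this (the file names are placeholders for the actual .woff2 files you placed in the fonts folder):

```html
<!-- Sketch: pre-load self-hosted fonts to avoid render delays.
     The crossorigin attribute is required for font preloads. -->
<link rel="preload" href="/assets/fonts/myfont-regular.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/assets/fonts/myfont-bold.woff2" as="font" type="font/woff2" crossorigin>
```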

5. Setting up the pages and routing

To define the pages of your website, you need to open routing.php.
It holds a lot of documentation and examples. Therefore, I will mostly skip the details here.

Only four things:

  1. Make sure that you have one block per page per language (using the same id)!
  2. You should also keep the pre-defined offline and error pages!
  3. You should always have a legal notice and privacy policy set up somehow — legal issues, you know…
  4. Create the pages as files in the pages directory — and, if defined, respective files for the controller in the controller directory.
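To give you an idea, a single page definition in the routing.php could look roughly like this (the array keys shown here are hypothetical; the real file documents the exact structure and options):

```php
<?php
// Hypothetical sketch of one routing block. Remember: one block
// per page per language, all sharing the same id.
$pages[] = array(
  'id'       => 'home',
  'language' => 'en',
  'url'      => '/',
  'title'    => 'Welcome',
  'template' => 'home', // -> pages/home.php (+ optional controller file)
);
```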

In my case, I only define the offline, error, and main pages (since I will redirect to another page for the legal stuff).

Some specialties to consider:

If you need some more logic on one page, you can simply create a PHP file in the “controller” directory.
You should also have a look at the helper_functions.php in the “lib” directory. It includes very helpful small functions to validate input and more. They are included by default — so, use them if needed.

In case you want to create any “stupid” redirects directly, use the redirects.php instead of setting the redirect param at a page. This is the recommended way if the redirect is a rather static thing and not a temporary workaround. It also helps if you have a lot of redirects and do not want to put them into the routing file — for example, when setting up SEO redirects after a migration.
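Such “stupid” redirects boil down to a simple source-to-target mapping. An illustrative sketch (the actual structure in the redirects.php may differ; check the file's own documentation):

```php
<?php
// Hypothetical redirect map: old path => new path.
// Useful for SEO redirects after a migration.
$redirects = array(
  '/old-page'      => '/new-page',
  '/blog/old-slug' => '/blog/new-slug',
);
```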

Side chapter: Setting up Directus CMS integration

Connecting the website to a Directus CMS API is already prepared in multiple ways and therefore quite simple.

First, you need to set up Directus. You can use version 8, but I strongly recommend the latest version 9!
There, you also need to create an API user, so you have some credentials to use.

Second, you need to configure the connection within the config.php of the boilerplate. This is quite straightforward and well documented. I strongly recommend making use of the prepared caching functionality. However, mind to create a purge mechanism! There is a webhook prepared, which can be triggered from Directus on every content change. Define a secret code, so the URL is unique and cannot be easily attacked. Also mind the cache_purge_rebuild.php in the lib directory for more options and control (e.g. purging specific files, etc.).

Now, we have the connection established.

Next, you need to decide on how you want to include the content. There are basically two options, and they are both described and defined within the routing.php.

You can connect a specific page to a specific Directus collection. Afterwards, you can make use of the Directus content within that page (and/or its controller).

Or, you make use of the “Directus CMS dynamic pages” feature, which can be set up towards the bottom of the file. There, you can define one or multiple collections which act as providers for your website’s pages. This means that you only need the error and offline pages above this block. The website will then pull all other pages directly and dynamically from Directus. To make this work, you need to specify per collection in which field the boilerplate can find the relevant information (more is described in the code).

That’s it. Directus is set up.

Side note: You can also always pull additional information from any Directus collection at any time with the getDirectusContent function.
For example, you can pull additional information from a specific collection within a specific controller via:

$sample_array = getDirectusContent('specificCollection', '', '', true, '*.*', true);

You can then use the content via the defined variable on the page.

echo $sample_array['fieldNameXYZ'];

6. Building the actual website


In my case, I am making full use of all core features, like the TailwindCSS integration and the prepared build script.

This requires Node.js, at least on your local machine. Run the following command to get it all set up:

npm install

If you do not want to use TailwindCSS, you can simply remove it from the style.css file. You can then also uninstall the whole tailwind node package.

The build script still works. If you do not want to use it and Node.js at all, you can ignore the respective files (especially Gruntfile.js and postcss.config.js), but mind that you then need to take care of any JavaScript and CSS optimization yourself!

One more word about TailwindCSS:
Some say that it breaks the rule of keeping styling out of the HTML markup. And there are many more arguments supporting this view.
And I support it as well.
But Tailwind exists for reasons similar to why this boilerplate project exists (which makes it such a good fit): because it speeds up the development of quite simple websites, where you do not rely on a project structure which makes it easy to maintain and to work on with 100 other developers.
Bottom line:
If this boilerplate feels like a good fit for your project, TailwindCSS definitely is too!
If you are up for a bigger project, where this boilerplate feels a little bit strange and “unprofessional”, do not use this boilerplate and also do not use TailwindCSS — it is not bad, but simply not made for your needs 😉.

Let’s move on.

To test locally, I adjust the container_name within the docker-compose.yml and use Docker Desktop to run it without any server.
Mind the “Local Development Setup” part in the readme for further instructions.
Once set up, you can simply test with localhost on port 80 (localhost:80 in your browser bar).
Mind to reset the page URL in the config.php before pushing it to the actual server!

The actual creation

First, I include my fonts. I have already prepared them in chapter 4 as assets (and as preloading items in my header.php). Now, I include them in my style.css. To do so, I declare them as @font-face rules and use the given examples. They, for example, provide support for all browsers and make use of the font-display swap functionality. In addition, I define them in the “@layer base” for the body and a tags.
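As a sketch of such a setup in the style.css (file and font names are placeholders; the boilerplate's own examples are the authoritative reference):

```css
/* Sketch: self-hosted font with font-display: swap, so text renders
   immediately with a fallback font until the webfont has loaded. */
@font-face {
  font-family: 'MyFont';
  src: url('../fonts/myfont-regular.woff2') format('woff2'),
       url('../fonts/myfont-regular.woff') format('woff');
  font-weight: 400;
  font-style: normal;
  font-display: swap;
}

@layer base {
  body, a {
    font-family: 'MyFont', sans-serif;
  }
}
```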

Second, I update the style.css for some basic styling. This means setting some base colors for the body and selection selectors. I could also use Tailwind for the body, but I rather keep that here as some general anchor. Sue me.
I also double-check the YouTube integration and language switcher styles, but do not change anything here — they should fit most setups. If you are about to change something about them, also check the base.js file for those features.

Next, I check the header.php and footer.php in the templates directory.
They define the frame for your page.

And now, last but not least, I finally start building the pages one by one.
We already created the files for them in chapter 5. So, I now simply fill them with content.

I also double-check the style and content of the offline and error pages.
Here, you will also notice that there are not multiple files per language, but only a general one. In this case, to manage internationalization, we use the gettext feature. We basically only use a placeholder (which will also be used as the default) and translate it with a .po file. Check out the “translations” folder. It holds one sub-folder per language (except for the default one). Within the respective sub-folder “LC_MESSAGES”, you find a .mo and a .po file. Work with the .po file and then use a tool to generate the corresponding .mo file. Everything else should work out-of-the-box.
This is recommended if you only translate text per language. If you want to use a different setup or logic for different languages, use different files instead. The same goes if you really have extremely long texts (as with the privacy policy) — then, I would also recommend not using gettext, but rather multiple files.
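Under the hood, this uses PHP's standard gettext functions. A rough sketch of the flow (the boilerplate may wire this up slightly differently; locale, domain, and path are placeholders):

```php
<?php
// Sketch of the standard PHP gettext flow. Requires the gettext
// extension and a compiled .mo file for the chosen locale.
putenv('LC_ALL=de_DE.utf8');
setlocale(LC_ALL, 'de_DE.utf8');
bindtextdomain('messages', __DIR__ . '/translations');
textdomain('messages');

// The placeholder string doubles as the default-language text.
echo _('Sorry, you are offline.');
```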


As mentioned before, we use Docker Desktop for local development and testing. To have the CSS built, we need to run the following command before refreshing the localhost in our browser.

npm run build

And the following to fire the Docker container (if not already done).

docker-compose up -d

This can be a little bit annoying. To speed things up, we can uncomment the TailwindCSS Play CDN in the header.php. This enables us to test all Tailwind features within the browser without rebuilding everything from scratch every time.
Mind to comment it again, before pushing the project to production!

7. Build and Deploy

In this sample case, we build it locally and commit the built code to GitHub.

I simply use the configured build-script. It automatically optimizes the TailwindCSS code, merges it with my custom CSS, minifies it, combines and minifies all JavaScript files in the js asset directory, and also optimizes the service-worker script.

npm run build

I also make sure that in the config.php, I set up a deployment slug and script (do not use the default!). For the script, use the sample_deploy.php as a template.

At my webhoster, I navigate (via SSH) to the respective root directory for the project.
There, I clone my repository as an initial step.

At GitHub, I set up a webhook which triggers my deployment script (make sure not to include the deployment script file in your repository!). Note the secret!

Back at the server, I now manually upload my deployment script, where I have added my GitHub secret before.

The final logic: any push to the repo triggers the webhook, which calls my deployment script. The script then pulls the latest update from GitHub, using the secret to authenticate.
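For illustration, the core of such a deployment script could look like this (a hypothetical sketch, NOT the boilerplate's sample_deploy.php; it verifies GitHub's X-Hub-Signature-256 header before pulling):

```php
<?php
// Hypothetical deployment endpoint sketch. Verify the webhook
// signature with the shared secret before touching the server.
$secret  = 'YOUR_GITHUB_WEBHOOK_SECRET';
$payload = file_get_contents('php://input');
$header  = $_SERVER['HTTP_X_HUB_SIGNATURE_256'] ?? '';

$expected = 'sha256=' . hash_hmac('sha256', $payload, $secret);
if (!hash_equals($expected, $header)) {
    http_response_code(403);
    exit('Invalid signature');
}

// Signature is valid: pull the latest code from the repository.
shell_exec('cd /path/to/project && git pull 2>&1');
echo 'Deployed';
```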

CI/CD successfully implemented!

Final Words

This was it. Building a fully custom PHP microsite at light speed!

🐱 The GitHub Repo:
📺 Demo:

Even though, as of version 2, it starts to feel like a full PHP framework, you should not look at it that way. It is a boilerplate. So, feel free to check out all the other files and use this as a foundation for your project. The major idea was to provide some fundamental groundwork without a lot of overhead, where you can get in quite fast as a maybe not-so-senior developer.

Side note: I highly recommend also adding some CDN like Cloudflare. This brings you additional speed and protection. Plus, it takes away some load from your server.

Happy coding! 🚀


