Adding 2FA to all my services

A project that has been on my mind for some time now is to establish Single Sign-On for a couple of my self-hosted services at home. I am using Bitwarden as a password manager and have separate log-ins with 2FA even for services which are not exposed outside the home network; there are some services I would like to expose, however, which do not have 2FA integrated (like Calibre-Web). So I started to do some research on self-hosted solutions!

Settling on Authelia

There are multiple very interesting solutions for achieving SSO and storing users. The most prominent in the open-source community is probably Keycloak, which is enterprise-ready and integrates with popular LDAP servers. However, even though I am always eager to learn new stuff, this solution seemed way too sophisticated for my needs. Next on the list was Authentik, which may be the almost ideal solution for home setups. Besides supporting all major protocols, it has the added benefit of a nice UI and can serve as a reverse proxy, theoretically eliminating the need for an additional solution like Caddy, Traefik, or nginx; it can also be used as an application portal after log-in, potentially replacing dashboards like Flame or Heimdall (I am using Homarr).

So what kept holding me back? While Authentik seemed to be the ideal solution, its resource requirements amount to 1-2 GB of RAM. While I do have a powerful enough server, allocating 2 GB of RAM just for the authentication part seemed like overkill to me. In contrast, my current reverse proxy requires about 30 MB of RAM, and the authentication solution which is now running consumes a mere 40 MB. Also, I do not need most of the features Authentik lists in the comparison table on their website, which also fails to mention that the winning contender, Authelia, does support OIDC at the time of writing.

So why Authelia? While Authentik might be the more sophisticated solution, Authelia integrates into my existing setup very easily. I initially considered hosting Authelia as a docker-compose stack with Redis and MariaDB. While that would be the preferred solution for production, my use case can sufficiently be solved with a simple YAML file as the user store and an encrypted SQLite database file as the storage provider, further emphasizing the simplicity of the solution. Setting up my own LDAP server with FreeIPA or OpenLDAP and syncing it to Authelia for fun was tempting, but if I ever decided to upgrade the user backend, I would probably skip LDAP entirely and just properly set up Authentik. So what am I using Authelia for now? I basically have two authentication scenarios: proxy authentication via Caddy headers and OIDC connections.
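
To illustrate, this is roughly what that minimal backend looks like in Authelia's configuration.yaml - a sketch with illustrative paths rather than my literal values:

authentication_backend:
  file:
    path: /config/user_database.yml

storage:
  ## better supplied as a secret file than in plain config
  encryption_key: REPLACE_WITH_SECRET
  local:
    path: /config/db.sqlite3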

Configuration and integration with Caddy

For installation I went the easiest route this time: installing the Docker container via the Community Applications plug-in in Unraid. Since I had decided on maximum simplicity, this seemed like the way to go. The first start of the container will fail, since the configuration.yaml file first needs to be adapted. The configuration file offers a lot of explanation for the various options; this example gave additional guidance. Also, the user_database.yml file needs to be created if that option is chosen. For extra security, I decided to pass sensitive information like the SMTP password or encryption keys to the container as secret files.
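
For reference, the user file follows a simple schema; below is a minimal sketch with an invented user and a truncated argon2id placeholder hash (Authelia ships a hash-password helper to generate real ones). The secret files I mentioned are, as far as I can tell, wired up via environment variables such as AUTHELIA_STORAGE_ENCRYPTION_KEY_FILE pointing at the mounted files.

users:
  myuser:
    displayname: "My User"
    ## placeholder - generate a real argon2id hash
    password: "$argon2id$v=19$m=65536,t=3,p=4$REPLACE_ME"
    email: myuser@DOMAIN.tld
    groups:
      - admins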

The most important setting might be access control, however. If these settings can be bypassed, an intruder would be able to directly access a protected service's log-in page; in the case of proxy authentication, they could then inject a header into the request and bypass authentication altogether. Therefore, the default ACL rule should be "deny", except for specified exceptions:

access_control:
  default_policy: deny
  rules:
    ## Rules applied to everyone
    - domain: 'auth.DOMAIN.tld'
      policy: bypass
    - domain: '*.DOMAIN.tld'
      policy: two_factor

I have defined two exception rules. The first ensures that the Authelia authorization page is always accessible at the auth. subdomain - otherwise users would have no place to log in at all. The second defines that all subdomains under my top-level domain require two-factor authentication if an authorization request is forwarded to Authelia. This requires all my users to set up a secondary factor, unless I specify otherwise for a specific subdomain.
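
Since Authelia evaluates the rules top to bottom and applies the first match, such an exception is just a more specific rule placed above the wildcard - a sketch with an invented subdomain:

access_control:
  default_policy: deny
  rules:
    ## more specific rules must precede the wildcard
    - domain: 'status.DOMAIN.tld'
      policy: one_factor
    - domain: '*.DOMAIN.tld'
      policy: two_factor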

With Authelia configured correctly, I can now proceed to configure my services to make use of this provider. As a first service, I set out to secure the aforementioned Calibre-Web instance with 2FA. Calibre-Web supports proxy authentication from Authelia if the Remote-User header is correctly passed through from the proxy and the feature is activated and configured under the feature configuration in Calibre-Web's admin settings. To forward the request to Authelia and pass the header, I can make use of Caddy's forward_auth directive. Following the Caddy example on the Authelia website, I only had to adjust the Calibre entry in my Caddyfile as follows:

calibre.DOMAIN.tld {
        forward_auth Authelia-IP:Authelia-Port {
                uri /api/verify?rd=https://auth.DOMAIN.tld/
                copy_headers Remote-User Remote-Groups Remote-Name Remote-Email
                import trusted_proxy_list
        }
        reverse_proxy Calibre-IP:Calibre-Port {
                import trusted_proxy_list
        }
}

For this to work, Authelia needs to be externally exposed at auth.DOMAIN.tld, of course. When I try to access calibre.DOMAIN.tld now, my request gets forwarded to Authelia; I log in with 2FA and get redirected back to Calibre-Web. I am now authenticated via the proxy header and am immediately logged in. For this to work, the Authelia username must match the Calibre-Web username! If they differ, the header would need to be manipulated in the Caddyfile; it is probably much easier to just edit the username in Calibre-Web to match the Authelia username in such a scenario.

Further Potential

One of the reasons that finally convinced me to start looking into this whole topic more recently was the idea for another upcoming project: moving my Tailscale coordination server to a self-hosted instance with headscale. Since headscale can make use of either pre-authenticated keys or OIDC, I decided it would be nice to have an OIDC infrastructure in place before I made the transition! I have only briefly tested this so far but can already attest that OIDC works pretty nicely with Authelia as well. I will have to explore headscale in more detail in the future, but for now it is nice to know that I have the option of 2FA authentication here as well.
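
For anyone curious, registering headscale as a client in Authelia's OIDC provider boils down to an entry along these lines - a sketch in which the client id, secret, and redirect URI are my assumptions based on headscale's documented /oidc/callback path; note that the oidc section additionally needs an HMAC secret and an issuer private key:

identity_providers:
  oidc:
    clients:
      - id: headscale            ## assumed client id
        description: headscale
        secret: REPLACE_WITH_CLIENT_SECRET
        redirect_uris:
          - https://headscale.DOMAIN.tld/oidc/callback
        scopes:
          - openid
          - profile
          - email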

One limitation I have come across with Authelia is the restriction to only one hardware security key at a time. An issue on that topic is open, and while I could manipulate the SQLite database with SQL (changing the key identifier and then registering additional keys) as a workaround, one key is sufficient for me as long as I have TOTP codes as a backup. Multiple-key support is on the roadmap for version 4.38.0, so I will just wait patiently for now.

While having this solution in place tempts me to expose more services outside of my own network, this would defeat the purpose of adding extra security - in the end, I would also increase my attack surface. So instead of going wild configuring every other service out there, for now I am using Authelia just for what it is: an added layer on top of already exposed services. If I changed my mind later on, there are multiple candidates to configure, however: Home Assistant requires a HACS plug-in for proxy-header authentication, while paperless-ngx has this option built in. Portainer, Miniflux, and Gitea all support OIDC. FileRun and Bitwarden only offer this in their enterprise versions; for those I will stick with their native two-factor authentication, since I want to avoid duplicate log-in screens.