Check for broken links in CI

Uses linkchecker during the CI build job to look for broken links in our HTML.

After building the site, it serves it on localhost:4000 using Jekyll's built-in Ruby webserver, which linkchecker then crawls.

It doesn't do anything special with the output: linkchecker returns a non-zero exit code when it finds broken links, which correctly fails the CI job.
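The job boils down to a few commands. A minimal sketch of the CI step, assuming default paths and the port mentioned above (not the exact script from the CI config):

```shell
# Sketch of the CI step -- the real script may differ.
# Build the site, serve it in the background with Jekyll's
# built-in Ruby webserver, then point linkchecker at it.
bundle exec jekyll build
bundle exec jekyll serve --detach --port 4000
linkchecker http://localhost:4000/en/
# linkchecker exits non-zero if it finds broken links, failing the job.
```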

It ignores the following URLs:

  • URLs external to the website. Checking these could be enabled in the future, but it would incur a big performance hit.
  • fdroid-app: URLs, due to jekyll-fdroid#19 (closed).
  • URLs ending in /news/, because that page aggregates individual news posts which get checked anyway.
  • Posts from 2015 or earlier (feel free to fix these at some point; I can't be bothered right now).
  • .*/packages/.* due to #75 (closed).
  • In practice, only the /en/ site ends up being checked, because the other locales generate the same links anyway. Nothing stops the crawler from following a link into another locale and crawling it, but it doesn't find any.
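The path-based exclusions above presumably translate into ignore regexes passed to linkchecker. A quick way to sanity-check such patterns from the shell (the regexes here are illustrative assumptions, not copied from the CI config):

```shell
# Illustrative ignore regexes -- assumed, not the exact ones used in CI.
pkg_re='.*/packages/.*'
news_re='.*/news/$'
old_posts_re='.*/201[0-5]/.*'

# grep -Eq exits 0 when the URL matches, so these can gate a check.
echo "https://f-droid.org/en/packages/org.fdroid.fdroid/" | grep -Eq "$pkg_re" && echo "ignored"
echo "https://f-droid.org/en/news/" | grep -Eq "$news_re" && echo "ignored"
echo "https://f-droid.org/2015/03/06/website.html" | grep -Eq "$old_posts_re" && echo "ignored"
echo "https://f-droid.org/en/docs/" | grep -Eq "$pkg_re|$news_re" || echo "checked"
```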

Here is an example of a failed pipeline that I used to fix up the missing links.

Fixes #74 (closed).
