WIP: Find unused CSS (selectors by usage)

Open username-removed-892863 requested to merge feature/find-unused-css into master


See our CSS selectors sorted by usage: https://gitlab.com/snippets/1676232

How can I mess around with this?

Download the HTML pages from all of the rspec tests and save them to $gdk/gitlab/fixtures/pages, 📎 gitlab-rspec-pages.zip

If you want to generate the HTML yourself, run the rspec tests: bundle exec rspec spec/features. An rspec after hook is in place that saves the HTML for each test. When I ran the tests, I had to run them in smaller chunks so I could tell when things started to fail because of random issues like webpack crashing. You will also need to strip out the <script> tags, because uncss loads each page into jsdom, fails to load the scripts, and bails out. Find <script(.*?)<\/script> and replace with nothing.

Download the cached selector files for each page and save them to $gdk/gitlab/fixtures/pages/cache (these are generated by the script, but downloading them will save a lot of time), gitlab-rspec-pages-cache.zip

Run the script:

node fixtures/find-unused-css.js

See the report at $gdk/gitlab/fixtures/pages/cache/#selector-map-by-usage.json
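The report essentially groups selectors by how many fixture pages they appeared on. A minimal sketch of that grouping step (the exact key/value shape of the real JSON file is an assumption on my part):

```javascript
// Given a map of selector -> number of pages it matched on, group the
// selectors by usage count so the rarely-used ones are easy to spot.
function groupByUsage(selectorCounts) {
  const byUsage = {};
  for (const [selector, count] of Object.entries(selectorCounts)) {
    (byUsage[count] = byUsage[count] || []).push(selector);
  }
  return byUsage;
}
```

Selectors under the `0` key would be the candidates for removal.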

The code is very rough and is only meant as a quick PoC to get one-off stats around our CSS. I wouldn't choose uncss again because it is very slow and doesn't give proper insight/stats.


I saved off the HTML from every rspec test in spec/features using an after hook.

Because uncss tries to load the <script> tags on the page, I stripped them via a simple find/replace (Sublime's find/replace was crashing and leaving a bunch of unsaved tabs open, so I had to delete the Sublime session file manually 😑). Find <script(.*?)<\/script> and replace with nothing.

Then I ran into uncss choking on memory because of the sheer number of HTML files (it batches up a bunch of jsdom instances 🤢):

<--- Last few GCs --->

  239117 ms: Mark-sweep 1323.8 (1434.4) -> 1322.8 (1434.4) MB, 992.2 / 0 ms [allocation failure] [GC in old space requested].
  240073 ms: Mark-sweep 1322.8 (1434.4) -> 1322.5 (1434.4) MB, 955.9 / 0 ms [allocation failure] [GC in old space requested].
  241047 ms: Mark-sweep 1322.5 (1434.4) -> 1322.5 (1434.4) MB, 973.6 / 0 ms [last resort gc].
  242000 ms: Mark-sweep 1322.5 (1434.4) -> 1322.5 (1434.4) MB, 952.7 / 0 ms [last resort gc].

<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0000003B3D8C9E51 <JS Object>
    1: filter(aka filter) [uncss\node_modules\jsdom\lib\jsdom\living\node.js:95] [pc=0000036AE42573CA] (this=0000003B3D804189 <undefined>,node=000003E2FFBDB901 <an EventTargetImpl with map 000002294BDB22C1>)
    2: treeToArray [uncss\node_modules\symbol-tree\lib\SymbolTree.js:~338] [pc=0000036AE35A29D1] (this=000002F4DD4ABFD9 <a Symb...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory

So I forked uncss and made some rough PoC changes. It now creates each jsdom instance only when needed, inside the loop. I simplified the CSS fetching so it only looks at the stylesheet provided in the options. I added result caching so we don't have to find the selectors on a page every single run-through (helpful when testing things). I also bodged in some nicer reporting so we can get the selector usage.
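The shape of those changes looks roughly like this sketch: process one page at a time instead of batching jsdom instances up front, and cache each page's selectors so repeat runs skip the expensive work. Here `findSelectorsOnPage` is a hypothetical stand-in for the jsdom/uncss matching step, and the cache is in-memory rather than the on-disk cache files the real script writes:

```javascript
// Count how many pages each selector appears on, doing the expensive
// per-page work lazily and caching the result per page.
function countSelectorUsage(pages, findSelectorsOnPage, cache = new Map()) {
  const usage = {};
  for (const page of pages) {
    let selectors = cache.get(page);
    if (!selectors) {
      // Only now do the expensive per-page work (jsdom in the real script).
      selectors = findSelectorsOnPage(page);
      cache.set(page, selectors);
    }
    for (const selector of selectors) {
      usage[selector] = (usage[selector] || 0) + 1;
    }
  }
  return usage;
}
```

Keeping only one jsdom instance alive at a time is what avoids the heap-exhaustion crash above.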

See https://gitlab.com/gitlab-org/gitlab-ce/issues/38132

cc @timzallmann

Edited by username-removed-892863