Thanks to @fatihacet's refactoring of the issue page using Vue, discussions now load asynchronously, and we see a significant drop in the initial page load for issue 1 on GitLab.com:
While this is a great change, notice that it still takes several seconds before we even initiate the load of the discussions.json page:
I've highlighted static assets (e.g. JavaScript, CSS) that are blocking this load. Because many of these load relatively slowly from gitlab.com, it takes a while to both download the static files and render the DOM.
Stan Hu changed title from "discussion.json takes several seconds to even start after loading of page" to "discussions.json takes several seconds to even start after loading of page"
@stanhu discussions.json is delayed because DOMContentLoaded takes so much time. As we can see in the timeline above, discussions.json is actually blocked by the loading of stylesheets and scripts. Browsers only allow a limited number of parallel downloads per host (6 for Chrome), so it would be better if we served fewer scripts and stylesheets by combining them together.
jQuery's `$(fn)` and `$(document).ready(fn)` are pretty much the same thing. `window.addEventListener('load', fn)` would be much slower. Looks like `DOMContentLoaded` is the best chance we have.
@ernstvn The scripts are not blocking the HTML+CSS rendering of the page anymore, but as the discussions.json is loaded by the JS it still needs to wait until all JS is loaded and running.
@timzallmann yeah, that's correct. Currently we serve 3 CSS and 6+ JS files. I believe concatenating them into one file would make a difference. The CSS files especially should be concatenated, because one of them is more than 1.3 MB and two of them are below 10 KB. I believe we could do this for the JS files too. However, I had a session with @mikegreiling about this and he had some concerns about concatenating.
@mikegreiling can you please add your thoughts here too?
@fatihacet For the CSS it totally makes sense when they are so small, and that is already in progress with the CSS cleanup for the new navigation (the small files were both for that).
With the JS I also think that we are better off separating the application into smaller, more specific chunks (then, for example, the libraries we need are already cached), and especially reducing the size of main.js and loading only the locales that are needed. Also, the mentioned integration of a CDN will help a lot there.
Concatenating our javascript will not fix this problem, as we discussed in the call. There are currently a few things blocking the discussions.json from being requested earlier in the timeline:
1. The response from the server must be complete.
2. The CSS must be loaded, because it blocks both page rendering and deferred script network requests.
3. All of the javascript bundles must be downloaded, parsed, and executed.
4. The DOMContentLoaded event callbacks fire (and we have a lot of code listening on this event).
5. The discussion Vue components are mounted.
6. The request for discussions.json is made.
That accounts for the long delay in the request being made. There are a few things we can do to optimize this:
1. Load some portion of the discussion thread embedded in the initial response rather than doing this asynchronously.
2. Add some embedded javascript that performs an XHR for discussions.json early on in the page:
```haml
- content_for :page_specific_javascripts do
  :javascript
    window.discussionsRequest = new XMLHttpRequest();
    window.discussionsRequest.open("GET", "/gitlab-org/gitlab-ce/issues/1/discussions.json");
    window.discussionsRequest.send();
```
and then we reference and use `window.discussionsRequest` within our discussions javascript bundle to access the request in progress. -- OR --
3. Use a jsonp callback method to prime the request earlier in the process:
```html
<!-- just below our webpack bundles (so this gets loaded last) -->
<script src="/gitlab-org/gitlab-ce/issues/1/discussions.json.js" defer="defer"></script>
```
and the url above would do something like:
```javascript
loadDiscussions({
  // ... (discussion data)
});
```
In the discussions Vue component, we'd export this loadDiscussions method to be run by the jsonp request.
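A minimal sketch of how that jsonp hand-off could work. The variable names (`pendingDiscussions`, `discussionsCallback`, `onDiscussionsReady`) are illustrative, not the actual implementation:

```javascript
// Sketch of the jsonp hand-off. The page defines a global callback before the
// deferred .json.js script runs; that script simply calls it with the data.
var pendingDiscussions = null;
var discussionsCallback = null;

// Called by the jsonp response (discussions.json.js):
function loadDiscussions(data) {
  if (discussionsCallback) {
    discussionsCallback(data); // the Vue component is already waiting
  } else {
    pendingDiscussions = data; // stash until the component mounts
  }
}

// Called by the discussions Vue component once it mounts:
function onDiscussionsReady(callback) {
  if (pendingDiscussions !== null) {
    callback(pendingDiscussions); // data arrived first
  } else {
    discussionsCallback = callback; // data still in flight
  }
}
```

Either side can finish first; whichever arrives second completes the hand-off, so no timing assumption is needed.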
@mikegreiling thanks for your input. Option 2 seems the most boring solution for now. We can store the response in a variable so we can use it whenever we want. I think options 1 and 2 combined would be a huge improvement.
> With the JS I also think that we are better off with separating the application in smaller more specific chunks (then by example the libraries we need are already cached) and especially reducing the size of main.js + load specifically only the locales that are needed. Also, the mentioned integration of a CDN will help a lot there.
The one problem (noted here, repeating what is written in the linked issue) is that because we use HTTP/1, we can only have a maximum of 6 requests at a time unless we use some CDN tricks, e.g. different CDNs for different resources, like a separate CSS CDN and a JS CDN, or we adopt HTTP/2. We are currently hitting this limitation and it is slowing us down. Further separating our JS (a single issue page already loads 7 JS files) is not an option and would slow us down even more due to this limit, unless we change things up.
At the moment, per my research, our biggest win would be to reduce the size of our CSS and the response time from the server. Those two things alone use up at least 3.5 seconds of load time. Our CSS is 1.3 MB uncompressed.
@mikegreiling I believe concatenating the files (especially all the common stuff that needs to be loaded on every page anyway, and taking the time to decruft main.js) will solve the 6-max-files problem we are currently hitting. Further separation of concerns will exacerbate this problem. Also, as noted by another BE dev, fewer requests are easier on the server and a major time saver. For example, in CI/CD @ayufan is currently taking the many server requests that would happen and turning them into one big one as an optimization.
> Also as noted another BE dev, less requests is easier on the server and a major time saver. For example in CI/CD @ayufan is currently taking the many server requests that would happen and turning them into one big one as an optimization.
It depends; in some cases it does, in some it might not. But having a single request allows us to use a number of caching techniques that make it much easier to handle, both on the frontend and the backend.
Can we actually test and enable HTTP/2 on GitLab.com? It does introduce connection multiplexing, so effectively we should be able to load all resources faster.
HTTP/2, in general, should help greatly with initial page load.
HTTP/2 would definitely be useful. Though there are a lot of factors to consider, and some network tradeoffs between asset size, compression, browser cache strategies, etc. Some useful reading:
It looks like the parsing of the JS plus the actual initialization takes that much time.
@fatihacet + @jschatz1 :
Without having looked at the code: what do you think about doing a vanilla initialisation/injection for that use case? Maybe that makes sense to close that gap. What I mean by this:
That special page gets a JS file, loaded before our first bundle, that is plain vanilla JS and already starts the async loading of discussions.json into a variable. Then the normal initialization starts, and at the point where the discussions.json load is currently fired off, you simply pick up the already-loaded discussions.json, or wait until it is there. That way we can mitigate the big JS loading time and use the time in parallel.
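A rough, framework-free sketch of that idea. Assumptions: `fetchImpl` stands in for `window.fetch` or an XHR wrapper, and the function names are made up for illustration:

```javascript
// An inline script runs this early, before the main bundle is parsed:
var discussionsPromise = null;

function primeDiscussions(fetchImpl, url) {
  // Start the request immediately and remember the in-flight promise.
  discussionsPromise = fetchImpl(url);
  return discussionsPromise;
}

// The discussions bundle calls this later: it reuses the primed request
// if one exists, and only starts a new one as a fallback.
function getDiscussions(fetchImpl, url) {
  return discussionsPromise || fetchImpl(url);
}
```

The key property is that the network request and the bundle download/parse overlap instead of running back to back.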
@timzallmann as far as I understand what you are suggesting is pretty much the same thing @mikegreiling mentioned in this comment and it definitely should help a lot. I think we should prioritise it and assign someone to work on it. @jschatz1 I can start working on this if it's ok with you. WDYT?
I still think option 3 is the cleanest implementation. It would let the browser dictate the priority of the discussions.json request meaning it would queue right away, and execute immediately after the last deferred script is loaded, which is exactly what we want.
@mikegreiling right, but it will still be slower than embedding some initial portion of the discussions into the payload, as we discussed in our call and as mentioned in this issue.
@mikegreiling oops, you are right. If you remember, we disabled that option in my DevTools during our session. I forgot to re-enable it. I will provide new screenshots.
Among other things... in theory, if we could decruft main.js to only include what is needed on this page, we could knock off ~160ms. There is clearly a lot going on there. There are also gaps of unknown duration where nothing is rendering.
Yep, a lot of this is parsing, and then executing each of the bundles, which include a lot of code that is not needed on the page (primarily main.js and dispatcher.js are the culprits). I think #38130 and #37792 would really improve this.
Also doing things like replacing underscore with lodash, removing the vue runtime compiler, and removing outdated library dependencies will improve the common.js bundle performance as well.
I also think we need to take another approach to our locale bundle. Loading every translation of every string for every user is wasteful. We should only download foreign language support when it is needed.
> in theory if we could decruft the main.js to only include what is in this page we could knock off ~160ms.
Actually it may be more than this. Keep in mind that the code that executes upon DOMContentLoaded also includes code from main.js, which could be reduced.
A lot of it is silly junk like:
```javascript
// from importer_status.js
$(function() {
  if ($('.js-importer-status').length) {
    var jobsImportPath = $('.js-importer-status').data('jobs-import-path');
    var importPath = $('.js-importer-status').data('import-path');
    new window.ImporterStatus(jobsImportPath, importPath);
  }
});
```
... which executes on EVERY page and checks whether .js-importer-status exists in the DOM. We have a lot of scripts with patterns like this leftover from the pre-webpack days.
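One hedged sketch of the alternative pattern: rather than every module probing the DOM on every page, a dispatcher maps a page identifier to the initializers that page actually needs. The page names and return values below are illustrative, not GitLab's actual dispatcher code:

```javascript
// Map page identifiers to the initializers they actually need.
var pageInitializers = {
  'projects:imports:show': function () {
    // In the real code this is where `new window.ImporterStatus(...)`
    // would run, after reading its data attributes from the DOM.
    return 'importer-status initialized';
  },
};

// Run only the initializer registered for the current page, if any.
function dispatch(pageName) {
  var init = pageInitializers[pageName];
  return init ? init() : null;
}
```

Every other page pays nothing: no DOM query, no executed callback, just a failed object-key lookup.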
@mikegreiling how big are the translation files? Does it make sense to store them in localStorage?
Here are two potential solutions I considered in another issue earlier this summer:
1. Create separate bundles for each language with the translations baked in. This can be achieved with a webpack plugin.
2. Devise some wrapper that loads the languages asynchronously, like we currently do with the emoji chunk. This would be more complicated, but would have less overhead in compile time and omnibus package size than option 1 would incur.
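The async-wrapper option could look roughly like this. This is a sketch only: `loadChunk` stands in for a webpack dynamic import such as `import('./locales/' + lang)`, and the caching behavior is an assumption about how such a wrapper would typically work:

```javascript
// Cache of locale chunks, keyed by language code, so each language
// is fetched at most once per session.
var localeCache = {};

function getTranslations(lang, loadChunk) {
  if (!localeCache[lang]) {
    // Kick off the async load and cache the promise itself, so
    // concurrent callers share one in-flight request.
    localeCache[lang] = loadChunk(lang);
  }
  return localeCache[lang];
}
```

English-only users would then never download any other locale's strings at all.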
I think decrufting our main.js should be the top priority?
Yeah, I think we should make this a deliverable sometime soon. There's a multi-step process involved here:
1. Old-style modules need to have IIFEs removed and use proper ES module import and export syntax.
2. If necessary, side effects like `$(function() { /* do something */ });` need to be refactored and moved into the dispatcher or some exported loader function instead.
3. The import can then be removed from main.js and placed where appropriate in dispatcher.js, where it is only referenced and executed when needed, saving execution time.
4. We can then follow up with a solution similar to #37792 to split the dispatcher sections into separate async chunks that are loaded on demand, saving both bundle size and JS parsing time.
^ I have already started to chip away at these with MRs like: !12581 (merged) and !12683 (merged) where I basically performed steps 1-3 for most of the modules starting with the letters s through z.
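The final chunk-splitting step could be sketched like this. The page name is hypothetical, and in real webpack code the loader body would be an actual dynamic `import()` rather than the inline stand-in used here:

```javascript
// Each page maps to a loader returning a promise for its chunk, so the
// page's code is only downloaded and parsed when that page is visited.
var pageChunks = {
  'projects:issues:show': function () {
    // With webpack this would be: return import('./pages/issues_show');
    return Promise.resolve({
      default: function init() { return 'issues page ready'; },
    });
  },
};

function dispatchAsync(pageName) {
  var loader = pageChunks[pageName];
  if (!loader) return Promise.resolve(null);
  return loader().then(function (mod) { return mod.default(); });
}
```

Webpack turns each `import()` call site into a separate chunk automatically, so this structure gives the on-demand loading without extra build configuration.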
@mikegreiling I think this is at least a two person job. I say 1 from your team and 1 from my team. @victorwu can we schedule this for the next release? This is a development performance improvement.