
Question for the frontend devs:

When you build the front end of the website, do you keep PageSpeed Insights or Lighthouse analytics in mind,

or is it more of an "I'll deal with that later" kinda deal?

Little backstory to this - our management wants these scores to be close to 100%, and one of the metrics is Largest Contentful Paint, which is at 3.2 secs, and I have no idea what that is.

Comments
  • 4
    You cannot patch in performance after the fact. LCP 3.2s is pretty bad and proves management right.

    PageSpeed tells you about LCP under "Largest Contentful Paint" with an explanatory link. You can also use https://gtmetrix.com/ to check the timeline; it tells you when LCP happened, and you can see what was rendered at that point.
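    For reference, LCP can also be watched live in the browser console. A minimal sketch using the standard PerformanceObserver API (the logging is just illustrative):

    ```ts
    // Log Largest Contentful Paint candidates as the page renders.
    // entry.startTime is the render time in milliseconds; the last candidate
    // reported before the first user input is what gets scored as LCP.
    const observer = new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        const lcp = entry as PerformanceEntry & { element?: Element };
        console.log('LCP candidate at', Math.round(lcp.startTime), 'ms:', lcp.element);
      }
    });
    observer.observe({ type: 'largest-contentful-paint', buffered: true });
    ```

    The web-vitals npm package wraps the same API if you'd rather not deal with the observer yourself.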
  • 2
    I usually deal with it later :)
  • 1
    @Fast-Nop This is exactly what my confusion is. Check the screenshot I posted.

    These results don't match the Lighthouse extension in the Chrome browser.
  • 2
    @Sid2006 Could be a browser issue - is https://pagespeed.web.dev/ also OK, and it's only your local browser that shows the bad result?

    Maybe it's something with the location. As in, LCP is bad on your local machine because the server is geographically far away with a mediocre network path. That could explain why the online test tools see different results.
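    One quick way to tell network distance apart from rendering problems is to compare the time to first byte with LCP. A rough sketch with the Navigation Timing API (the rule of thumb in the comment is just an assumption):

    ```ts
    // Time to first byte: how long until the first response byte arrived,
    // measured from the start of the navigation. A large TTFB from your
    // machine but a small one from the online test tools points at
    // geography / network path rather than at the page itself.
    const [nav] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[];
    const ttfb = nav.responseStart;
    console.log('TTFB:', Math.round(ttfb), 'ms');
    ```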
  • 1
    @Fast-Nop Alright, thanks for the info. It's just these weird quirks of the internet that I keep finding out.

    Some months ago I found out that an email like "blahblah@yahoo.com" isn't necessarily a yahoo.com email. It can also exist on Google Workspace. Still blows my mind.
  • 1
    It would be impossible to achieve 100% when you choose a client-side framework. RSC is a brain fuck at the moment. I suspect it's impossible to improve performance when the foundation is fucked.
  • 0
    Management should only request that using the app feels snappy & fast, even if you "only" have a score of 85%. They should trust you to use the right tools and make a balanced cost-benefit assessment. The last 20% of the work risks taking 80% of the time.
  • 0
    @webketje No, demanding 100% is the right way because devs will always find excuses. If you demand only 80%, they get sloppy and then excuse why they only reach 60%.
  • 1
    @Fast-Nop you're right, I should stop assuming the majority of devs are as dutiful & diligent about the craft as I am
  • 0
    @ostream I had that as well in some projects, except that 100% code coverage was the goal. Obviously, we didn't reach that.

    The solution was to document each and every code path not taken, analyse it, and argue why it was not covered. That applied especially to stuff like defensive coding, where taking the failure code paths would have pointed to a SW bug (rough sketch of the idea below).

    The immediate consequence, however, was that we started to think more in terms of potential test case deficiencies and actually did improve code coverage.
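    As a rough illustration of the same practice in a JS/TS codebase (not from that project): a defensive branch annotated with Istanbul's ignore hint, so the coverage report documents why the path is intentionally untested. The function and its reasoning are made up.

    ```ts
    // Hypothetical example of a documented, intentionally uncovered code path.
    interface Config {
      retries: number;
    }

    function parseConfig(raw: string): Config {
      const parsed: unknown = JSON.parse(raw);
      // Defensive check: input is validated upstream, so reaching this branch
      // would indicate a software bug rather than a missing test case.
      /* istanbul ignore if */
      if (typeof parsed !== 'object' || parsed === null || !('retries' in parsed)) {
        throw new Error('invalid config');
      }
      return parsed as Config;
    }
    ```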
  • 2
    Just for reasons...

    *takes a metal fork and starts scratching on a chalk board*

    Now I'm feeling less murderous.

    Do these tests make sense? Yes.

    Do these tests have meaning? Yes.

    Should these tests be handed to a dev with a "go improve this"? In most cases, nope. Hell no.

    The simple reason - as @Fast-Nop mentioned - is that these tests require a baseline.

    If 4 devs try to improve the tests with 4 different browsers on 3 different OSes in 4 different geographical locations... Yeah. You get a wild fuckity of numbers which are completely irrelevant - as they most likely cannot be reproduced.

    Make a baseline. Run it at a fixed geographical location via a CI, do multiple runs, check for statistical regressions, etc. (rough sketch at the end of this comment).

    By the way, it's best to run it in an isolated environment, or in the case of prod, to add metrics about the server load to the test data.

    But please, stop bullshitting by just randomly clicking a page and saying "page speed sux, we must do sth".

    That's just a complete waste of time.

    ... The reason I cannot stress this enough is because too many people run these tests without a clue / without any scientific approach.

    Yeah. The so-called SEO experts whose job seems to consist of being annoying fucktards.

    Time to take a bottle of valium.
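    To make the "fixed baseline in CI" point concrete, here's a rough sketch of such a check using the lighthouse and chrome-launcher npm packages. The URL, run count, and regression threshold are placeholders:

    ```ts
    // Run Lighthouse several times from a fixed CI runner and fail the build
    // when the median LCP regresses past an agreed baseline.
    import lighthouse from 'lighthouse';
    import * as chromeLauncher from 'chrome-launcher';

    const URL = 'https://example.com/';   // placeholder
    const BASELINE_LCP_MS = 2500;         // agreed baseline, kept in the repo
    const RUNS = 5;

    async function medianLcp(): Promise<number> {
      const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
      const samples: number[] = [];
      for (let i = 0; i < RUNS; i++) {
        const result = await lighthouse(URL, {
          port: chrome.port,
          onlyCategories: ['performance'],
        });
        samples.push(result!.lhr.audits['largest-contentful-paint'].numericValue!);
      }
      await chrome.kill();
      samples.sort((a, b) => a - b);
      return samples[Math.floor(samples.length / 2)];
    }

    medianLcp().then((lcp) => {
      console.log(`median LCP: ${lcp.toFixed(0)} ms`);
      if (lcp > BASELINE_LCP_MS * 1.1) {  // fail on a >10% regression
        console.error('LCP regressed against the baseline');
        process.exit(1);
      }
    });
    ```

    Same runner, same location, multiple runs - that's what makes the numbers comparable from one pipeline run to the next.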
  • 0
    @ostream That was indeed an embedded C project in a regulated domain. :)