That depends on what's being measured, and most especially on survivorship bias.
If the only people whose time-on-site is measured are those who 1) don't blacklist the site, 2) don't disable JS, and 3) don't immediately leave and never return, then annoyances may well produce an apparently longer time-on-site, because the remaining readership has been curated down to those who will tolerate (or have no alternative to) such bullshit.
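A minimal sketch of that selection effect, with entirely made-up numbers and cohort sizes, just to show how the measured mean can diverge from the true one:

```python
# Hypothetical illustration of survivorship bias in time-on-site metrics.
# All probabilities and durations are invented; only the selection effect matters.
import random

random.seed(0)

visitors = []
for _ in range(10_000):
    tolerates_annoyances = random.random() < 0.3   # assume 30% put up with the popups etc.
    blocks_js = random.random() < 0.2              # assume 20% never fire the analytics beacon
    if tolerates_annoyances:
        time_on_site = random.uniform(60, 600)     # they stay and read
    else:
        time_on_site = random.uniform(0, 10)       # they bounce almost immediately
    # Only visits with JS enabled and enough dwell time for the beacon get measured.
    measured = (not blocks_js) and time_on_site > 5
    visitors.append((time_on_site, measured))

true_mean = sum(t for t, _ in visitors) / len(visitors)
observed = [t for t, m in visitors if m]
observed_mean = sum(observed) / len(observed)

print(f"true mean time-on-site:     {true_mean:6.1f}s over {len(visitors)} visitors")
print(f"measured mean time-on-site: {observed_mean:6.1f}s over {len(observed)} visitors")
# The measured mean is far higher: the bounce-and-never-return crowd barely registers.
```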
Web metrics are very poorly understood even now. See YouTube's infamous experiment in which latency improvements degraded apparent site performance ... because people with exceedingly marginal connections could now actually use the site at all, if only very poorly.
(I can't find that story though it's from ~10--15 years ago. Both my DDG-fu and FastGPT-fu (Kagi) are failing me. I'm pretty sure HN has discussed this at least once.)
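The mechanism is easy to reproduce on paper. A rough sketch, again with invented numbers, of how making everyone faster can raise the average measured load time once a previously-invisible cohort starts completing page loads:

```python
# Hypothetical sketch of the YouTube-style paradox: a latency improvement lets a
# previously-excluded cohort use the site at all, and their still-slow loads
# drag the average up. All figures are invented for illustration.

# Before: only users on decent connections complete a page load; marginal
# connections time out and never appear in the metrics.
before_loads = [2.0] * 900            # seconds

# After: the page is lighter, decent connections get faster, and marginal
# connections can finally finish loading -- slowly.
after_loads = [1.2] * 900 + [15.0] * 300

avg = lambda xs: sum(xs) / len(xs)
print(f"average load before: {avg(before_loads):.1f}s")   # 2.0s
print(f"average load after:  {avg(after_loads):.1f}s")    # ~4.7s -- looks 'worse', though every user is better off
```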