There is such a thing as best-case, worst-case, and average-case computational complexity.
In this case, the complexity we care about is a function of two factors: the total number of files and the number of changed files. Call the first one n and the second one p.
In development, the complexity of webpack would be O(n) + O(p) [+] and the complexity of snowpack would be O(p). Worst case p = n, but best case p = 1, and in development p is usually very close to 1 on average. Also, the work on the p changed files is parallelisable, so for small p the wall-clock time looks like p = 1 even though the CPU time is still O(p).
[+]: which of the n or p terms dominates via its constant factor is unclear, but some hard-to-parallelize n-bound processing is always present for webpack.
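To make the argument concrete, here is a minimal cost-model sketch, not real build-tool code: the function names and unit constants are hypothetical, and the point is only the shape of the two curves as n grows while p stays near 1.

```python
# Hypothetical rebuild-cost model for the argument above.
# A webpack-style rebuild pays an n-bound term (all files) plus a
# p-bound term (changed files); a snowpack-style rebuild pays only
# the p-bound term. k_n and k_p are made-up per-file unit costs.

def webpack_rebuild_cost(n: int, p: int, k_n: float = 1.0, k_p: float = 1.0) -> float:
    """O(n) + O(p): some n-bound processing is always present."""
    return k_n * n + k_p * p

def snowpack_rebuild_cost(n: int, p: int, k_p: float = 1.0) -> float:
    """O(p): only the p changed files are reprocessed."""
    return k_p * p

# Typical dev loop: large project, one file edited (p = 1).
n = 10_000
print(webpack_rebuild_cost(n, 1))   # dominated by the O(n) term
print(snowpack_rebuild_cost(n, 1))  # independent of n
```

With p = 1 the snowpack-style cost stays flat as the project grows, while the webpack-style cost scales with n; in the worst case (p = n) the two models converge to the same order.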