Your statement may be mistaken. The order in which pure functions are applied often matters, though not always. A few trivial algebraic examples show cases where it does.
Note that the parent said "if their inputs don't change".
In general in this thread, some people are following the root comment in using "order in which functions are applied" to refer to what you mean - applying one function to the output of another, or vice-versa. Of course this affects the correctness of the result (and possibly also the well-typed-ness of the program).
Other people, like the author of your parent comment, are using "order in which functions are applied" to mean "order in which functions are evaluated", which indeed does not matter for the correctness of pure functions (except possibly with respect to nontermination).
Working with f(x) = x + 3 and g(x) = 2 x, the expression f(g(7)) can be evaluated inside-out or outside-in:
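A quick Python sketch of both evaluation strategies for that expression (the function names follow the thread's definitions; the step-by-step expansion is spelled out in comments):

```python
def f(x):
    return x + 3

def g(x):
    return 2 * x

# Inside-out (strict/applicative order): evaluate the inner call first.
inner = g(7)          # 2 * 7 = 14
result_inside = f(inner)   # 14 + 3 = 17

# Outside-in (normal order): substitute the body of f first, then reduce g.
# f(g(7)) -> g(7) + 3 -> (2 * 7) + 3 -> 17
result_outside = g(7) + 3  # 17

print(result_inside, result_outside)  # 17 17
```

Because both functions are pure, either strategy reduces to the same value, which is the point being made about evaluation order not affecting correctness.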
I don't see why your comment is not simply additive to mine. Both our statements are true, and are related to the general discussion. Your statement relates to the most "pleasingly parallel" situations. Mine refers to problem composability.
This was the post right before the one we're talking about:
>No, no it isn't. Just like you wouldn't expect 2 * 3 + 8 to suddenly return 22 when you intend it to, you shouldn't expect the order in which you apply functions to not matter.
A reasonable interpretation of the text and the one that follows is that some claims have been made that functional programming provides benefits of commutativity, and that people are disappointed to find no such benefit.
f(a) = a + 3
g(b) = 2b
g(f(x)) can have a different result from f(g(x))