For general-purpose algorithms, yes.
Blake3 is not a password-hashing algorithm; it's a general-purpose one, used for checksums, verifying that two files are identical, and so on.
For password hashing you're correct that they should be slow; bcrypt, for example.
The commenter above obviously meant "slow to brute-force".
You're pedantically interpreting it to mean "slow for a given implementation, even if there are some scenarios under which it would be faster to brute force".
What the parent comment meant is clear enough I think, and it's accurate in that context. The whole point of modern hashing algorithms that require memory and have scaling difficulty factors is to make it slower for an attacker with certain kinds of resources to brute force it.
If you have a very slow implementation of a fast hashing algorithm (say `sleep 5; echo 1`), that's not slow by the parent comment's definition, because it's not slow to brute force.
Similarly, if the hashing algorithm has predictable output that allows the attacker to derive information about the input, that obviously is also faster to brute-force.
That sort of pedantry is arguably somewhat useful if you also choose to provide a more precise definition or explanation. It's definitely not constructive if you just add a little smiley face and do it as an asinine "hah look at how smart I am".
Wow, so offensive. I didn't mean it that way, but if it came across like that, oh well. What I wanted to say was that slowness is not a necessary attribute unless we actively seek it. If we could get a very good and quick algorithm without it being slow, we would use it. Thanks for misinterpreting my smiley too. Have a good evening.
> If we could get a very good and quick algorithm without it being slow, we would use it.
No, that's simply not true, and that's what the person you replied to was trying to say. For a password hashing algorithm, you actively want it to be slow - to have a mathematically guaranteed minimum amount of time that each attempt takes, regardless of implementation. This is a security property of a password-hashing algorithm. It is a necessary attribute - you can't have a "good" password-hashing algorithm whose output can be computed arbitrarily quickly.
What you might be saying is that if an inefficient implementation is slow, we'd like to speed it up. Sure. The requirement of a good algorithm is that the most efficient implementation possible is slow.
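To make that concrete (a minimal sketch using Python's standard-library scrypt rather than the bcrypt mentioned above): the work factors are inputs to the algorithm itself, so any conforming implementation, including the attacker's, has to pay the same CPU and memory cost per guess.

```python
import hashlib
import os

password = b"correct horse battery staple"
salt = os.urandom(16)  # unique random salt per password

# n (CPU/memory cost), r (block size), and p (parallelism) are parameters
# of the algorithm, not of one implementation: even the most efficient
# possible implementation must do this much work per attempt.
digest = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1)
print(digest.hex())
```

Raising `n` increases both the time and the memory required per guess, which is exactly the property that makes brute-forcing expensive regardless of how well the attacker's code is written.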
Resistance to quickly creating a rainbow table with significant coverage, however, is a reasonable property.
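A quick sketch of why that property holds in practice (assuming per-password random salts, as modern schemes require): two users with the same password get different digests, so one precomputed table cannot cover both.

```python
import hashlib
import os

password = b"hunter2"

# Same password, two different random salts: the digests differ, so a
# rainbow table built for one salt is useless against the other.
h1 = hashlib.scrypt(password, salt=os.urandom(16), n=2**14, r=8, p=1)
h2 = hashlib.scrypt(password, salt=os.urandom(16), n=2**14, r=8, p=1)
print(h1 != h2)
```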
The mentioned bcrypt lets you "tighten the ratchet" over time as average computing power increases, so that hashing a single password takes about the same wall-clock time on 2020 hardware as it did on 2010 hardware (assuming you correctly increase the cost factor along the way).
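The same ratcheting idea can be sketched with the standard library's PBKDF2 (used here instead of bcrypt only because it ships with Python; the iteration counts below are illustrative, not recommendations): doubling the iteration count roughly doubles the time per guess.

```python
import hashlib
import time

password = b"s3cret"
salt = b"per-user-random-salt"  # illustrative; use os.urandom(16) in practice

# As hardware gets faster, you raise the iteration count so that one
# hash still takes roughly the same wall-clock time per attempt.
for iterations in (100_000, 200_000, 400_000):
    start = time.perf_counter()
    dk = hashlib.pbkdf2_hmac("sha256", password, salt, iterations)
    print(f"{iterations} iterations: {time.perf_counter() - start:.3f}s")
```

Stored hashes made with the old, lower cost are typically re-hashed at the new cost the next time the user logs in.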