Out of curiosity, does that mean GPUs are nondeterministic only in their runtime execution behavior (timing, scheduling), or also in the actual output they produce? If the latter, is it because the nondeterministic ordering of floating-point operations causes small output differences (since floating-point addition isn't associative), or something else entirely?
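
To make my guess concrete, here's a minimal CUDA sketch of the mechanism I have in mind (the kernel and setup are just an illustration I wrote, not anything from the article): many threads accumulate into one float with `atomicAdd`, so the order of the additions depends on scheduling, and since float addition isn't associative, repeated runs can produce slightly different sums.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each thread atomically adds its element into a single accumulator.
// The hardware decides the order in which the atomics land, so the
// effective summation order differs from run to run.
__global__ void atomic_sum(const float* x, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) atomicAdd(out, x[i]);
}

int main() {
    const int n = 1 << 20;
    float *x, *out;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&out, sizeof(float));
    for (int i = 0; i < n; ++i) x[i] = 1.0f / (i + 1);  // varied magnitudes

    for (int run = 0; run < 3; ++run) {
        *out = 0.0f;
        atomic_sum<<<(n + 255) / 256, 256>>>(x, out, n);
        cudaDeviceSynchronize();
        printf("run %d: %.9f\n", run, *out);  // low bits may vary per run
    }
    cudaFree(x);
    cudaFree(out);
    return 0;
}
```

If that's the mechanism, I'd expect the differences to show up only in the last few bits of the result, but to compound in long iterative computations.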