Ah, I see what you mean. The problem is indeed sampling: for performance reasons the graph only plots a point every 8 pixels each frame, so any feature of the curve narrower than that interval produces weird rendering artifacts. Unfortunately that will remain the case until I have a parser fast enough to sample every pixel.
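To illustrate the aliasing being described (this is a generic sketch, not the game's actual renderer; names like `sample_curve` are hypothetical):

```python
import math

def sample_curve(f, x_min, x_max, width_px, step_px):
    """Sample f at one point every `step_px` pixels across the viewport.
    Any feature of the curve narrower than step_px screen pixels can be
    missed entirely between samples, which causes the artifacts above."""
    pts = []
    px = 0
    while px <= width_px:
        x = x_min + (x_max - x_min) * px / width_px
        pts.append((x, f(x)))
        px += step_px
    return pts

# A fast oscillation: sin(50x) over [0, 10] on an 800px-wide graph.
coarse = sample_curve(lambda x: math.sin(50 * x), 0, 10, 800, 8)
fine = sample_curve(lambda x: math.sin(50 * x), 0, 10, 800, 1)
# The coarse pass captures far fewer of the oscillation's extrema,
# so the plotted polyline misrepresents the curve.
```

With one sample per 8 pixels you get 101 points across 800 pixels instead of 801, and a curve oscillating faster than the sample spacing renders as near-random zigzags.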
Could you decouple rendering from simulation? I'd guess so, since it's easy to test whether a point lies above or below the curve. If the simulation itself were reliable, the level would be (somewhat) playable even with rendering artifacts.
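The point-vs-curve test mentioned here really is cheap: it takes one evaluation of the function, independent of how coarsely the curve is drawn. A minimal sketch (the helper name `below_curve` is made up for illustration):

```python
def below_curve(f, x, y):
    """True if the point (x, y) lies strictly below the curve y = f(x).
    One evaluation of f suffices, so collision/physics can be resolved
    exactly at the simulated point even if rendering samples sparsely."""
    return y < f(x)

# Against the curve y = x*7 from the thread:
assert below_curve(lambda x: x * 7, 1.0, 5.0)      # 5 < 7, below
assert not below_curve(lambda x: x * 7, 1.0, 9.0)  # 9 > 7, above
```

This is why simulation accuracy and rendering fidelity don't have to be coupled: the physics can query the exact curve at the sled's x-position while the plot remains an approximation.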
I'd also suggest making the visual sampling reliable (samples anchored in absolute space, growing in number as you zoom out), though I'm not aware of the actual performance implications.
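One way to read "samples in absolute space, growing with zoom out" is sampling at fixed world-coordinate positions rather than fixed screen pixels, so the sample set is stable under panning and grows as the visible range widens. A sketch of that idea, under my own assumptions about what's meant:

```python
import math

def absolute_samples(x_min, x_max, samples_per_unit=16):
    """Sample positions anchored to world (absolute) coordinates.
    Zooming out widens [x_min, x_max], so the sample count grows
    proportionally; snapping to multiples of the spacing keeps samples
    stable as the view pans, avoiding frame-to-frame flicker."""
    spacing = 1.0 / samples_per_unit
    start = math.floor(x_min / spacing)
    stop = math.ceil(x_max / spacing)
    return [i * spacing for i in range(start, stop + 1)]
```

The performance cost is the open question raised above: doubling the visible range doubles the number of function evaluations per frame, which is exactly where a faster parser would matter.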
I still think something else is going on besides sampling error. Why does it (visually) work correctly for x*7 until I sled? Why does it change so dramatically over time?