The idea of representing a program by a declarative computation graph of matrix transformations has been big in deep learning research for a few years. Theano [1] is the canonical example, though more recently the space has gotten bigger with other frameworks including CGT [2] and now TensorFlow. The computational graph abstraction is really nice in part because it gives you very straightforward automatic differentiation, which dovetails nicely with the trend in machine learning of casting a wide range of algorithms as (stochastic) gradient descent on some loss function.
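To make the "straightforward automatic differentiation" point concrete, here's a toy sketch of reverse-mode autodiff over a computation graph in plain Python. It's illustrative only — the `Node`/`backward` names are made up for this sketch and have nothing to do with Theano's or TensorFlow's actual APIs, which operate on symbolic tensors rather than scalars:

```python
# Toy reverse-mode autodiff sketch (hypothetical names, not a real framework API).
# Each Node records its value and, for each parent, the local derivative
# of this node with respect to that parent.

class Node:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent_node, local_gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Node(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Node(self.value * other.value,
                    [(self, other.value), (other, self.value)])

def backward(output):
    """Push gradients from the output back through the graph via the chain rule.

    Good enough for this simple expression; a real implementation would
    visit nodes in reverse topological order to handle shared subgraphs.
    """
    output.grad = 1.0
    stack = [output]
    while stack:
        node = stack.pop()
        for parent, local_grad in node.parents:
            parent.grad += node.grad * local_grad
            stack.append(parent)

# f(x, y) = x * y + x  =>  df/dx = y + 1,  df/dy = x
x, y = Node(3.0), Node(4.0)
f = x * y + x
backward(f)
print(x.grad, y.grad)  # 5.0 3.0
```

Once every primitive op knows its local derivative, the framework can differentiate any graph you build out of them — which is exactly why declaring the graph up front pays off.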
Thanks! Yeah, though in fact my idea is more about expressing the actual (matrix/tensor) operations themselves as dataflow, rather than just the dataflow between such operations.
But it might be a bit of a different problem area :)
[1] http://deeplearning.net/software/theano/
[2] http://rll.berkeley.edu/cgt/