You are correct that a Turing machine, given enough time, can perform all the computation required by TensorFlow. So, if you don't care about time and cost, you are right that no new paradigm is required.
But since a TensorFlow network does not _require_ the full functionality of a Turing machine, it _allows_ you to restrict yourself to more specialized computational models. In doing so, you can run your system several orders of magnitude faster. (And TPU-style processors are still at an early stage; expect at least a few more innovations to widen this gap further.)
This comes with a few changes:
- A lot of algorithms (for instance those using recursion) are hard to transfer to a "tensor machine", making some skills obsolete.
- Many, if not most, problems are solved using machine learning techniques, which demand more statistics/data science/maths background than many pure programmers have.
- Simultaneously, the new approaches are almost equally alien to old-school statisticians, as the problem definitions (in particular the kinds of patterns to look for) tend to be quite different.
- Also, the ways to approach problems, and to estimate which problems can or cannot be solved and how complex they are, are quite different, requiring new intuition.
- New development is to a large extent fueled by changes in hardware (CPU -> GPU -> TPU -> ?), and I think problem solving will increasingly be about seeing possibilities across both hardware and software, instead of treating them mostly separately.
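To make the first point concrete, here is a minimal sketch (using NumPy as a stand-in for a tensor framework; the function names are my own) contrasting the two styles. The loop version relies on explicit, element-by-element control flow, which is natural on a general-purpose machine but hard to parallelize; the tensor version expresses the same computation as one fixed operation over a whole array, which is exactly the shape of work GPUs/TPUs accelerate.

```python
import numpy as np

def relu_loop(x):
    # "Turing machine" style: explicit control flow, one element at a time.
    out = []
    for v in x:
        out.append(v if v > 0 else 0.0)
    return out

def relu_tensor(x):
    # "Tensor machine" style: a single dataflow op over the whole array,
    # with no data-dependent control flow to get in the way of parallelism.
    return np.maximum(x, 0.0)

x = np.array([-1.0, 2.0, -3.0, 4.0])
print(relu_loop(x))    # [0.0, 2.0, 0.0, 4.0]
print(relu_tensor(x))  # [0. 2. 0. 4.]
```

Both produce the same values, but only the second maps directly onto specialized hardware; algorithms whose structure resists this kind of flattening (deep recursion, irregular pointer-chasing) are the ones that transfer poorly.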
In sum, I think the change in emphasis, skills, methodology, etc. is enough to be considered a paradigm shift.