Well, when I used VHDL back in college, I noticed that it had a really hard time with math. It could do lookup tables all day (basically switch statements), but if you tried to encode that logic as the kind of arithmetic we're used to in a C-like language, where A = B (insert operator here) C, it fell down hard: the circuit would be so unstable that it would only run a few cycles before spinning off into some exceptional state that was nothing like what we expected.
I think that's because humans have a hard time considering the ramifications of things like boundary conditions and edge cases with respect to types. We can visualize one register being added to another, but we can't intuitively extrapolate what happens when one is signed and one is unsigned, or their widths differ, or one is floating point, etc etc etc. VHDL doesn't handle these edge cases gracefully (for one thing, they're hard!); it just does exactly what it's told. That often flies in the face of intuition, once we've analyzed the circuit and seen how much we underestimated the complexity of what we were asking for. In other words, elegant math doesn't always translate to simple circuits, and vice versa. So what's really needed is a meta language that can grapple with these subtle nuances and compile down to VHDL without a lot of friction.
Probably what's going to happen is we'll see DSP logic (and limited subsets of it like GPU shaders/OpenCL/CUDA) and VHDL/Verilog merge into a functional concurrent language that covers all of it. It won't be as explicit as Rust, because it will infer what the user is after while allowing default assumptions to be overridden. It won't have opaque syntax either, like most functional languages today. I'm thinking it will look more like MATLAB/Octave but with access to some of the more concise notation of Mathematica. So think Excel, except with cells arranged arbitrarily in some N-dimensional space rather than 2D, where we can specify formulas over groups of cells rather than individually, in whatever language we desire, which is then compiled to Lisp and either run on distributed commodity hardware or translated to a hardware description language. CλaSH probably isn't it, but its approach and open-source license are certainly a start.
Realized I didn't answer the question - different, yes, but probably not different enough to be compelling for mainstream use at this point. Never having used it, I still have concerns that circuits will fall down or take up gratuitous chip area, because handling the edge cases is one of the harder problems to solve, and I'm not convinced that functional programming alone is enough.