In the beginning, if you wanted a computer, you had to design and build it yourself. You had to be an expert in electronics and computer science to do it. And back then, there were no off-the-shelf computer parts. You had to manufacture everything yourself from vacuum tubes. Once you were done, you would have a tiny amount of memory and processing power to work with, meaning you had to be an expert programmer to take advantage of it. And there were no stored programs. Your program was implemented as an arrangement of cables on a plugboard, something that was error-prone and time-consuming.
There were no compilers or interpreters for your code, and therefore no compiler errors, no runtime errors, no debugger, no anything. If things didn't work, they just didn't work, and some kinds of mistakes would have disastrous effects on the hardware.
Now you don't need any of that, yet you can still call yourself "full stack". No one is truly full stack anymore.
No one is saying to take it back to assembly, but at least try to understand how asynchronous code works if you're going to program professional applications in JavaScript.
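To make that point concrete, here is a minimal sketch of the asynchronous behavior in question. The names `delay` and `getAnswer` are made up for illustration; the behavior itself (async functions returning Promises that must be awaited) is standard JavaScript.

```javascript
// Hypothetical helper: resolves with `value` after `ms` milliseconds.
function delay(ms, value) {
  return new Promise((resolve) => setTimeout(() => resolve(value), ms));
}

// An async function always returns a Promise, even if you `return` a plain value.
async function getAnswer() {
  return delay(10, 42);
}

// Common mistake: calling an async function without awaiting it.
const notTheAnswer = getAnswer(); // this is a Promise, not 42

async function main() {
  const answer = await getAnswer(); // `await` unwraps the resolved value
  console.log(notTheAnswer instanceof Promise); // true
  console.log(answer); // 42
}

main();
```

The mistake on the `notTheAnswer` line is exactly the kind of thing that bites developers who skip the fundamentals: the code runs without errors, but the value is not what they think it is.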