And it was learnable. I mean, you could learn every detail of the language with a reasonable effort. Just read through the manuals, which are extremely well written. Not many modern languages you can say that about.
There was a conversation in a Ruby channel about this recently.
I mentioned that if 15 years ago you had suggested that in 2012 you would need HTML, CSS (and probably LESS), JS (and probably JQuery + CoffeeScript + Backbone), SQL or Mongo and Ruby or PHP or similar just to do a simple form application - we would have laughed.
I got shot down - "it's a web application now, so not comparable".
But I really think it is comparable. The number of technologies you need to know reasonably well to do a basic business function online is nuts.
To make it worse, the number of flavours of languages and components involved, and the requirement to hire vertical developers (specialising in one language) rather than developers who understand the full domain, mean that business web apps are becoming unmaintainable more quickly.
These tools and languages in the hands of startup teams with above-average commitment/experience/knowledge are fantastic, but for general business use they are a big step backwards.
I've noticed several sites recently that are including more than one version of JQuery (and a copy of Prototype) on the same page - which is a symptom of this to me.
Don't get me started on how I could press F9(?) 15 years ago and Turbo Pascal would build in 2 secs on a 386. It takes irb that long just to start on this 8-core machine.
I don't know if this is really a fair comparison. You can do simple forms in plain HTML if you want to, after all. No CSS required, no Ruby or PHP, no scripting, no nothing.
If you want to persist the form submissions you need to have some logic (which brings in a programming language) and a database (which brings in SQL, or maybe structured text files), but that's just as true in a single-user or client-server application as it is in a Web application. The only differences are that the fifteen-years-ago application probably used a language like Visual Basic rather than PHP or Ruby, and a simple embedded database engine like Jet (http://en.wikipedia.org/wiki/Microsoft_Jet_Database_Engine) rather than a standalone database server. But the basic concept of a form-based application hasn't changed much, and the scope of knowledge you need to build one hasn't changed much either.
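To make the comparison concrete, here's a minimal sketch of that "logic plus a simple embedded database" shape of form application, using Python's built-in sqlite3 as a stand-in for an embedded engine like Jet; the table and field names are made up for illustration:

    # Persist a form submission to a local embedded database; no server needed.
    # (Illustrative names only: "forms.db" and the "submissions" table are not
    # from any particular product.)
    import sqlite3

    def save_submission(name, email, db_path="forms.db"):
        conn = sqlite3.connect(db_path)
        try:
            conn.execute("CREATE TABLE IF NOT EXISTS submissions (name TEXT, email TEXT)")
            conn.execute("INSERT INTO submissions (name, email) VALUES (?, ?)", (name, email))
            conn.commit()
        finally:
            conn.close()

    save_submission("Ada", "ada@example.com")

The scope of knowledge is the same as it ever was: a form, a bit of logic, and somewhere to put the rows.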
> I've noticed several sites recently that are including more than one version of JQuery (and a copy of Prototype) on the same page - which is a symptom of this to me.
This is more of a symptom of idiots having access to programming tools. Idiots haven't changed much in the last fifteen years either. There were plenty of line-of-business apps written in (say) VB whose code would make you scream "Burn it! Burn it with fire!"
> I mentioned that if 15 years ago you had suggested that in 2012 you would need HTML, CSS (and probably LESS), JS (and probably JQuery + CoffeeScript + Backbone), SQL or Mongo and Ruby or PHP or similar just to do a simple form application - we would have laughed.
You can still laugh today, either at the large amount of complexity people invoke for simple tasks or at the notion that the minimal set of these needed to accomplish the task ("form application") is really complex or hard. 8 years ago you'd use, say, HTML with PHP run from Apache (and Apache is largely out of your mind if you're using shared hosting...), and either text files or a simple SQL table to persist the data. (Or, depending on the form, just email the data.) You can still be this minimal today, or even more minimal, e.g. with Flask, which comes with its own "Apache", even if you shouldn't use it in production. And the basic CSS and JS needed to make the app "pleasant" aren't difficult to learn, and with a bit of discipline it's not hard to keep them from turning your page into a complex hairy ball of code.
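For what it's worth, here's roughly how small that minimal path still is, sketched with Flask (the route, filename, and field name are mine, purely illustrative); it serves the form and appends submissions to a plain text file, using Flask's bundled development server rather than Apache:

    from flask import Flask, request

    app = Flask(__name__)

    FORM = '<form method="post"><input name="email"><button>Sign up</button></form>'

    @app.route("/", methods=["GET", "POST"])
    def signup():
        if request.method == "POST":
            # Persist to a plain text file; a SQL table or an email would do just as well.
            # "signups.txt" is just an illustrative filename.
            with open("signups.txt", "a") as f:
                f.write(request.form.get("email", "") + "\n")
            return "Thanks!"
        return FORM

    if __name__ == "__main__":
        # Flask's built-in dev server: the "comes with its own Apache" convenience,
        # not something you'd use in production.
        app.run()

That's the whole "form application"; everything beyond this is optional polish.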
8 years ago was also when AJAX hype was at its greatest. Now it's more about "framework hype"; I keep this in mind: http://marc.info/?l=php-general&m=112198633625636&w=... "Before you blindly install large "AJAX" libraries, have a go at rolling your own functionality so you know exactly how it works and you only make it as complicated as you need."
The web is pushing programming in a good direction, but it puts more pressure on the programmer. A good webapp not only functions well and looks good, but it's composed of components (the DOM) which are malleable and understandable by other programs. The important program is the Googlebot, but there are countless other, lesser programs (such as a bevy of bookmarklets, and even programs like Evernote) which benefit from having our applications be parsable.
The biggest thing holding back web programming is not JavaScript, but CSS. Making stuff look good is way too hard. I know a lot of programming languages, but CSS is by far the hardest language I've ever had to learn - the interpreter is a black box, properties are strangely coupled, and there is no error reporting or insight into the layout engine. Not to mention that it is difficult if not impossible to accurately know which selectors apply to an element short of actually running the page and checking in a tool! But I digress.
The fact is that we're paying the price now, but I think it's worth it.
> I mentioned that if 15 years ago you had suggested that in 2012 you would need HTML, CSS (and probably LESS), JS (and probably JQuery + CoffeeScript + Backbone), SQL or Mongo and Ruby or PHP or similar just to do a simple form application - we would have laughed.
This is a major reason I don't do any web programming. I don't exactly know where to start any more, and every few months when I think about throwing something together, there's been some change in what's fashionable - whatever was hot at the beginning of the year is now 'considered harmful' and so on.
I'm actually a bit encouraged by Adobe Edge because I like their tools and Flash had a decent run, so I may get back into it. But I am so not into technology for its own sake.
The problem is that 15 years ago, had you suggested that most apps would run server-side, with an interface exposed in a browser, people would have laughed at you regardless.
People don't really remember that 15 years ago it was a monumental feat to build something usable, without having the resources of Microsoft.
I'm not so sure. 15 years ago there was a lot of hype going around about Java, the "Network Computer", and similar initiatives. Sure, they didn't work out, but a lot of people and companies believed in them, so it didn't seem so far-fetched.
> The number of technologies you need to know reasonably well to do a basic business function online is nuts.
While this is true on the high end there is also a massive amount of help/tutorials/documentation available today which circumvents this issue for most applications. In many situations all someone really needs is basic understanding and the ability to google effectively to find the proper snippets of code to copy/paste. Few applications get to the point in either scalability or complexity where deep knowledge of the involved technologies is required.
> you would need HTML, CSS (and probably LESS), JS (and probably JQuery + CoffeeScript + Backbone), SQL or Mongo and Ruby or PHP or similar just to do a simple form application
People are working on that. For example, I've started to learn Clojure and recently wrote a very simple web app using ClojureScript, Noir, Korma, and a couple of helper libraries like fetch and crate. Using that stack I managed to write an entire web app without ever leaving Clojure. Everything felt like Clojure, and the borders between client, server, and database were rather fuzzy.
While I'm sure using Clojure for the entire stack simplifies things for pros like you, it is not an adequate replacement for newbies. You still need to know Clojure (and the Lisp way of doing things), Java, JavaScript, HTML, CSS, and SQL to understand the system. Leaky abstractions indeed.
HTML, CSS, and SQL are all fantastic domain-specific languages. It's the split between frontend languages like JS and backend languages like Ruby that's the real problem.
CSS is a waste of space and JS is rather unfortunate (though modern web frameworks are pretty good at abstracting it), but 15 years ago you'd still have wanted a declarative form language (though whether you could spare the resources for it is another question), a separate datastore, and a language for actual programming logic.
That's not true. I remember trying to learn Turbo Vision, the text-based GUI library that came with it. The reference documentation was awful, and even trying with the help of a book I couldn't understand anything.
I very much disagree with this. Java is much more modern, but you can learn the entire language in some months of work.
I will grant you that it will take longer to learn all the libraries you want to use, along with the entire Java standard library, effective design of Java programs, etc.
But the languages haven't gotten much bigger over the years.
I have a 2 foot tall stack of Java manuals here that beg to differ :)
Java is big; the Java ecosystem is huge.
Sure, it does more than turbo pascal, but if you're able to go from 0 to learning the entire language in a few months of work then you're a pretty hard worker!
Going from 0 to knowing Turbo Pascal (language, ecosystem, IDE) inside out in a few months is no remarkable feat.
Sure it is. However you don't have to learn all of it.
I never learned much about Swing, AWT, JSP, JSF, EJB, Struts, Spring, Hibernate and all that crap. I tried them out at some point, but eventually I had the common sense to just say no.
You're basically comparing a language with the entire ecosystem of another. I remember trying to make MOD files play in Turbo Pascal, which left me with about 200k of memory for whatever else I wanted to do. I remember trying to read BMP files and, because I had no libraries available, having to implement the functionality on my own (and forget about GIFs or JPEGs). I remember trying to create a window with a button in Windows 3.11, and for some reason whatever I did froze my computer, so I had to reboot all over again, until finally giving up in agony.
I know at least one man who gave up on programming when the transition to Windows happened, because Borland Pascal was so goddamn awful for anything not involving a text interface running in 286 real mode that it left him with the impression that programming was not for him.
Yes, you can go from 0 to "knowing turbo pascal" in a short time. However that doesn't mean much, when in that same time you can accomplish so much more in a modern environment, even if you end up just scratching the surface.
I was thrown into Turbo Pascal at university. A horrible language, to be honest, as it was utterly painful to produce any reasonable data structures in it. The verbosity usually broke the memory constraints of the machine just before you finished what you were doing, meaning hours of search-and-replace on identifiers to shorten them and free up RAM for the editor. Ick.
It actually made me long to get back to an m68k-based Sun 3 with a C compiler which took literally 2 minutes to compile hello world if some asshat wasn't running Emacs on the same box and swapping to disk 100% of the time...
However what they managed to cram in was impressive.
Hmm, I don't have that experience; TP worked well for me and I created quite substantial commercial projects with it, which made me a 'wealthy' student at university. Especially when they introduced overlays, we were able to use all those kilobytes :) I have good experiences with it and I still have all the sources of the software I wrote then (a lot of it is still sold, and a lot I ported to Delphi afterwards and is also still sold). Do you have concrete examples of what you are referring to? I used structures, but never too complex, as I came from assembler (Z80/68k) and didn't (and don't) really see the use of very complex structures. Reasonable ones worked fine.
That Sun workstation probably had multiple megabytes of RAM and cost 4x more than a fully loaded PC. The 68k had a flat 32-bit address space and even virtual memory.
TP was heroic for what it could do in the 16-bit segmented x86 architecture in well under 640k of RAM. It was the first high-level language implementation that made "real" developers consider using something other than hand-coded asm.
Apart from the editor problems, it was used to write a control program for a CNC mill which talked to an ISA card that controlled stepper motors. The program took a basic description of the part to be milled and converted it into tool movement steps. This was all 2D, but the parts were quite complicated, as were the calculations. Tool movements were what killed it. We started with a list-of-lists structure, but that took up too much RAM, so we moved to a skip list implementation. That also took too much RAM, so it was back to a singly linked list and an external preprocessor. So it turned from an elegant tool into a few chunks of ugly which had to be chained. That really doesn't work well on DOS. This was before EMS etc., so 640k of segmented vomit it was. It was eventually moved to a dedicated 68k SBC with the UI in Turbo Pascal (over a serial line), which worked well, and we could throw 4MB at it and write the control program in C using a cross-compiler on Unix (plus no segments = bliss).
As I recall it, when TP came out it meant that you no longer had to choose between BASIC and 16-bit segmented-mode asm.
Not many people had heard of C back then and Pascal was considered a very well-designed language. The PC C compilers of the time were somewhat lousy. Few people doing PC development had access to a proper *NIX machine.
I also used Turbo Pascal at university (around '90 / '91), and while I can remember some growing pains in terms of learning to program, I can't recall any of the problems you experienced. My memory is fuzzy on the details, but I think there was a time in the mid-to-late 80s when Turbo Pascal was greatly improved, and it became close to the Object Pascal language which is still in use today.
I'd be guessing you were using the 'pre-improvements' version of TP. I have to admit, I have a soft-spot for Object Pascal, and still use it fairly often (in a project for a long-standing client; the software was originally developed in TP, and was eventually moved to Delphi 7 which is what I use for maintenance to this day).
If you think Turbo Pascal was bad, you should have seen what we used before TP came out -- the UCSD Pascal p-system. Ugh... compared to that, Turbo Pascal was a pure delight to use.
The comparisons the article makes are not apples-to-apples. The authors start with a binary file, and compare it to uncompressed text files. For example, zlib.h on Mountain Lion is 80511 bytes, but only 22031 bytes when gzip compressed (around half the size of Turbo Pascal).
The article's tone implies that these were the 'good old days' when compilers were small and fast. However, priorities have rightly changed. We now have computers that can execute billions of operations per second without breaking a sweat, I/O devices which can perform tens of thousands of IOPS, and laptop GPUs that can push 3D content to a 5.1 million pixel display. It's likely that Turbo Pascal was highly optimized to deal with the constraints of that day's computers. A lot of developer man-hours were almost certainly allocated to shaving extra bytes off the executable. It's usually not worth devoting such man-hours to tiny performance improvements, considering the beefy hardware most people have.
> The article's tone implies that these were the 'good old days' when compilers were small and fast.
That's reading a lot into two short paragraphs. Tone doesn't come across well in text, and the shorter the text is, the harder it is to figure out tone.
I don't hear nostalgia for the 'good old days' in the article, what I hear is, "Wow! What a feat of engineering to fit an IDE and compiler into 40k!" We get the same kick out of watching demo competitions. Have you seen the "Elevated" demo? Including the music, sound effects, textures, everything, it's only 4k...
Knowing the author's other writing (which is, by the way, highly recommended), I would think he isn't trying to say "look how wrong we went, let's make things smaller again" but instead "look how fun that was, but come on, those days are over. Now that we have comparably infinite resources, let's focus on what can be made with them."
tl;dr: The author would probably agree with you more than you think.
I was a big fan of Turbo Pascal. It was a great learning language, and had a robust library that was really fun for beginners to tinker with! I'd almost call it the Python of its time.
That's a great article! I learned to really program on Turbo Pascal, and if I remember correctly, the small executable made it easier to store programs on the "program disk". Though by the time I built my own computer, it had dual floppies so it wasn't as big of a deal.
* The machine I ran Turbo Pascal on had a 115 meg hard drive, which was large compared to my co-workers' 80 meg drives.
* Turbo Pascal could not edit or display Chinese, Japanese, or Korean.
* The editor could not search and replace across files with undo like my current editor. Nor do I think it could do a regex IIRC.
* It could not tag 40k+ files and bring up help on any function in near realtime.
* It could not refactor code.
* The machine ran in 80x50 characters. No windows. I could not view both reference material and my code at the same time on the machine; I needed a book. If I didn't have the book I needed, I had to either go to a library or find a tech bookstore. (I only knew of 3 in all of both Northern and Southern California.)
* The machine I ran it on had no networking.
* I had to manually edit config.sys, autoexec.bat and move DIP switches whenever any new devices were added. There was no USB, only serial and parallel.
* When my program crashed I often had to reboot the machine.
* There were no fonts, everything was text.
* There were no images except a few icons in my apps. No way to easily display a company logo or screenshots in docs.
* There were few if any libraries. If I needed to read an image or parse a file I had to write it from scratch. If I needed to draw some graphics I had to poke registers. If I wanted to play audio I had to do device specific stuff.
I'm sure we could go on listing all the things those machines did not do nor did Turbo Pascal do compared to today's environments with their large libraries and access to all the stuff we've built up today.
Sure, I miss the time when I could basically own the entire machine, but I'm happy that today I can just plug in a USB stick or a digital camera and they just work. That I can participate internationally in multiple languages. That 1-20 meg images are trivial to manipulate and display. That mp3s, wavs, and other audio are ubiquitous, etc.
Those simple days were fun, but a little reflection will show that today's supposed complexity has made it easy to take a lot of things for granted. In the Turbo Pascal days, if you wanted to display a photo it would take hundreds or thousands of lines of code, especially if you wanted it to work on more than just your machine. Today it's step 1: copy the file from the camera; step 2: <img src="path/to/file.jpg" />.
Although I used it in the early nineties, I had a much different experience than you mention. TP had Turbo Vision, a very usable windowing library. You could open code in one window and help in another ... hotkeys opened up the help on the current word etc.
I also dialed into BBSs and downloaded .gifs of Elle Macpherson, so plenty to do. All on a 386/40. I moved up to a 486/40 at some point.
A comparison to Lightspeed Pascal (which was not quite so small but well under 400kB) might be more instructive. (Or Mac Pascal -- an interpreted Pascal IDE available on the Mac even earlier.) It had a proper GUI, could (theoretically) have been localized by editing resources, had undo (one level) and find/replace, the Mac was networked from day one, and so on.
There should be a followup, "Things That Turbo Pascal is Faster Than". Sure to be included are "loading the wikipedia C++ page" and "loading the photo of the iPhone 5 on apple.com".
I was really impressed by this, until it occurred to me that the source code for the C64 Forth compiler I used back in the day was probably considerably smaller than 39K. :)
39K worth of Forth is a very large program. Most Forth code never grows beyond a couple of screens. That doesn't mean there isn't a lot of functionality; it just means that Forth is unusually dense.
They had some in those days; they were called "overlays", although that might have been in the '90s with DOS protected mode, etc. I'm sure the help and so on were in separate files.