I'm folding in responses to your other comments here. Well met BTW.
- - - -
You found Dijkstra's review. I'm a fan of his, but the guy was hugely arrogant, and IMO he craps on them pretty harshly.
I can kind of see where he's coming from, but IMO he doesn't get it. He misses the point (or maybe HOS sucked compared to what was later described in Martin's book...)
- - - -
FWIW, James Martin went on to write "System Design from Provably Correct Constructs: The Beginnings of True Software Engineering" which is where I learned about all this. I don't actually know much about HOS specifically, only what's in that book. If you're interested that's the thing to read.
- - - -
> So how does this prevent any bugs which aren't already prevented by the language?
The UI would not let the user enter syntactically or semantically incorrect "trees".
In modern terms, if you had a syntax-oriented editor (like Alice Pascal) for a language with good type-checking, I think you would have most of what HOS et al. provided. At the time of Apollo 11 (circa 1969), type-checking was barely a thing:
> In 1969 J. Roger Hindley extended this work and proved that their algorithm always inferred the most general type.
https://en.wikipedia.org/wiki/Type_inference#Hindley%E2%80%9...
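To make that concrete, here is a minimal sketch of the "incorrect trees are unrepresentable" idea, in Haskell (which is not what HOS used; the toy Expr type and the names are mine, purely for illustration). The tree's type carries enough information that an ill-formed or ill-typed node simply cannot be constructed, which is what a syntax- and semantics-oriented editor would enforce interactively:

    {-# LANGUAGE GADTs #-}
    -- Toy expression trees indexed by their result type. A structure editor
    -- built over this type could only ever offer the user nodes that fit,
    -- so a syntactically or semantically bad tree never comes into existence.
    data Expr a where
      IntLit  :: Int  -> Expr Int
      BoolLit :: Bool -> Expr Bool
      Add     :: Expr Int -> Expr Int -> Expr Int
      If      :: Expr Bool -> Expr a -> Expr a -> Expr a

    eval :: Expr a -> a
    eval (IntLit n)  = n
    eval (BoolLit b) = b
    eval (Add x y)   = eval x + eval y
    eval (If c t e)  = if eval c then eval t else eval e

    -- Add (IntLit 1) (BoolLit True)   -- rejected: does not even typecheck
    main :: IO ()
    main = print (eval (If (BoolLit True) (Add (IntLit 1) (IntLit 2)) (IntLit 0)))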
- - - -
> Equating the most trivial kind of errors (which are caught by the compiler anyway) with "all sources of bugs" ignores the kind of bugs which are actually hard to prevent and might slip into production.
Yes, I know, sorry. I admitted that "all sources of bugs" was hyperbole. I was going for rhetorical effect.
> caught by the compiler anyway
First, why wait? If the errors cannot be committed in the first place, surely that's better than detecting them only at compile time? (Sketch below.)
Second, compilers didn't catch those errors back in the day. The whole reason Dr. Hamilton made up this stuff was because existing methods, tools, and technology would have crashed her spaceship.
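Here is a hedged sketch of what "cannot be committed in the first place" might look like today (again in Haskell; Ty, NodeKind, and completions are hypothetical names I made up, not anything from HOS or Alice Pascal). The editor asks what may legally fill a hole of a given type and only offers those choices, instead of letting you type something wrong and flagging it later:

    -- The editor never accepts free text; it offers a menu per hole.
    data Ty = TInt | TBool deriving (Eq, Show)

    data NodeKind = KIntLit | KBoolLit | KAdd | KIf deriving (Eq, Show)

    -- Only node kinds whose result type matches the hole are offered,
    -- so an ill-typed edit can never be entered, let alone compiled.
    completions :: Ty -> [NodeKind]
    completions TInt  = [KIntLit, KAdd, KIf]
    completions TBool = [KBoolLit, KIf]

    main :: IO ()
    main = mapM_ print [completions TInt, completions TBool]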
> ignores the kind of bugs which are actually hard to prevent and might slip into production
Every moment saved by the machine is a moment the humans can use to prevent or detect the errors the machine can't detect automatically.
Here we are back to Dijkstra. You know he only got a physical computer when his colleagues forced him to get a Mac so they could email him, eh?
He held, and I agree, that the kind of errors you're talking about do not happen while typing in the software. They occur "between the keyboard and the chair". If I may wiggle a little, I think of "bugs" as glitches in the machine, while the kind of errors you're talking about I think of as just "errors". But I know that's idiosyncratic, and that most people lump them together as just "bugs".
The only thing you can do about them is think clearly.
- - - -
The original question of this subthread was, "What is software engineering, anyway?"
My answer is, "What Margaret Hamilton did."
My point is that we have had tools that systematically eliminate sources of error. All automatically preventable bugs should be prevented (modulo economic considerations, but here the cost of automating error prevention would be trivial and the benefits and cost savings would be pretty high).
Otherwise, calling ourselves "engineers" is pretty lame. IMO.
- - - -
> If you want to interest people in these ideas, you should show how it can solve real problems.
Real problems, eh? :-) Sending a spaceship to the moon? And getting it back? And no one died?
https://en.wikipedia.org/wiki/Apollo_Guidance_Computer#PGNCS...
To be fair, I don't know to what degree J. Halcombe Laning's software was influenced by Hamilton. She's pictured next to the "software" section of the Apollo Guidance Computer Wikipedia article, but not mentioned.
> The design principles developed for the AGC by MIT Instrumentation Laboratory, directed in late 1960s by Charles Draper, became foundational to software engineering—particularly for the design of more reliable systems that relied on asynchronous software, priority scheduling, testing, and human-in-the-loop decision capability.[14] When the design requirements for the AGC were defined, necessary software and programming techniques did not exist so it had to be designed from scratch.
https://en.wikipedia.org/wiki/Apollo_Guidance_Computer#Softw...
- - - -
Modern IDEs like IntelliJ or Visual Studio highlight syntax and type errors in real time, as you type. You could even avoid typing and just pick tokens from the autocomplete menu. It would be tedious, but it is possible.
So I guess this part of the vision has come to fruition. It is a solved problem. Great!
I just fundamentally disagree with your terminology. Calling syntax errors "bugs", and redefining actual bugs as "errors", does not help anybody. The bottom line is that the major challenge facing software development is not a prevalence of syntax errors.
- - - -
FWIW I suspect that these folks are going to be the coming cutting edge (of true software engineering): https://www.categoricaldata.net/ ... fallout from Applied Category Theory http://www.appliedcategorytheory.org/