Well, you didn't just find a cave; it was made for you by other people. Interdependence is a hallmark of social species such as Homo sapiens. Even your caveman ancestors were probably reliant on one another in many ways.
>It seems that someone asked the great anthropologist, Margaret Mead, “What is the first sign you look for to tell of an ancient civilization?” The interviewer had in mind a tool or article of clothing. Ms. Mead surprised him by answering, “a healed femur (thigh bone)”. When someone breaks a femur, they can’t survive to hunt, fish or escape enemies unless they have help from someone else. Thus, a healed femur indicates that someone else helped that person, rather than abandoning them and saving only themselves.
>It really clued me into how the language got such a following in HR/Finance applications. You got an array of everyone's hours per day. Weekly total hours, pay, tax, Medicare, etc., were all broken out in little calculative statements based on the array of hours.
Yeah I guess APL is kind of 'spreadsheets as a Real Programming Language', isn't it?
>I was at a talk that had something to do with Unix. Fortunately, I’ve succeeded in repressing all but the speaker’s opening remark:
>I’m rather surprised that the author of sendmail is still walking around alive.
>The thing that gets me is that one of the arguments that landed Robert Morris, author of “the Internet Worm”, in jail was all the sysadmins’ time his prank cost. Yet the author of sendmail is still walking around free without even a U (for Unixery) branded on his forehead.
I'm guessing this is referring to the fact that early versions of Fortran stored each routine's return address in a fixed memory location (at the end of the function's code, IIRC) instead of on a call stack. This is why those versions of Fortran couldn't do recursion: a recursive call would overwrite the old return address with the new one.
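To make that concrete, here's a toy simulation (in Python, purely illustrative, not actual Fortran semantics): each routine gets exactly one static cell for its return address, so a "recursive" call clobbers the outer caller's return address.

```python
# One fixed "memory location" per routine for its return address,
# the way early Fortran compilers laid it out next to the code.
return_slot = {}

def call(name, body, return_to):
    return_slot[name] = return_to   # a second call overwrites the first!
    body()
    return return_slot[name]        # "return" goes to whatever is here NOW

# A routine that calls itself exactly once:
def f():
    if not getattr(f, "done", False):
        f.done = True
        call("f", f, return_to="inside f")  # clobbers the outer return address

final = call("f", f, return_to="caller")
print(final)  # "inside f" -- the outer return address ("caller") was lost
```

With a call stack, the inner call would push a new frame instead of reusing the single slot, and the outer return address would survive.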
Yep, before ASCII was standardized, it was common for machines to be built with word-addressable memory and words that were multiples of six bits. Two octal digits easily represent a six-bit byte, just as two hexadecimal digits easily represent an eight-bit byte,
but they didn't represent lowercase (!). (2⁶ = 64 positions would have allowed you to represent lowercase, but you'd have to sacrifice a whole lot to do so: the alphanumerics alone would use 62 positions, leaving you with maybe one position for a space and one punctuation mark, and no newline...)
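The arithmetic in that parenthetical, spelled out (Python, just as illustration):

```python
# Two octal digits span exactly a 6-bit byte; two hex digits, an 8-bit byte.
assert 8 ** 2 == 2 ** 6 == 64
assert 16 ** 2 == 2 ** 8 == 256

print(format(0b101101, "02o"))    # "55" -- a 6-bit value in two octal digits
print(format(0b10110110, "02x"))  # "b6" -- an 8-bit value in two hex digits

# Why 6 bits can't comfortably hold both cases:
alphanumerics = 26 + 26 + 10      # a-z, A-Z, 0-9
print(64 - alphanumerics)         # 2 -- positions left for space, punctuation, newline...
```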
Apparently EBCDIC derives from IBM's 6-bit BCD codes, and it's interesting because it uses 8 bits, successfully represents lowercase, and leaves a ton of (non-contiguous) positions unspecified.
Maybe our standards (no pun intended) are just shifting as we deal with more and more capable software, but I'd be inclined to say that seven bits "easily encode" standard English text, and six don't, on account of the lack of case distinction. (Although you could certainly choose to handle that with control characters, and I'm sure some 6-bit systems did so.)
Yeah, once you go to six bits (five is weird because it's an odd number), you really start looking at seven, and seven is also odd (which is why seven bits + parity took off for a while). But now you have eight, and that should be good enough for anyone.
Interestingly enough, B talks about how the new computer can address a “char” and not just a whole word at a time.
It’s an ASCII approximation of the proper typography for single quotes (except when you’d use inverted quotes for some reason; not sure how common that is in English).
Similarly, LaTeX uses `` and '' for double quotes, because the opening and closing symbols are not the same in properly typeset text.
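For example, the convention in LaTeX source vs. what gets typeset:

```latex
% LaTeX input:
He said, ``It's `quoted' text.''
% typesets roughly as: He said, “It's ‘quoted’ text.”
% (backtick(s) open a quote, apostrophe(s) close it)
```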
>My guess is that Microsoft has a secret new OS (written from scratch) that's super modern and efficient and they're just waiting for the market opportunity to finally ditch Windows and bring out that new thing. I doubt it'll ever happen though because for "new" stuff (where you have to write all your stuff from scratch all over again) everyone expects the OS to be free.
https://aliexpress.com/item/1005003741287162.html