Hacker News
The iPhone killed my inner nerd (theverge.com)
48 points by merlinpierce on June 30, 2017 | 57 comments


There's precious little contemplation in this piece. It's like things are just happening to the author, like waves crashing over his head. Really what he's saying is that his "inner nerd" (I'll never get used to, or give in to, the enthusiastic adoption of "nerd" and "geek", etc.) never really had much interest in technology beneath the level of practical communications for home/hobby use. His interest in those things never drove him further down the stack. But there's always going to be a layer you can go to where tinkering and fiddling are still required.

So his point is taken: for people whose main interest in technology was effectively for personal organisation and home entertainment there is no longer much need, or use, for technical knowledge. Perhaps that will close one pathway into technology. Others will open up.


I wonder if this will have an effect on how kids get into programming. Back when I was a kid we had MS-DOS. My primary motivation was to install and play games, but there was this easy built-in pathway of: hey, you can do a lot more with this machine if you want, including making your own games.

Nowadays consuming and developing are so disconnected that it seems almost impossible for the younger generations to just stumble upon it like many people did back then.


The curiosity is still there in kids though, and that's what matters when it comes to learning game development. Around the age of 11 or 12 I got curious, and discovered Game Maker, which at the time was a very simple little tool with a simple graphics editor and some action/event blocks you could drag in order to make things move. Making some basic puzzle games with this was easy, but anything more complicated required diving into the built-in scripting language GML (Game Maker Language), which I've long since forgotten.

After that I also played with Flash games, disassembling them with some strange tool into a kind of assembly-like code, where I tweaked values and instructions to see what happened.

Sometime around 15-16 years old, Minecraft became popular at my school. This was what tipped me over to more traditional programming, because there was already a modding community which provided tools for easy deobfuscation of its classes. Instead of revising for my GCSEs, I made a small number of mods, including Elemental Creepers, which became popular enough to earn a small amount of ad revenue. Something like £20-30 a month for nearly 2 years wasn't great money, but as a kid it maintained my phone contract and a library of games on Steam.

EDIT: Can't forget Ludum Dare, switching from Java to C# after my first LD experience improved my programming a lot even though the languages have a lot of similarities.

Sure, kids these days may not just happen to stumble upon the code of a game by opening its BASIC file in a text editor, but with curiosity and the internet it is very easy to begin learning from a young age. For me, this all started ~11 years ago, but I'm sure everything I did with my curiosity is still possible for kids today.


I think the kids who are likely to become good programmers are already tinkering with things. They will discover what they can do with a computer, one way or another. If anything, getting started in programming is easier these days, since you have a full dev environment in your browser and tutorials everywhere.


I would like to think that the industry learned the error of its ways.

The 1980s and 1990s were prime times for kids getting into programming. Early personal computers came with BASIC. Early HTML and JavaScript were simple enough for a child to get into. That seemed to fall apart in the 2000s when programming languages did not come standard with personal computers and the growing complexity of web development made it increasingly difficult to create anything exciting by the standards of the day. Those were also the times when programming books geared towards kids seemed to disappear.

Of course many of the modern efforts are much more artificial since they rely upon programming languages geared towards kids. Yet that is the price of complexity. Regardless of what the article's author argues, modern devices are much more complex under the hood. Much of that complexity bleeds into the tools that would appeal to the inner nerd. Languages geared towards kids both reduce that complexity and add to the excitement of creating more sophisticated programs. Children who learn them will have the basics to leap into "real" programming languages.

The other thing that we have witnessed over the past few years is a renewed interest in books geared towards children of most ages. For a while the best that I saw were titles geared towards teens and determined tweens. Now it is possible to find Python (and Ruby) books that mirror some of the BASIC books that were written for children in the early 1980s.

Things are getting better, even if walled gardens are leading some people to claim otherwise.


For me, it was way harder to get into programming in the '90s. As a kid in that decade I really wanted to make my own games and software, but being on a Mac I couldn't find a way to cross the gap: you didn't have BASIC like before with the Apple II or MS-DOS; it was full-featured GUIs with expensive compilers or nothing. Now any kid can open the dev console in a browser and write JavaScript, or even create basic HTML files. Maybe we should encourage a more quick-and-dirty approach and avoid recommending frameworks and complex build processes for hobbyist projects.
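To make that concrete: the kind of thing a kid can paste straight into the browser's dev console today, with no compiler, framework, or build process, is just a few lines like this (a hypothetical toy example of my own, not something from the article):

```javascript
// A tiny guessing game typed straight into the dev console (F12).
// "secret" is a made-up fixed value for this sketch.
function guess(n) {
  const secret = 7;
  if (n === secret) return "You got it!";
  return n < secret ? "Too low" : "Too high";
}

console.log(guess(3));  // "Too low"
console.log(guess(7));  // "You got it!"
```

From there the same quick-and-dirty approach extends naturally to dropping the function into a plain HTML file, still with nothing to install.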


What about HyperCard?


But even in the MS-DOS era things were disconnected, because with the exception of things like DONKEY.BAS, nearly every game was a sourceless executable. On minicomputers and early microcomputers, games like "Star Trek" and "Lemonade Stand" were written in BASIC, meaning you could dive in and modify the games you had just been playing.


> sourceless executable

This doesn't matter; the point is that the option to do more was always present. Kids are curious, and if the option exists, at least a few will find it and try to understand it.

Sure, you didn't have the source code to Doom, but QBasic was always there, and to copy games off your friends and even run them you had to understand at least a few basic commands and intricacies of the OS. Not just tap store > tap install.


This was precisely the concern that led Eben Upton to develop the Raspberry Pi.

The idea was to make a cheap, small device that booted to a prompt, so that you interacted with it the same way you would have interacted with the DOS prompt. You could then have a few of these in schools, etc., so children could play with them without the concern that they would brick an expensive piece of kit.

It's clearly ended up vastly more successful than was originally envisaged. It remains to be seen, of course, whether it has inspired another generation of hackers or not.


That said, I notice that a lot of the developers I work with who are gamers always seem to know a lot more than me about networking. I assume it's related to setting up LAN parties and the like.


Maybe this is changing, but in my generation, gamers are typically more proficient in a lot of the low-level nuts and bolts. Modding was such a big force in PC gaming, and typically the tools to do it were a little sketchy and rough.

I remember hacking around in data files for Civilization II, or poking at python scripts for Mount & Blade, way before I learned any proper programming.


Also, modern cars with their proverbial weld-shut hoods have killed my inner mechanical engineer.


Get a Jeep Wrangler. Though the engine is highly computer controlled and "hands off", you can have a lot of fun bolting everything you can imagine onto the Jeep itself.

I added a pop-top roof, solar panels, dual batteries, a water tank, pump and filtration system, interior cabinets, a fridge, new suspension, bumpers, a winch and a whole lot more to mine!

All the details here: http://theroadchoseme.com/the-jeep

I designed and built the whole thing, which was channeling my inner engineer!


Oh hey! Following you on YouTube. Fan of the wiki. Didn't know you were on HN too.


Cheers! Don't hesitate to reach out if there is something specific you would like me to photograph/film/write about!


Some cars (like the Miata) are still specifically meant to remain more or less tinkerer-friendly, but yes, unfortunately the trend seems to be doing everything possible to force you to visit the stealership for even the most basic of repairs (headlight bulb replacement, oil change, etc.).

I would pay a healthy premium for a solid off-road-capable vehicle that wasn't intentionally as locked down as possible. I imagine one of the chief challenges is manufacturer fear of NHTSA or EPA retribution when someone releases an ECU image that makes a Prius roll coal in exchange for a bit more power. I doubt this is going to change without explicit regulation protecting tinkering.


Volvo tried that with a concept car designed to meet "the particular needs of female drivers": https://en.wikipedia.org/wiki/Volvo_YCC


And it's funny, because that's the case with basically everything; yet at the same time, being a maker is fashionable, and for a reasonable amount of money people can and do get themselves a '70s industrial-level workshop in a lot of fields that used to be trade-only.


Can you be a little more specific? I find modern cars to be more fun precisely because of the closed-source stuff. It's more rewarding (for me, at least) when I figure out a CAN bus message or how to flash my ECU with DIY adapters.


That sounds more like IT-work than mechanical engineering.


> I’d sit smugly reading my emails on a train with my iPAQ or one of the original HTC Pocket PC devices with a stylus. I couldn’t download apps from an app store for these phones because those stores didn’t even exist yet.

This killed me. I used to run an 'App Store'-like site for Pocket PC around this time which allowed installing apps OTA; usually this required running the installer on your desktop PC and syncing to the device. I knew lots of people wanted this but didn't know how to let them know about my site. The Web Archive has a copy, but only the home page :(

http://web.archive.org/web/20051201023911/http://www.cabfile...


I recognize the domain, I'm pretty sure I used your site all the time.

I usually got there from a search after looking for something specific though.


Awesome :D


Now that I've grown up and free time has become scarce, I tinker a lot less than I used to, but I recently enjoyed a weekend burst of the "old times" when I offered to build a hackintosh for a friend.

I'm a Linux and Windows guy, and the last time I used a Mac was in pre-OS X days, so I knew nothing about modern Macs, the ecosystem, what the 'go to' apps were, or even how to use one (thank god it has a terminal and a semblance of Unix underneath!).

Anyway, my friend's a designer and he needed a new Mac but didn't have the cash on hand to buy one. One evening I offered to google around and see what the options were for building your own machine and sticking OS X on it. From that, I discovered the hackintosh scene, did some reading up and decided to have a go.

I researched parts for his budget (~€800 with monitor), he bought them off Amazon and then I set about getting it working one weekend (side note: amazing how cheap hardware is these days - managed to get 16GB RAM, a 500GB SSD and a 24" monitor in there for that price!).

Turned out to be a lot of fun, doubly so with the constraint that I didn't have a mac on hand to bootstrap with (virtualbox to the rescue!).

It was a nice little nerdy project. It didn't take too long to do, provided a few interesting things to learn/geek out on, gave me some sweaty moments fighting with Clover configs and kext files, and yielded a happy, tangible result at the end.

So Apple managed to re-kindle my inner nerd for a weekend at least :)


> Windows 8... probably wouldn’t exist if it wasn’t for the iPhone.

I highly doubt Microsoft would have given up on new versions of Windows if the iPhone hadn't been released.


I suspect he means that without the success of the iPhone, and then the iPad, the emphasis in Windows 8 on having a "touch" interface might not have been there.


Yes and no, I guess. Yes: the iPhone/iPad showed what to do properly with a touch screen on a phone or other portable device. No: if they hadn't done it, I am fairly (if not 100%) sure somebody else would have. I had an industrial touch-screen PC more than 10 years before the iPhone existed. Even then, using it with a non-touch-aware OS, I reckoned there was a future in it: that thing, but smaller and flatter and faster and fitting into your pocket, wasn't too hard to imagine. I'm by far not the most creative person, and definitely not alone on the planet, so if even I could imagine something like my TI calculator but with a touch screen, a mail reader and my GameBoy games, others could as well, and better. So that's what happened, and Apple was basically first. And immediately spot on.


This - so much this.

Touch-screen tech was coming along nicely before the iPhone. I worked in a call center around '99 or 2000. We were using something that seemed to be a very powerful, BlackBerry-reminiscent, touch-screen-powered computer. Most of the time these were used to grade the reps' phone calls, adding notes as necessary. Smartphones seem like a natural outgrowth of this sort of technology merging with something folks were already beginning to carry with them: mobile phones.


Quite possibly we would have been saved the pain of the start screen, "Metro" apps, and maybe Cortana.


>Apple’s iPhone has been on the market for 10 years now, and it hasn’t experienced a single instance of a mass malware attack like we’ve seen twice in the past month on Windows PCs.

>Sure, there have been vulnerabilities, bugs, and near misses, but nobody has been forced to pay $300 to unlock their iPhone after a huge malware attack.

I feel the article is disingenuous in comparing a phone to a PC.


I dunno - your average iPhone is a powerful PC running a complex OS with a (largely) permanent network connection and the owner almost certainly will cough up $100 to unlock given how much people rely on their phones these days. That sounds like a perfect target for hackers to me.


If all the data on the phone is in the cloud (as it is for most users), unlocking would just be a factory reset of the device.

Also, while it is indeed a powerful networked computer, it does a lot less stuff in the background than a PC (and does no "legacy" stuff at all). Unlike a PC, where a thing like SMB is expected to work (even though it's old and buggy).


Yes, but it's not analogous to a PC. Comparing an iPhone, an Android phone and a Windows phone would be more apt.


Remember that "wrong" idea that Windows has a lot of viruses because it's more popular?

This is why it's valid.


I stopped tinkering when I grew up, which just coincided with when the iPhone became usable.


Well, not a single reference to Linux in the article. It bugs me, because one of the reasons I use it as my primary OS is that it keeps my "inner nerd" alive. I believe a similar essay could be written with software in mind. Apps like WhatsApp/WeChat are reducing the pressure we used to feel to find a solution to a given problem via a specific tool. These apps are hammers we use for everything, even for driving screws into the wall.


Why would he reference Linux? His point was that his needs are met without having to tinker.

His needs drove his inner nerd. Introducing Linux into his environment so that he would be forced to tinker would be backwards.


So maybe I didn't get his point at all. I thought he was upset with the fact that new gadgets were turning him into a plain user (and a consumer...) and reducing the tinkering, inner-nerd facet of his life. If he is pleased that he no longer needs to build/keep those 5 boxes in his room and can just have an iPhone in his pocket, well, then I missed the point badly.


I read it as just a fact-of-life statement about how the iPhone has obviated his need to tinker. He may feel nostalgic about it, but I didn't read "upset" in there... certainly not that he's looking for ways to make his life more nerdy by contriving reasons to tinker.


Ok, fair enough.


I get it. I only started programming when the first iPhone came out, and I started tinkering a few years before that. Even so, I long for the sense of wonder that existed when everything was hard, or at least required some thought and ingenuity.

I know the Raspberry Pi and such are a way to do that, but it feels so artificial and fake.

(No, this comment really has no other point but to evoke diffuse nostalgic feelings.)


I started with a TI-99/4a back in 1983 at six years old; I've forgotten most of what I learned as a child tinkering with BASIC and later with DOS/Windows machines. I learned early on that programming wasn't for me, but I love hardware hacking and use those skills daily.

> I know the raspberry pie and stuff are a way to do that, but it feels so artificial and fake.

The Raspberry Pi came about in reaction to the obscure, unsupported, difficult-to-grok dev boards that polluted the landscape. With the release of the Pi, suddenly we had a dev board with a maintained and supported Linux distro, GPIO that was thoroughly and correctly documented, and a community that actually cared about the device. Fast-forward five years and it has stayed the dominant "consumer developer" device because it has opened up more and more, while still being 100% supported by its makers rather than closed and abandoned after every minor revision (looking at you, Hardkernel).

If anything, the RPi has revitalized programming as an educational path for young children, and I applaud them for that.


The BeagleBoard and BeagleBone predated the Raspberry Pi, and those were well documented and supported. They're actually better documented because they don't depend on proprietary Broadcom firmware to boot (a Free software replacement is in development but not yet usable for most people). The Raspberry Pi was more successful mostly because it was so much cheaper.


Yep, I was close to buying a BeagleBoard when the RPi was announced, and I went with the Pi instead. At the time, the BeagleBoard was the more powerful and entrenched option, and it was fine for experts, but it was a non-starter for beginners. That's the magic formula the RPi Foundation put together: Cheap (basically loss-leader), easy to understand and bootstrap, and fully supported by the creators.

I get that the Pi is looked down upon by some because of the closed nature of its GPU, but most of the other SBC/SoC manufacturers are also releasing closed source boards and some are even violating the GPL. I believe Bunnie Huang's Novena is the only 100% open source (hardware and software) dev board on the market today.


The Raspberry Pi and similar are not fake if you actually want something of this form factor (not full PC size, and not a smartphone you would build into the wall...) to run some particular job in your house or elsewhere.


You guys are all comparing hardware platforms. Isn't it the OS that makes the difference? Linux is the most developer-friendly these days, to the point that it's almost expected you know the basics to get some stuff done.


Yeah, but it might feel fake if you just download a bunch of libraries, install them and call it a day.


On one hand you could take advantage of available tools to make your job easier. On the other you could do everything from scratch for personal satisfaction. Neither option discounts the other, but I think the author comes off as whiny when he clearly chose the former over the latter.


Yeah, because having an always-on AI connected to your entire house to turn off the lights is definitely easier than a switch.


Honestly, this recent nerd obsession with using hideously complex technology to control your home lighting is extremely funny.

It's the idea of going to all this effort just to be able to do some not-very-useful things. You're running an entire embedded computer to control a single light bulb. Just hilarious.


Similar arguments can be made for most new technologies. When the critique isn't over utility, it is almost certainly over disproportionately high costs. The important things are to understand why things are done the way they are and how it will progress in the future.

I suspect that the long term utility for home automation is for managing energy use, rather than the gimmicky stuff that we are being sold on today. The reason for cramming a microcontroller into it probably has a lot to do with cost. Dealing with automation from top to bottom centrally would mean rewiring buildings and dealing with mains. Dealing with automation at the appliance level means that you can tailor the design to the requirements (e.g. cheaper components that handle less power if the device needs less power). Having a microcontroller means that you can use off-the-shelf components for networking.


It's like the bip-bop phones vs. the iPhone.

I get why you want the iPhone now. I didn't get why you would buy the bip-bop phone then.

Yet you need people who do buy the impractical, expensive, mostly useless, polluting first versions of a concept for it to develop into something society needs.

My first comment still stands, though.


The author realizes he can't do anything worthwhile with his technical skills that isn't already provided by the market (reading email on phones). The market is working, but you lose that sense of eliteness and accomplishment.

I'll bet that many people who used to get excited about building Linux based home media servers are now satisfied with Netflix or Prime.


Another thing not mentioned in the article or comments is the great Apple SDK APIs. Those “ruined” development for me, in the sense that most else looks terrible in comparison. (I was not familiar with macOS development before iPhone OS).


"have been impacted by the iPhone. Chromebooks are locked down with an app store". Windows 10 S apparently as well. The damage done by Apple making this acceptable is huge.


This author is the founder of winrumors.

While he was running an Exchange server, I was learning to program because I couldn't afford server software. I wanted to be the guy running an Exchange server too. I wanted to run all sorts of stuff I couldn't afford. But I got my hands on Visual Studio 97 instead.

I was home-schooled, and worked as a tech at a sort of server junk shop in Dallas (what many may not know about Dallas is that it's a city built on telecom). They would purchase computers from companies going out of business, refurbish them, and resell them. I was 14, and paid in cash or store credit. My room was also full of servers and networking equipment. I bought as many of the servers that weren't selling as I could.

My friends and I watched the 1995 film "Hackers" on repeat as if it were a new religion. We had LAN parties most weekends, and a lot of my equipment became "mobile". At any one time I always had a "best" server that was stripped down for transport (which changed every time I could get my hands on something more powerful) and a bucket of switches and Ethernet cables. Reminiscing is fun.

By the time the iPhone was released, I was 7 years into my professional programming career, writing desktop and server software. I completely ignored the iPhone. The idea was cool, it certainly looked cool, but the system itself was an appliance. I didn't believe in computing appliances. I still have little to no interest in mobile as it currently exists, and I don't regret it.

In my opinion, Apple's innovation wasn't the iPhone. It wasn't the App Store. It was getting carriers to subsidize a $1,000 pocket computer and call it a phone. It makes sense: if you were building a $1k computer (that is actually worth a grand) to fit in consumers' pockets, you'd want to build in a store so you could make your money back. This business model already existed in gaming consoles (where a majority of the money is made licensing software). It was obvious it was going to sell like hotcakes. Unfortunately, this cemented the iPhone's appliance nature. They HAVE to be accessible. They HAVE to be secure. They HAVE to have long battery life. Those qualities can only be maintained if they are viciously protected. I can get my Windows machine to do whatever I want; the limit is physics and my imagination. I can get an iPhone to do what Apple approves.

From a consumer perspective I understand that mobile is awesome. Everything is bite-sized and convenient. From a programmer's perspective, I can't help but see your phone as a toy computer, and "apps" as software appetizers. That may change, and I would welcome the change. As it stands, the walled garden, battery life, and touch inputs keep it at toy status for me.

I'm 34 now. When I started my career, the young programmers were 34. I was mentored by greybeards in their 40s and 50s. Now 34 is the beginning of being old. I don't meet many programmers over 40 anymore. I do the work the "young" guys are afraid of. I write the heavy math. I do the memory optimization. I do the complex multi-threading. I build the novel features. I write the code you can't copy and paste from Stack Overflow. I fix the fuck-ups. Maybe that's because the iPhone never killed my inner nerd.

When I was young I programmed because I couldn't afford the real stuff. Now when I program (outside of paid work) it's because the next real stuff doesn't exist yet.

I'll be really sad if the future truly is your phone as an interface streaming data to and from cloud processing. I can't help but feel like it's a step back.



