
It's never OK. Machines with lots of memory run out of memory too.

To present a piece of research in such trollish terms, accusing "stubborn C developers" of causing security vulnerabilities, and then to offer example code for which "we didn't even have to bother reasoning about how exactly it did its pointer manipulations" is beyond hilarious, because that very code segfaults if it runs out of memory: it dereferences the result of malloc and writes to that location two lines down.



The tone and the example code are ludicrous. If the expressed attitude reflects how they perceive what they are doing, that attitude is going to be the reason their project fails.


On Linux machines, malloc() returning NULL doesn't mean you're out of memory; it means you're out of address space, which is very different.


In a system with virtual memory there are a variety of limits you can hit that will prevent your process from allocating more memory and cause malloc to return NULL. It's normal to informally lump all of these together as "running out of memory" when the distinction doesn't matter, as it doesn't here.

Address space size is one of these limits, not the only one. Hitting it is rare on 64-bit operating systems but common on 32-bit ones.


Unfortunately, I’ve yet to come across a major application on Linux that handles this condition gracefully.


If I understand it correctly, it is even worse: because of overcommit, malloc can return a non-NULL pointer and the process can still be killed when you try to write to the 'allocated' memory, since physical pages are only committed on first write. See for example https://linuxembedded.fr/2020/01/overcommit-memory-in-linux



