
"Be careful" is not very actionable. Here are some things you can actually do:

- Perform periodic external security reviews

- Fuzz all untrusted/user-controlled inputs

- Use static analysis tools

- Maintain a security bounty program

- Send your employees to security training

- Use a memory safe programming language



I'd add: adopt an information-theoretic approach to security analysis. (Which is basically what taint analyzers do.) Think through how systems/components/libraries/functions can and do interface with each other, and try to secure those points. (Make them type safe, make them strict, report meaningful errors ["expected this but got that" is a million times better than "invalid input"], so they stay easy to maintain and to harden further.) Try to factor these parameters out as much as possible so you can avoid impedance mismatches across the interfaces.
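A minimal sketch of what "strict, type-safe interface points with meaningful errors" can look like in practice. `parse_port` is a hypothetical validator, not from any particular library:

```python
def parse_port(value):
    """Parse a TCP port number, failing loudly with what was expected."""
    if not isinstance(value, str):
        raise TypeError(f"expected str, got {type(value).__name__}")
    if not value.isdigit():
        raise ValueError(f"expected decimal digits, got {value!r}")
    port = int(value)
    if not 1 <= port <= 65535:
        raise ValueError(f"expected port in 1..65535, got {port}")
    return port
```

The point is that a caller (or a log reader) immediately sees both the expectation and the offending value, instead of a bare "invalid input".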

Also, checklists. Checklists are good. And an inventory of the components in use, with their versions. (This makes it easy to do a CVE review from time to time, and eventually to automate that review, so only the list maintenance remains manual.)
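A sketch of the inventory idea, assuming a simple `name==version` line format (the format and function name are illustrative):

```python
def load_inventory(text):
    """Parse 'name==version' lines into (name, version) pairs."""
    components = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        name, _, version = line.partition("==")
        components.append((name, version))
    return components
```

Each (name, version) pair can then be fed into an automated vulnerability lookup (e.g. against the OSV database), leaving only the list maintenance as manual work.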

Defense in depth, but not through obscurity. (There is usually low-hanging fruit: enforce the use of password managers, invest in centralized credential storage, don't overdo password expiration and 2FA. Security training is also a good idea, but the real goal is to nurture a security-aware office/team culture.)

Social engineering [or just plain old laziness] is still a serious threat.

Timeboxing. Set aside 1-2 days every month to work on meaningful security-conscious goals: lay the groundwork for that library upgrade that has been overdue for years, make systems reproducible (also good for DR), add a few simple validations here and there against local file inclusion (or whatever comes up during the month, or during the checklist review).

Also, accept that maintaining network-facing systems has an inherent ongoing cost. (Unless you want your iToaster to eventually end up as part of a botnet.) Sometimes we have to let things go and accept that some business models (or hobbyist projects) are not worth doing sanely and securely.


That is a nice list and all, and I would add that if you can't use a memory-safe programming language, then you should look closely at the compiler flags in use as well.
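For example, a commonly recommended set of hardening flags for GCC/Clang builds looks something like this (the exact set depends on your toolchain and platform, and the file names here are placeholders):

```shell
# Stack canaries, fortified libc calls (needs -O1+), PIE for ASLR,
# and read-only relocations with eager symbol binding.
CFLAGS="-O2 -Wall -Wextra -fstack-protector-strong -D_FORTIFY_SOURCE=2 -fPIE"
LDFLAGS="-pie -Wl,-z,relro,-z,now"
gcc $CFLAGS $LDFLAGS -o app app.c
```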

But if you need all that to spot the obvious issue in the OP's original specification ... then wow.




