
Depends; the input data set could be massive, incorporating things like TV listings data (half-time kettle surges are a huge part of our power consumption profile), weather, and publicly listed events, perhaps scraped from websites, and so quite dirty.

I can see this transitioning quite easily from a 'we can sanitise this data ourselves' job to a 'screw it, let a massive neural net figure this out' one.
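To make the contrast concrete, here is a toy sketch of the "sanitise the data ourselves" approach: hand-pick a couple of features (temperature, a TV-event flag) and fit a plain linear model to demand via least squares. Every feature, coefficient, and data point here is invented for illustration; a real demand-forecasting pipeline would use actual grid and listings data and far richer models.

```python
import random

random.seed(0)

# Hypothetical relationship: colder weather and big TV events raise demand (GW).
# The coefficients 35.0, -0.4, and 3.0 are invented for this sketch.
def synth_demand(temp_c, tv_event):
    return 35.0 - 0.4 * temp_c + 3.0 * tv_event + random.gauss(0, 0.2)

data = [(t, e, synth_demand(t, e)) for t in range(-5, 25) for e in (0, 1)]

# Build the normal equations A w = b for y = w0 + w1*temp + w2*event.
X = [[1.0, t, e] for t, e, _ in data]
y = [d for _, _, d in data]
A = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(3)]
     for i in range(3)]
b = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(3)]

# Solve the 3x3 system by Gaussian elimination with partial pivoting.
def solve(A, b):
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (M[r][n] - sum(M[r][c] * w[c] for c in range(r + 1, n))) / M[r][r]
    return w

w = solve(A, b)
print([round(v, 2) for v in w])  # approximately [35.0, -0.4, 3.0] with this seed
```

The point of the contrast: with clean, hand-engineered features this kind of model is transparent and cheap, but once the inputs are dirty scraped text the feature engineering itself becomes the hard part, which is where "screw it, let a massive neural net figure this out" starts to look attractive.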



My initial best guess is that the style of analysis you're suggesting (TV guides, etc.) doesn't offer sufficient safety margin in the event that a prediction is wrong. If reducing the safety margin were an option, there are probably lots of things that could be done; but if the goal is to maintain a safety margin akin to the current one, my instinct is that the gains to be had are minimal, and that the interesting work is in power storage (battery tech, cryogenic storage, etc.)



