
> Just about everyone is essentially using cartography tools to do large-scale spatiotemporal analysis of sensor and telemetry data. The gaps for both features and practical scalability are massive.

Could you point to any readings or resources that would explain these gaps? I'd be quite curious why our current spatiotemporal analysis techniques would be insufficient. Is it the analysis tools that just need new techniques, or is the problem at the source (i.e. the sensors)? Or?



There aren’t really any problems with the techniques themselves. Some people see that because all of this data can be spatially and temporally referenced, it could be accessed far better than it is, as if there could be one format and one application that let you do anything you want across space and time, regardless of the source.

The reality is that a lot of the models using this data are developed against a specific sensor or dataset, and just don’t work or scale beyond it.

I don’t think this will be solved in this domain by Pangeo or any startup in particular.

There is an awesome new STAC spec (SpatioTemporal Asset Catalog) that every geo company should be adopting, and that’s the direction to move in.
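
For anyone who hasn’t dug into it, a STAC Item is just a small JSON document that says “this asset covers this footprint at this time.” Here’s a minimal sketch built with plain Python, assuming STAC 1.0.0; the id, coordinates, and asset URL are made-up placeholders, not a real catalog entry.

    import json
    from datetime import datetime, timezone

    # Minimal STAC Item: a GeoJSON Feature plus a required datetime and assets.
    item = {
        "type": "Feature",
        "stac_version": "1.0.0",
        "id": "example-scene-20240101",  # hypothetical identifier
        "geometry": {
            "type": "Polygon",
            "coordinates": [[[-105.0, 39.9], [-104.9, 39.9],
                             [-104.9, 40.0], [-105.0, 40.0], [-105.0, 39.9]]],
        },
        "bbox": [-105.0, 39.9, -104.9, 40.0],
        "properties": {
            # The timestamp is what makes a catalog queryable across space
            # *and* time, regardless of which sensor produced the data.
            "datetime": datetime(2024, 1, 1, tzinfo=timezone.utc).isoformat(),
        },
        "assets": {
            "image": {
                "href": "https://example.com/scenes/20240101.tif",  # placeholder URL
                "type": "image/tiff; application=geotiff; profile=cloud-optimized",
            }
        },
        "links": [],
    }

    print(json.dumps(item, indent=2))

The point is that the catalog layer is just JSON, so any tool that can walk these records can index data from any provider without caring what produced it.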

But each step towards standardizing the GIS process will require something that truly everyone can adopt, sort of like JSON.



