
Can this be viewed as a visualization of temporal data aligned according to their timestamps, enabling human observers to detect useful patterns of correlation?

What does the ROS 2 support offer? I guess it's to display ROS messages or events in the same fashion?



Yes - most of our visualizations are aligned with scrubbing through time (with a playback bar, similar to YouTube but with customizable layouts). Some of the visualizations display data over time regardless of the current playback timestamp (e.g. plots, maps).
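To illustrate the scrubbing behavior described above: a panel rendering the "current" state at the playback cursor typically shows the latest message at or before that timestamp. A minimal sketch of that lookup (illustrative only - names and types are assumptions, not Foxglove's actual internals):

```typescript
// Hypothetical sketch: given messages sorted by timestamp (seconds),
// find the one a panel should render at the current playback cursor.
interface StampedMessage {
  stamp: number;    // message timestamp in seconds
  payload: unknown; // decoded message body
}

// Binary search for the latest message at or before `cursor`, so
// scrubbing the playback bar always shows the most recent known state.
function messageAtCursor(
  messages: StampedMessage[],
  cursor: number
): StampedMessage | undefined {
  let lo = 0;
  let hi = messages.length - 1;
  let best = -1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    if (messages[mid].stamp <= cursor) {
      best = mid;     // candidate; look for a later one
      lo = mid + 1;
    } else {
      hi = mid - 1;
    }
  }
  return best >= 0 ? messages[best] : undefined;
}
```

Plots and maps, by contrast, would query a whole time range rather than a single cursor position.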

We support connecting to different data sources - ROS 1, ROS 2 (which has a completely different wire protocol), WebSockets, HTTP, local files (.bag format), and we recently added native Velodyne connections (which means Foxglove Studio can also replace VeloView). Our data connection support is pluggable, so for robotics teams not using ROS it is easy to add support for other formats.
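The pluggable data-source idea can be sketched as a small interface that each backend (bag reader, WebSocket bridge, Velodyne connection, etc.) implements. This is an illustrative sketch only - `DataSource`, `topics`, and `fetch` are made-up names, not Foxglove Studio's real API:

```typescript
// Hypothetical data-source abstraction: each backend exposes its
// topics and serves stamped messages for a requested time range.
interface TopicMessage {
  stamp: number; // timestamp in seconds
  data: unknown; // decoded message payload
}

interface DataSource {
  // Names of the topics/channels this source provides.
  topics(): string[];
  // All messages on `topic` with start <= stamp <= end.
  fetch(topic: string, start: number, end: number): TopicMessage[];
}

// A trivial in-memory implementation, standing in for a .bag reader
// or a live ROS connection; useful for tests or custom formats.
class InMemorySource implements DataSource {
  constructor(private store: Map<string, TopicMessage[]>) {}

  topics(): string[] {
    return [...this.store.keys()];
  }

  fetch(topic: string, start: number, end: number): TopicMessage[] {
    return (this.store.get(topic) ?? []).filter(
      (m) => m.stamp >= start && m.stamp <= end
    );
  }
}
```

With an interface like this, supporting a non-ROS format is just a matter of writing one more implementation; the visualization layer never needs to know where the messages came from.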

Edit - responding to "enable human observers to detect useful patterns of correlation": Humans can use this in many ways; it depends on what question you are trying to answer. For example, to drastically over-simplify - "why did my robot fall down the stairs?" - it might be because your sensors were dirty and the images or laser scans were bad; it might be because your ML model incorrectly detected or classified objects; it might be due to bad predictions about other objects (probably not for stairs); it might be due to a bad planned path; it might be due to poor localization. All of these things are easier to diagnose visually than by stepping through code in a traditional debugging sense.




