We are a team of engineers who previously built the infrastructure and developer tools for self-driving car development at Cruise. We recently launched Foxglove Studio [0] as a better way for roboticists to inspect, visualize and debug robotics data.
At Cruise, we found existing visualization tools in the robotics space (rviz, rqt, etc.) extremely lacking. They were typically desktop Linux (Qt) apps, and required the entire autonomous vehicle codebase to be checked out and built locally - meaning they were inaccessible to teams such as operations, QA, and PM/TPMs. In addition, robotics engineers found the workflow of downloading 50GB+ log files to their desktop just to triage an issue painful.
As a result, we invested heavily in a web-based replacement - building a better, faster UI for our robotics engineers, while at the same time opening data up to everyone in the organization, so that you can share a link to an event in a Slack thread or Jira ticket, and others can quickly see the exact same view on their own laptop or desktop. Cruise open-sourced part of this work, which some of you may be familiar with (Webviz) [1].
Earlier this year, we launched Foxglove Studio, which began as a fork of Webviz. Over the past 6 months, we have been extending Studio with many new visualization panels, first-class extension support, ROS 2 support, an optional desktop app, and internally a port from Flow to TypeScript. Though Studio began as a fork of Webviz, we see it as a separate product - we're focused on building an extensible, community-first project, and we are taking it in a different direction than Webviz (which is tightly coupled to many internal Cruise systems, and as such is difficult for the community to contribute to).
In the future, we're planning to introduce some paid features around team collaboration, data & view sharing, and private extensions. However, the core of Foxglove Studio will always remain free and open source for the robotics community to use.
Feel free to check out our GitHub [2] and Changelog [3] for more details on what we've been up to.
I hacked on the Foxglove sources a few months ago to add support for a visualization I needed, and found the codebase to be top notch. My thanks to the team for the project and for being so helpful on Slack.
The UI seems very thoughtful, it's clearly a high-quality effort! I'm pleasantly surprised by how intuitive it appears at first glance. It's one of vanishingly few examples of a "modern"-looking UI with high information density and minimal whitespace.
Your team also picked the right hero image for the homepage, and I love that it opens up into a YouTube video. It did take me a while to realize that it opens into a video, though - the bright purple "Play Demo Video" button was treated as spam by my brain. I believe that's because the UI in the image is very busy, so there are a lot of higher-priority details that I can focus on/explore until my internal "look elsewhere on the site" timer expires and I scroll off the hero image.
It feels like this does deserve more attention. I suspect people who aren't currently working with self-propelled robots may be skipping this as they aren't able to envision where they'd use it. I'd enjoy exploring it with something like this starter kit, though: https://foxglove.dev/blog/building-and-visualizing-your-firs...
Will have a think about how to make the "Play demo video" option more obvious. Maybe an additional link under the hero image would be more likely to catch your eye.
Can this be viewed as a visualization of temporal data aligned according to its timestamps, one that enables human observers to detect useful patterns of correlation?
What does the ROS 2 support offer? I guess it's to display ROS messages or events in the same fashion?
Yes - most of our visualizations are aligned with scrubbing through time (with a playback bar, similar to YouTube's, but with customizable layouts). Some of the visualizations display data over time regardless of the current playback timestamp (e.g. plots, maps).
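To make the playback alignment concrete, here is a minimal sketch of merging messages from multiple topics into one time-ordered stream that a playback bar can scrub through. All the names and types here are illustrative assumptions, not Foxglove Studio's actual internals.

```typescript
// Illustrative message shape: a topic name plus a timestamp (e.g. seconds).
type Message = { topic: string; stamp: number; data: unknown };

// Merge per-topic message arrays into one globally time-ordered stream,
// as a playback engine might before rendering.
function mergeByTimestamp(streams: Message[][]): Message[] {
  return streams.flat().sort((a, b) => a.stamp - b.stamp);
}

// Collect every message at or before the playhead, so each panel can
// render the latest state for its topic at the current playback time.
function messagesUpTo(stream: Message[], playhead: number): Message[] {
  return stream.filter((msg) => msg.stamp <= playhead);
}

const merged = mergeByTimestamp([
  [{ topic: "/imu", stamp: 1, data: null }, { topic: "/imu", stamp: 3, data: null }],
  [{ topic: "/camera", stamp: 2, data: null }],
]);
console.log(merged.map((m) => m.stamp)); // [1, 2, 3]
```

Panels that ignore the playhead (plots, maps) would simply render the full merged stream instead of calling something like `messagesUpTo`.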
We support connecting to different data sources - ROS 1, ROS 2 (which has a completely different wire protocol), WebSockets, HTTP, local files (.bag format), and we recently added native Velodyne connections (which means Foxglove Studio can also replace VeloView). Our data connection support is pluggable, so for robotics teams not using ROS it is easy to add support for other formats.
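For a rough idea of what "pluggable" means here, the sketch below shows one way a common data-source interface could look: each source (ROS 1, ROS 2, .bag reader, Velodyne, ...) adapts its own wire format into one shared message shape. This is a hypothetical illustration, not Studio's real interface.

```typescript
// Shared message shape every source decodes into (illustrative).
type SourceMessage = { topic: string; stamp: number; payload: Uint8Array };

// Hypothetical common interface each data source implements.
interface DataSource {
  name: string;
  open(): Promise<void>;
  // Deliver decoded messages to the app through a single callback.
  subscribe(onMessage: (msg: SourceMessage) => void): void;
  close(): Promise<void>;
}

// Toy in-memory source, standing in for e.g. a .bag file reader.
class ArraySource implements DataSource {
  constructor(public name: string, private messages: SourceMessage[]) {}
  async open(): Promise<void> {}
  subscribe(onMessage: (msg: SourceMessage) => void): void {
    for (const msg of this.messages) onMessage(msg);
  }
  async close(): Promise<void> {}
}
```

With a seam like this, supporting a new format only means writing one adapter; the visualization panels never see the underlying wire protocol.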
Edit - responding to "enable human observers to detect useful patterns of correlation": Humans can use this in many ways, it depends on what question you are trying to answer. For example, to drastically over-simplify - "why did my robot fall down the stairs?" - it might be because your sensors were dirty and the images or laser scans were bad, it might be because your ML incorrectly detected and classified objects, it might be due to bad predictions about other objects (probably not for stairs), it might be due to a bad planned path, it might be due to poor localization. All of these things are easier to diagnose visually rather than stepping through code in a traditional debugging sense.
Yes, AVS was open-sourced by Uber ATG around the same time that we open-sourced Webviz at Cruise. I haven't used it myself, but from memory at the time we exchanged notes with their team, and I got the impression that Webviz had much more user-configurable layouts, whereas AVS had been designed for developers/admins to configure use-case-specific layouts. I'm not sure whether it has continued to be developed since Aurora acquired ATG.
Compared to both projects (AVS and Webviz), Foxglove Studio is designed to be extensible, and much more use-case and data-source agnostic (for example, we support connecting directly to different data sources such as ROS 1, ROS 2, WebSockets, HTTP, and Velodyne lidars, and can easily add more).
For non-ROS systems, we are working on an Extensions API that will eventually allow writing code to ingest data from a custom source. Currently it's just focused on custom visualizations, not the data pipeline piece yet: https://foxglove.dev/blog/announcing-foxglove-studio-extensi... However, if that's a strong need you have, we can show you where to modify the existing code to add a custom data source (it's possible, just not cleanly exposed into the extensions API yet).
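For a feel of the general shape of a custom-visualization extension, here is a hedged sketch: an extension exports an activate() entry point, and the host passes it a context used to register panels. Every name here (ExtensionContext, registerPanel, "battery-gauge") is an illustrative stand-in rather than Studio's actual extension API.

```typescript
// Minimal stand-in for whatever DOM-like surface the host gives a panel.
type PanelElement = { appendText(text: string): void };

// Hypothetical host context handed to an extension on activation.
interface ExtensionContext {
  registerPanel(options: {
    name: string;
    initPanel: (el: PanelElement) => void;
  }): void;
}

// A toy extension that registers one custom visualization panel.
function activate(ctx: ExtensionContext): void {
  ctx.registerPanel({
    name: "battery-gauge", // hypothetical panel name
    initPanel: (el) => el.appendText("Battery: 87%"),
  });
}
```

The data-pipeline side mentioned above would presumably add a second registration hook alongside panels, but that part isn't exposed yet.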
I used to be a part of the local FRC robotics club (programmer/engineer/last-resort driver), and this is exactly the kind of tooling that I would have killed for back then. Great job on delivering to all 3 major platforms and keeping it open source!
We don't handle these very well today, but it's definitely something we could improve (and we do have some plans in the future for better management of .bag files).
If you have something specific in mind or can share an example of a non-properly indexed .bag file you're dealing with, feel free to file a Github issue or join us on Slack: https://foxglove.dev/community
Happy to answer any questions!
[0] https://foxglove.dev/
[1] https://webviz.io/
[2] https://github.com/foxglove/studio
[3] https://github.com/foxglove/studio/releases