A very advanced type system that lets you move a lot of program correctness into the type system. So basically, if your program compiles, it probably works.
It also has a GC, which makes it better suited for most programs than Rust with its manual memory management.
I hear this about both Haskell and Rust, and yet, when I tried both: in the former I wrote a useless program because I didn't handle state (and it still passed all tests!), while in the latter I immediately wrote a deadlock.
Because it is also possible to write tests that don't adequately capture real-life requirements.
It was an MQTT server, and the tests basically went "if we have these subscriptions, then...", but no subscriptions ever actually got stored by the server.
I prefer the slogan without the "probably": "If it compiles, it works", because then at least it's clear it's a slogan and not a formal claim. Everyone knows that if you write
    multiply x y = x + y
then it will compile but not work, so they don't take it literally. But it is a pithy statement of the lived experience of many users of strongly typed languages, which is more accurately described by something like "if it compiles, then it will probably do something at least basically sensible, and often be pretty close to what you actually wanted".
For things that run on Linux and other Unices, yes.
For macOS UI programs, programs that need specific permissions, and commercial programs, stick with Homebrew; but you can declare what you want from Homebrew in Nix.
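For example, with nix-darwin (an assumption about your setup), the Homebrew side can itself be declared in your Nix config, and nix-darwin will drive Homebrew for you:

    # nix-darwin module sketch; cask/formula names are placeholders
    {
      homebrew = {
        enable = true;
        casks = [ "firefox" ];  # GUI apps via Homebrew casks
        brews = [ "mas" ];      # regular formulae
      };
    }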
Is it possible to use Fil-C as a replacement for Valgrind/AddressSanitizer/LeakSanitizer? I.e., say I have a C program that already does manual memory management. Can I then compile it with Fil-C and have it panic/assert on heap use-after-free, uninitialized memory reads (including on the stack), out-of-bounds array reads, etc.?
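For concreteness, here is the kind of bug I mean; whether Fil-C traps at the read below is exactly what I'm asking (I haven't verified its behavior):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        int *p = malloc(4 * sizeof *p);   /* heap allocation */
        if (!p) return 1;
        p[0] = 42;
        free(p);                          /* object's lifetime ends here */
        printf("%d\n", p[0]);             /* use-after-free: UB in standard C */
        return 0;
    }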
> image-manip squash: This is the key to reclaiming disk space and the core of our strategy to squash the image layers. The tool creates a temporary container, applies all 272 layers in sequence to an empty root filesystem, and then exports the final, merged filesystem as a single new layer. This flattens the image's bloated history into a lean, optimized final state.
Wouldn't a multistage Dockerfile have accomplished the same thing? Something like the sketch below.
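(Stage, image, and path names here are placeholders; this is the general pattern, not the article's actual build.)

    # build stage: all the layer-generating work happens here
    FROM ubuntu:24.04 AS build
    RUN apt-get update && apt-get install -y build-essential
    COPY . /src
    RUN make -C /src && make -C /src install DESTDIR=/out

    # final stage: copy only the finished artifacts, so none of the
    # intermediate layers end up in the shipped image
    FROM ubuntu:24.04
    COPY --from=build /out /

The final image then carries a single COPY layer on top of the base, no matter how many steps the build stage took.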
Given that I was already using multi-stage builds in 2020, when I finally got involved in projects using Docker, five years is already some time to have learned about this stuff, and I bet it is much older (not bothering to look it up).
I used pgvector, chunking on paragraphs. The answers I saved in a flat text file and then parsed into what I needed.
For parsing and vectorizing the GCP docs I used a Python script. For reading each quiz question, getting a text embedding, and submitting it to an LLM, I used Spring AI.
It was all roll your own.
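Roughly what the ingestion half looked like, as a hypothetical sketch (table name, connection string, file name, and the embedding stub are all made up, not my actual script):

    # chunk docs on paragraphs, embed each chunk, store in pgvector
    import numpy as np
    import psycopg
    from pgvector.psycopg import register_vector

    def paragraph_chunks(text):
        # naive paragraph chunking: split on blank lines
        return [p.strip() for p in text.split("\n\n") if p.strip()]

    def embed(chunk):
        # placeholder: swap in a real embedding call here
        return np.zeros(768, dtype=np.float32)

    with psycopg.connect("dbname=docs") as conn:
        register_vector(conn)  # teach psycopg the pgvector type
        for chunk in paragraph_chunks(open("gcp_docs.txt").read()):
            conn.execute(
                "INSERT INTO chunks (content, embedding) VALUES (%s, %s)",
                (chunk, embed(chunk)),
            )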
But as I stated in my original post, I deleted it without a backup or VCS. It was the wrong directory that I deleted. A rookie mistake, and I know better.