Got it. Yeah, bucketing by use case wouldn't be that hard; you could have a system for rotating through them. I think Artifactory has some built-in capabilities for aliasing, presenting multiple repos as if they were one, etc.
In any case, if I rolled my own hashed-package scheme with debs, I'd have to build this piece of the tooling regardless.
With S3 you can create lifecycle rules that move files to cheaper storage and eventually delete them.
You could potentially create two buckets: one for throwaway pipeline builds, and another for builds that graduate to something you want to keep.
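For the throwaway bucket, a lifecycle policy along these lines would do it. This is just a sketch: the rule ID, day counts, and storage class are placeholders, and applying it needs boto3 plus AWS credentials.

```python
# Hypothetical S3 lifecycle rule: tier objects to infrequent-access storage
# after 30 days, delete them after 90. Numbers are illustrative.
lifecycle = {
    "Rules": [
        {
            "ID": "expire-throwaway-builds",   # placeholder rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},          # apply to the whole bucket
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"}
            ],
            "Expiration": {"Days": 90},        # delete outright after 90 days
        }
    ]
}

# To actually apply it (requires boto3 and credentials):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="throwaway-pipeline-builds",      # hypothetical bucket name
#     LifecycleConfiguration=lifecycle,
# )
```

The keep-forever bucket simply gets no expiration rule, or only a transition to cheaper storage.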
It wouldn't be very hard to build some tooling; the files in the cache already carry almost all the metadata you need: http://cache.nixos.org/0ljamf3irbyahd00849b2v1cdddypn8a.nari...
But because it's all hash-based, you'd need something to read all of that into a database. I'm not aware of any tooling that does that today.
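That ingestion step is mostly just parsing: a .narinfo file is plain "Key: value" lines, so loading it into something queryable like SQLite is straightforward. A minimal sketch (the sample text and store-path name below are made up for illustration, not a real cache entry):

```python
import sqlite3

# Illustrative .narinfo content in the "Key: value" format served by
# cache.nixos.org; the package name and sizes here are invented.
SAMPLE = """\
StorePath: /nix/store/0ljamf3irbyahd00849b2v1cdddypn8a-example-1.0
URL: nar/example.nar.xz
Compression: xz
FileSize: 12345
NarSize: 67890
"""

def parse_narinfo(text):
    # Each line is "Key: value"; split on the first ": " only, since
    # values (like store paths) can themselves contain colons.
    return dict(line.split(": ", 1) for line in text.splitlines() if ": " in line)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE narinfo (store_path TEXT, url TEXT, file_size INTEGER)")
info = parse_narinfo(SAMPLE)
db.execute(
    "INSERT INTO narinfo VALUES (?, ?, ?)",
    (info["StorePath"], info["URL"], int(info["FileSize"])),
)
row = db.execute("SELECT store_path, file_size FROM narinfo").fetchone()
```

From there you could walk the bucket listing, fetch each .narinfo, and build up the index you'd need for retention decisions.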