Here are two views I haven't seen discussed before.
"Latest is best": the latest version of the package is the best version. Users should always choose it, if possible.
The best example I can think of is probably lens. It still tries to support even a pre-release version of GHC 7.6 (see this bit). Well, or at least not break it on purpose.
"There is no best": there can be several versions for different usecases. Versions are not strictly ordered.
An example: a package whose `0.3.x` branch supports older compilers, while the `0.4.x` branch supports newer compilers and has a changed API. But you can use `0.3.x` just fine! It's still being maintained.

All of this doesn't mean old versions are completely as good as new ones. But old versions aren't the devil, either.
I think the latter view is more widely known outside of Haskell. Big projects (e.g. Postgres, any operating system distro, etc.) have to do something like this. Conventions reflect this: you'd be very surprised to find that the fix for a known security bug has not been backported to an older version of an important, or even semi-important, library.
Haskell doesn't have enough big open-source projects for it to be a common case, even though it happens to GHC every once in a while. And a lot of Haskell infra is built with "latest is best" in mind:
`LTS-n` stops being updated when `LTS-(n+1)` begins.

I am squarely in the "latest is best" camp, and I get emotional about it. So now you'll see some emotions.
Things like HVR's "v1.0.3.0 for GHC ≤8.6 and v1.0.3.1 for GHC 8.8" trick piss me off. I'm genuinely mad about this minor thing! To me, that trick is an ugly hack, a violation of existing conventions, an abuse of informal semantics, and, worst of all, it's done simply because it makes the solver's life easier. Fuck the solver.
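For readers who haven't run into it: the trick is, roughly, to publish two patch-level uploads whose `base` bounds don't overlap, so the solver silently hands each GHC a different "same" version. A sketch of the pattern, with a made-up package name and most fields elided (GHC 8.6 ships `base-4.12`, GHC 8.8 ships `base-4.13`):

```cabal
-- some-pkg-1.0.3.0.cabal (hypothetical): only solvable on GHC <= 8.6
name:    some-pkg
version: 1.0.3.0
library
  build-depends: base >=4.9 && <4.13

-- some-pkg-1.0.3.1.cabal (hypothetical): only solvable on GHC 8.8
name:    some-pkg
version: 1.0.3.1
library
  build-depends: base >=4.13 && <4.14
```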
I don't care about things like, say, LTS distros trying to package old versions of Haskell libraries. I don't want distros to package Haskell libraries, period. Yes, there are benefits and drawbacks. No, I don't want it anyway.
For Haskell packages, I really like the assumption that "latest is best". I am relying on it. I hate it when it's violated, especially given that almost all packages I see tend to abide by it. (I haven't analyzed Hackage; maybe I'm missing some obvious counterexamples?)
Anything that doesn't let me blindly choose the latest version of everything has to go. If you need a revision, publish a patch release. If you want to experiment with a different API, create a new package. Etc. This is the underlying reason behind many different opinions I hold about how Haskell should work.
Let's take a topic like "package revisions y/n?" and analyze it from the two viewpoints.
I like "latest is best", and so I am against package revisions.
I dislike CPP. I would love to have a sound syntax-level alternative to CPP (something Rust-like?). Alternatively, CPP can be encapsulated and pushed to the edges with packages like transformers-compat, etc.
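To make that concrete, here's the kind of version-gated CPP I mean, confined to a single compat module instead of being scattered across a codebase. This is a sketch with a made-up module name; it re-exports `liftEither` on mtl ≥ 2.2.2 and defines it by hand on older mtl:

```haskell
{-# LANGUAGE CPP #-}

-- Hypothetical compat shim: all version-specific CPP lives here,
-- and the rest of the codebase imports only this module.
module Compat.Except (liftEither) where

#if MIN_VERSION_mtl(2,2,2)
-- liftEither exists upstream; just re-export it.
import Control.Monad.Except (liftEither)
#else
-- Older mtl: define it ourselves.
import Control.Monad.Except (MonadError, throwError)

liftEither :: MonadError e m => Either e a -> m a
liftEither = either throwError pure
#endif
```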
What I don't want is for versioning concerns to be spread over git branches, Hackage metadata, etc. I want to have a single set of source files that encodes all of those things, and ideally I want it to be in the `master` branch of any package.
🔥 Yes, all packages must use a VCS. Preferably Git. Preferably hosted on GitHub. I won't fight for GitHub, though.
I don't know what a good solution could be. The smallest thing I can think of is for the solver to refuse to choose `0.x.y` if `0.x.(y+1)` is available. This doesn't solve the problem entirely, and is kinda unprincipled, so I don't like it that much.
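A minimal sketch of that rule, over simplified version numbers (everything here is made up; it's not how cabal's solver is actually structured): a version stays eligible only if its immediate patch successor hasn't also been published.

```haskell
import qualified Data.Set as Set

-- Simplified: a version is a non-empty list of components, e.g. [0,3,2].
newtype Version = Version [Int]
  deriving (Eq, Ord, Show)

-- Bump the last component: 0.x.y -> 0.x.(y+1).
patchSuccessor :: Version -> Version
patchSuccessor (Version vs) = Version (init vs ++ [last vs + 1])

-- Keep only versions whose next patch release is not already published.
eligible :: [Version] -> [Version]
eligible versions =
  [ v | v <- versions, patchSuccessor v `Set.notMember` published ]
  where
    published = Set.fromList versions
```

For example, with `0.3.1`, `0.3.2`, and `0.4.0` published, only `0.3.2` and `0.4.0` remain eligible.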
Another solution could be — pervasive Stackage-like snapshots without opt-in from package authors, optimized for bleeding-edge instead of stability. This comes with its own problems, though.
The situation we have in Haskell today is — lots of small single-purpose libraries, and few drastic opinionated changes.
Frameworks like TensorFlow are different. Postgres is different. Etc.
Maybe when we get something TensorFlow-like in our ecosystem, "there is no best" will make sense to me.