Hacker News

Any reasonably complicated project has some pre-processing that has to be done for a source release, e.g. creating the configure script for autotools repos, or building the documentation. These steps can and should be automated, but GitHub does not provide hooks to perform these automated tasks.
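For autotools projects specifically, that pre-processing is conventionally automated by automake's dist targets. A rough sketch of the maintainer-side steps (assuming a standard autoconf/automake project; the tarball name is just an example):

```shell
# Regenerate the build system from configure.ac / Makefile.am
autoreconf --install

# Build a distribution tarball; `distcheck` additionally verifies
# that the tarball itself configures, builds, and installs cleanly
./configure
make distcheck   # produces something like mypackage-1.0.tar.gz
```

The resulting tarball ships the generated configure script, which is exactly what GitHub's auto-generated archives lack.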


Wouldn't the output of that be the "distribution code" rather than the "source code"?

Do your projects not have a README telling people how to build? Are you targeting a nontechnical or beginner audience?


There's a long history of distributing source code already partially processed, and for decades, this has been what people expected when they download a source tarball. Newer generations were introduced to open source through github and are less familiar with this, but they also are rarely interested in source tarballs — they're interested in either the source code to contribute to (in which case they'll git clone) or in a binary distribution.

Really the main purpose of source distributions is for packagers. Especially when you're creating PKGBUILD scripts for Arch Linux's AUR, or packaging for Gentoo, SlackBuilds, MacPorts, or the BSDs, all the tools needed to build the source code become build-time dependencies for your package, which is annoying for people installing it.
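As a concrete illustration (package name and URL are placeholders), an AUR-style PKGBUILD for a project whose tarball ships a pre-generated configure script needs no autotools in its makedepends:

```shell
# Hypothetical PKGBUILD fragment -- name, version, and URL are made up
pkgname=foo
pkgver=1.0
source=("https://example.com/foo-$pkgver.tar.gz")
# no makedepends on autoconf/automake needed

build() {
  cd "foo-$pkgver"
  ./configure --prefix=/usr   # works because the tarball ships configure
  make
}
```

If the tarball were a raw GitHub archive instead, autoconf and automake would have to be added as build-time dependencies and run in build().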


Not having to add autotools to a BuildRequires is nice, but hardly the end of the world as a package maintainer. I agree though, the contents of the repository at a git tag is not necessarily a source DISTRIBUTION - which is what most people who are compiling your software would want - especially when many of them probably don't know the autotools incantations required to generate the configure script.


We probably need to do away with partially processed source and distribute just the git repo (or a tarball without .git but with info about which git rev it is) and binary packages for end-users.


Gotcha, thanks for the explanation!


Shouldn't the person downloading the source code be expected to run those scripts? And typically those scripts are included in the source.


Github isn't a build server. You shouldn't expect it to perform tasks for you. You should provide a CI server to your users if you want that functionality. I've never heard of downloading "source code" that has been preprocessed. The entire point of the repo is that it IS the source code. If you want it to be preprocessed for users, why even have the original form in the repo?


This is explained in reply to your sibling comment, but the convention on Linux for many years has been that the source distribution has automake/autoconf already run. That way, when you download the source tarball, you know the process is `./configure && make` to make the final binary. The way package maintainers for distributions customize things is (hopefully) by providing flags for configure. The idea is that autotools should only be necessary for developers, not for maintainers.
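In other words, from a conventional source tarball the user-facing steps are just (a sketch; the tarball name, prefix, and flags are illustrative):

```shell
tar xf foo-1.0.tar.gz
cd foo-1.0
./configure --prefix=/usr/local   # packagers customize via flags like this
make
make install                      # packagers often use: make DESTDIR=... install
```

No autoconf, automake, or m4 needs to be installed for any of these steps.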


It is unfortunate, but just create a new branch, run autotools, and tag a release there.
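That workflow might look something like this (a sketch; branch and tag names are arbitrary):

```shell
git checkout -b release-1.0
autoreconf --install            # generate configure, Makefile.in, aclocal.m4, ...
git add -f configure Makefile.in aclocal.m4   # -f in case these are gitignored
git commit -m "Add generated build files for release"
git tag v1.0    # GitHub's auto-generated tarball for this tag now includes configure
```

The downside, as the reply below notes, is that generated files now live in version control, which many maintainers consider a dirty hack.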


If by "unfortunate" you mean "totally crazy"... I'd rather deal with the occasional person confused by GitHub kids' ignorance of what source tarballs are meant to be than do such wild and dirty dances in the repository.


TL;DR source code to install from and source code to hack on are often different beasts.


Yeah but those steps should be part of the normal build process. If you have to use weird tools (e.g. autotools) to generate your "real" source code, I'd say the problem is those tools. Also. Seriously. Autotools? In 2016?


Who's going to volunteer porting, say, Wine's 4000 line autotool script[1] to CMake, exactly? You, maybe? (I'm not being sarcastic, if you actually came up with a patch it might be considered...)

[1] http://source.winehq.org/git/wine.git/blob/0f8a0fd4002f9d5d1...


Autotools? In my Linux?

It's more likely than you think.


So what would you suggest in lieu of autotools? The only real mature alternative is cmake, and cmake's main advantage is that it is a cross-platform build system. Basically that means it works on Windows without Cygwin or its equivalents.

If you use something other than cmake or autotools, you're going to end up making the lives of packagers more difficult. If they have to change anything, as they usually do, they're going to have to know how the build system works, and they don't necessarily want to deal with whatever new-fangled build system you're using.

Usually projects using cmake distribute their source code with cmake build scripts because they're taking advantage of cmake's cross-platform features, and cmake generates different build files for different platforms. But this does place a burden on the user to have cmake installed. There's a good argument to be made that that's no longer much of a burden; after all, it's just one package manager invocation away. But you have to think about who's downloading a source distribution (as opposed to binaries or checking out the repository). They're people compiling from source for esoteric platforms, perhaps an embedded system, where a cmake package might not already exist. They're people like me who are writing their own PKGBUILD script because they like to keep their system well organized and not have too many useless packages (like cmake) lying around. They're people creating a new Linux distribution who haven't yet packaged cmake but want to package your software.

And in terms of software design, cmake isn't very good, at least in the eyes of many people writing unix systems software, which as far as mature open source software projects written in C/C++ go, is a lot. Don't get me wrong, cmake certainly has the cross-platform advantage, and if you're making software that you want to compile on windows, you'd be remiss to not at least seriously look into using cmake (not that it's the only option, you could also opt to lean on cross-compilation from a unix platform to windows, distributing binaries only, or requiring users to install cygwin, which isn't worse than having to install visual studio if you don't already use it).

But if you're not taking advantage of the cross-platform features, I'd say cmake is pretty awful. Precisely because it's cross-platform, cmake rejects the typical design principles of unix software, instead offering a monolithic build system that requires special scripts to be written for any different tools you want to use, or even any complicated dependencies (for example: SDL). This is necessary because cmake needs to establish a consistent internal interface that can work for both the windows and unix tools. As a result, everyone has to learn how to use the monolithic cmake system from the ground up, and those skills won't be useful for anything else.

In contrast, autotools, known for having a high learning curve, is really not that bad for people who already know unix. If you understand how to use a unix system, you can figure out autotools faster than you can cmake. And you can really know it, to the extent that you feel confident diving in and messing with its defaults, much sooner than you would with cmake.

But of course, ease of learning isn't the only factor: autotools integrates with external tools using their standard interfaces, while cmake demands you essentially write a wrapper for them. Autotools is simply more flexible.

I don't like autotools, for the record, but given the options available, I'd probably choose it over any other build system in almost all the places it's used currently and more.


> In contrast, autotools, known for having a high learning curve, is really not that bad for people who already know unix. If you understand how to use a unix system, you can figure out autotools faster than you can cmake. And you can really know it, to the extent that you feel confident diving in and messing with its defaults, much sooner than you would with cmake.

I've been using unix since 1992 and find the entire autotools suite to be almost incomprehensible. The low point was in 2014 when I was trying to compile a package that intentionally did not ship a configure file and made the user explicitly call autoconf. This would have been ok except that in the 3 or so years between when the code was released and I was trying to build it, autotools made some backward-incompatible changes and autoconf/automake errored out, with no indication of how to fix those. Eventually I just gave up and used a different package.

Conversely, when I started using cmake (motivated by clion using it as a build system) I was able to get it up and running basically immediately, and 12 months later I appear to have become the cmake guru for my company.


"The low point was in 2014 when I was trying to compile a package that intentionally did not ship a configure file and made the user explicitly call autoconf."

At the risk of having to admit I've had my share of bad experiences with autotools, don't ever do that. That is a big sign with flashing lights and sirens, saying that you should put that package down and leave it alone. Because...

"This would have been ok except that in the 3 or so years between when the code was released and I was trying to build it, autotools made some backward-incompatible changes and autoconf/automake errored out, with no indication of how to fix those."

The scripts are very closely tied to the version of autotools their author was using when they wrote them. It's not at all a good idea to regenerate them with a different version of the autotools.


> At the risk of having to admit I've had my share of bad experiences with autotools, don't ever do that. That is a big sign with flashing lights and sirens, saying that you should put that package down and leave it alone. Because...

And indeed, the exact issue the OP is complaining about is how the zipfile automatically created by GitHub has NOT had autotools run on it yet, and the maintainer cannot even manually overwrite that file!


I have to respectfully disagree. Cmake isn't super amazing and has some flaws and a less than ideal design and language, but it gets the job done. Autotools is more like something that was needed once in the 1980s, when you had to support 300 different proprietary Unixes, and compilers and stdlibs weren't very standardized and had lots of bugs and warts; now it only exists because of inertia.


I too have to disagree. Cmake is crap compared to autotools. Its makefiles don't work if the end user doing the compilation doesn't have cmake installed. Autotools doesn't have any runtime dependencies; that's a design goal that, yes, still matters in 2016.



