Re: [dev] [OFFTOPIC] Recommended meta-build system

From: pancake <pancake_AT_youterm.com>
Date: Fri, 29 Jan 2010 11:32:05 +0100

Anselm R Garbe wrote:
> Well I've heard these reasons before and I don't buy them. There are
> toolchains like the BSD ones and they prove pretty much that the
> "everything is a Makefile approach" is the most portable and
> sustainable one. Running a configure script from 10 years ago will
> fail immediately.
>
>
Don't mix up the configure and makefile approaches. They are different stages.

Makefiles work quite well, but it's not their task to detect or set up
options. There are many approaches for doing this from makefiles.

In suckless we use the config.mk approach, which is quite good for
setting up options, but it requires having no optional dependencies or
compile-time features (or at least not many).
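A config.mk in this style is just a handful of make variables the user edits by hand before building; a minimal sketch, modeled on the layout common in suckless projects (the paths and flags here are illustrative defaults, not prescribed values):

```make
# config.mk -- user-editable build options, included by the Makefile
VERSION = 0.1

# paths
PREFIX = /usr/local
MANPREFIX = ${PREFIX}/share/man

# includes and libs
INCS =
LIBS =

# flags
CPPFLAGS = -DVERSION=\"${VERSION}\"
CFLAGS   = -std=c99 -pedantic -Wall -Os ${INCS} ${CPPFLAGS}
LDFLAGS  = ${LIBS}

# compiler and linker
CC = cc
```

There is no detection at all: if your system needs an extra library, you add it to LIBS yourself. That is exactly why it breaks down once optional dependencies appear.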

The problem I see with makefiles is that they don't follow strict usage
rules and there are no 'standards' for their usage. And this is pretty
annoying, because with the makefile approach you end up implementing
everything from scratch; there's no set of .mk files to do it magically
or anything like that. And of course that magical approach would only cover
the build, install and distribution stages.
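Those three stages are small in practice; a skeleton Makefile in the same suckless style, assuming the options come from a config.mk (the program name `prog` and file list are placeholders):

```make
# Makefile -- build, install and dist stages only; options live in config.mk
include config.mk

SRC = prog.c
OBJ = ${SRC:.c=.o}

all: prog

.c.o:
	${CC} -c ${CFLAGS} $<

prog: ${OBJ}
	${CC} -o $@ ${OBJ} ${LDFLAGS}

install: all
	mkdir -p ${DESTDIR}${PREFIX}/bin
	cp -f prog ${DESTDIR}${PREFIX}/bin
	chmod 755 ${DESTDIR}${PREFIX}/bin/prog

dist: clean
	mkdir -p prog-${VERSION}
	cp ${SRC} config.mk Makefile prog-${VERSION}
	tar -cf - prog-${VERSION} | gzip > prog-${VERSION}.tar.gz
	rm -rf prog-${VERSION}

clean:
	rm -f prog ${OBJ} prog-${VERSION}.tar.gz

.PHONY: all install dist clean
```

Everything past this point (detection, feature toggles) is what the configure stage is for, which is the distinction made above.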

The configure stage is unnecessary in many situations, but it's just a way
to run different actions that 'configure' the project. That is:
 - detect libraries and programs
 - check endianness
 - check the CPU
 - check for include files
 - check the OS (target, host, ...), useful for cross-compiling
 - check for system-specific libraries needed for certain features
   (-lsocket on Solaris, for example)
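The checks above can be sketched in plain POSIX shell. Everything here is illustrative (the `conftest` file names and `check_*` helpers are made up for this sketch, not the API of ACR or any real tool); it only assumes a `cc` compiler driver and standard utilities:

```shell
#!/bin/sh
# Sketch of typical configure-stage checks, written as a plain shell script.

CC=${CC:-cc}

# OS detection (useful for picking per-platform flags when cross-compiling
# is not involved; a real tool would also honor --host/--target).
os=$(uname -s 2>/dev/null || echo unknown)

# Endianness: emit the bytes 01 00 and see how od reads them as one
# 16-bit word. Little-endian hosts read 0x0001, big-endian hosts 0x0100.
if [ "$(printf '\001\000' | od -An -tx2 | tr -d ' \n')" = "0001" ]; then
	endian=little
else
	endian=big
fi

# Header check: try to compile a file that includes it.
check_header() {	# check_header <header.h>
	printf '#include <%s>\nint main(void){return 0;}\n' "$1" > conftest.c
	if $CC -c -o conftest.o conftest.c 2>/dev/null; then res=yes; else res=no; fi
	rm -f conftest.o conftest.c
	echo "checking for $1... $res"
	[ "$res" = yes ]
}

# Library check: try to link an empty program with the given flags.
check_link() {	# check_link <name> <extra link flags...>
	name=$1; shift
	printf 'int main(void){return 0;}\n' > conftest.c
	if $CC -o conftest conftest.c "$@" 2>/dev/null; then res=yes; else res=no; fi
	rm -f conftest conftest.c
	echo "checking for $name... $res"
	[ "$res" = yes ]
}

echo "os: $os  endian: $endian"
check_header stdlib.h
case "$os" in
SunOS) check_link libsocket -lsocket -lnsl ;;	# socket funcs are separate on Solaris
esac
```

The point of centralizing this is that each probe is written and debugged once; each project's configure script only lists which probes it wants.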

If you have to do all this work for every project you write, you will
probably fail at some point. This is why it's good to have a centralized
project that generates the proper checks to get all this information in a
way that works on most systems and most architectures.

If you find a problem on a certain architecture, you just need to fix it
in one place, not in all your projects.

The problem is that GNU configure sucks: it's slow, dirty, bloated, and
full of weird hidden features, and lots of collateral issues can easily
appear that are really hard to debug.

I wrote ACR for fun. I just wanted to see how hard it would be to replace
such features in shell script (yeah, acr is written in shell script). So
the code is unreadable, but it does its job. Something that started as an
experiment now saves me many KBs in my repositories and sources, and saves
a lot of time in the compilation process, because the checks are faster
than in autoconf.

The generated code is almost human-readable and can be easily debugged;
that's why I use it.

I know that there are better approaches for this, but there are no books,
no standards, and no one promoting them as replacements for the build
systems in use today.

When building you have to think about different compilers, different
architectures, different platforms, different library dependencies,
different compilation and installation paths, ways to find the
dependencies, etc. It's all a complete mess, actually.

I would like to see a clean solution to all those problems using just a
single tool (mk?), but without having to maintain many files or type
useless or repetitive things across projects.
> I know that your problem vector is different, but I think reinventing
> square wheels like autoconf again is not helping us any further. And I
> really believe that sticking to mk or make files in large projects
> saves you a lot of headaches in the long term (think years ahead, like
> 10 years or so).
>
Well, ACR was invented some years ago; I've only done a few commits
lately. I'm happy with it, because it makes users and packagers feel
comfortable while compiling the software, since it follows a 'standard'
format. That saves me a lot of time fixing issues in their build
environments (debian, ..) and saves them from having to mess with the
deep gnu shit.

--pancake
Received on Fri Jan 29 2010 - 10:32:05 UTC

This archive was generated by hypermail 2.2.0 : Fri Jan 29 2010 - 10:36:02 UTC