On Fri, Jan 29, 2010 at 11:32 AM, pancake <pancake_AT_youterm.com> wrote:
> Anselm R Garbe wrote:
>>
>> Well I've heard these reasons before and I don't buy them. There are
>> toolchains like the BSD ones and they prove pretty well that the
>> "everything is a Makefile" approach is the most portable and
>> sustainable one. Running a configure script from 10 years ago will
>> fail immediately.
>>
>>
>
> Don't mix up the configure and makefile approaches. They are different stages.
>
> Makefiles work quite well, but it's not their job to detect or set up
> options. There are many approaches for doing this from within makefiles.
>
> In suckless we use the config.mk approach, which is quite good for
> setting up options, but it assumes no optional dependencies or
> compile-time features (or at least not many of them).
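For reference, a suckless-style config.mk is nothing more than a handful of
variable assignments that the main Makefile pulls in with 'include config.mk';
the values below are only illustrative:

    # config.mk -- customisation point for the user, included by the Makefile
    VERSION = 0.1

    # paths
    PREFIX = /usr/local
    MANPREFIX = ${PREFIX}/share/man

    # flags
    CPPFLAGS = -DVERSION=\"${VERSION}\"
    CFLAGS   = -std=c99 -pedantic -Wall -Os ${CPPFLAGS}
    LDFLAGS  = -s

    # compiler and linker
    CC = cc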
>
> The problem I see with makefiles is that they don't follow strict usage
> rules and there are no 'standards' for their usage. And this is pretty
> annoying, because with the makefile approach you end up implementing
> everything from scratch; there's no set of .mk files to do it magically
> or anything else. And of course that magical approach would only cover
> the build, install and distribution stages.
Plan 9 solves this; the standard set of mkfiles you can use is
described here: http://doc.cat-v.org/plan_9/4th_edition/papers/mkfiles
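For the curious, a complete mkfile for a single command under that scheme is
only a few lines, because the architecture details and the generic rules come
from the two included files (the program name here is made up):

    # per-architecture definitions ($CC, $O, ...)
    </$objtype/mkfile

    TARG=hello
    OFILES=hello.$O
    BIN=/$objtype/bin

    # generic rules: all, install, clean, nuke, ...
    </sys/src/cmd/mkone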
> The configure stage is unnecessary in many situations, but it is just a way
> to perform various actions to 'configure' the project. That is:
> - detect libraries and programs
> - check for endianness
If your program needs to check for endianness, it is broken, period;
portable code reads and writes external data one byte at a time at known
offsets, so the host's byte order never matters.
> - check cpu
If your program needs to check the CPU, it is probably broken, and if
not, it will break cross-compiling; it should allow building versions
for each architecture no matter what the current environment is,
ideally the way the Plan 9 compilers do it, with each target's compiler
implemented as an independent program (8c for 386, 5c for ARM, and so on).
> - check for include files
This is hopeless. The only proper solution is to provide a place for
the user to manually define where to find include files and libraries;
otherwise your program will be unportable, unless it can magically
predict where any system ever created, past or future, keeps its
headers, which is impossible and is why auto*hell ends up failing
miserably at finding shit.
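In practice that 'somewhere' is just a couple of variables in config.mk that
the user edits or overrides on the command line; the X11 paths here are only
an example:

    # where to find the X11 headers and libraries; edit to match your system
    X11INC = /usr/X11R6/include
    X11LIB = /usr/X11R6/lib

    INCS = -I${X11INC}
    LIBS = -L${X11LIB} -lX11

    CFLAGS  = -std=c99 -Wall -Os ${INCS}
    LDFLAGS = ${LIBS}

Someone on an unusual system needs no configure run at all; they just type,
for example:

    make X11INC=/opt/X11/include X11LIB=/opt/X11/lib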
> - check OS (target, host, ...), useful for cross-compiling
Why the fuck does one need a configure step for cross-compiling?
Actually, not having one makes cross-compiling infinitely saner and
more convenient.
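For what it's worth, with a plain makefile cross-compiling amounts to
overriding the toolchain variables on the command line (the toolchain prefix
below is hypothetical), and on Plan 9 the target is simply selected by
$objtype:

    # POSIX make: cross-build with an ARM toolchain (prefix is an example)
    make CC=arm-linux-gnueabi-gcc AR=arm-linux-gnueabi-ar

    # Plan 9: build the same tree for ARM, regardless of the host machine
    objtype=arm mk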
> - check for system-related libraries needed for certain features
> (-lsocket on Solaris, for example)
If you depend on system-specific features, your code is not portable,
period, and pretending otherwise is madness.
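Where such a library really is needed, the makefile-only convention is a
commented-out line in config.mk that the affected user uncomments, not an
automatic check; for instance:

    LIBS = -lc

    # uncomment on Solaris
    #LIBS = -lc -lsocket -lnsl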
Also note that every kind of build-time 'configuration' exponentially
increases the difficulty of testing and debugging, because there is no
longer a single version of your program; instead there is a different
version for every combination of features that happens to be 'enabled'.
Also, this encourages #ifdef 'pseudo-portability', which is *always*
the *wrong* way to do it.
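The makefile-level alternative to #ifdef soup is to keep each platform's code
in a separate file and select that file in config.mk; the file names below
are made up:

    # config.mk: pick exactly one implementation of the OS-specific bits
    OSSRC = os_bsd.c
    #OSSRC = os_linux.c
    #OSSRC = os_solaris.c

and in the Makefile:

    SRC = main.c ${OSSRC}
    OBJ = ${SRC:.c=.o}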
> If you have to do all this work for every project you write, you will
> probably fail at some point. This is why it is good to have a centralized
> project that generates the proper checks to get all this information in a
> way that works on most systems and most architectures.
>
> If you find a problem on a certain architecture, you just need to fix it
> in one place, not in all your projects.
>
> The problem is that GNU configure sucks: it is slow, dirty, bloated, and
> full of weird hidden features, and lots of collateral issues can easily
> appear that are really hard to debug.
This is completely wrong. The problem with auto*hell is not the
implementation but the concept and idea itself; the implementation
sucks so much mostly as a consequence of how idiotic the idea is.
uriel
>
> I wrote ACR for fun. I just wanted to see how hard it would be to replace
> such features in shell script (yeah, acr is written in shell script). So
> the code is unreadable, but it does its job. Something that started as an
> experiment now saves me many KBs in my repositories and sources, and saves
> a lot of time in the compilation process, because the checks are faster
> than autoconf's.
>
> The generated code is almost human-readable and can easily be debugged;
> that's why I use it.
>
> I know that there are better approaches for this, but there are no books,
> no standards, and no one promoting them to replace the currently used
> build systems.
>
> When building you have to think about different compilers, different
> architectures, different platforms, different library dependencies,
> different compilation and installation paths, ways to find the
> dependencies, etc. It's all a complete mess, actually.
>
> I would like to see a clean solution to all of those problems using just
> a single tool (mk?), but without having to maintain many files or type
> useless or repetitive things across projects.
>>
>> I know that your problem vector is different, but I think reinventing
>> square wheels like autoconf again is not helping us any further. And I
>> really believe that sticking to mk or makefiles in large projects
>> saves you a lot of headaches in the long term (think years ahead, like
>> 10 years or so).
>>
>
> Well, ACR was reinvented some years ago; I just do a few commits now and
> then, and I'm happy with it, because it makes users and packagers feel
> comfortable while compiling the software, since it follows a 'standard'
> format. That saves a lot of time, both for me when fixing issues in their
> build environments (Debian, ...) and for them, without having to mess
> around in the deep GNU shit.
>
> --pancake
>
>