Subject: Re: makefile(.abi) vs. autoconf/automake/libtool/etc.
From: Sam TH (sam@uchicago.edu)
Date: Fri Feb 16 2001 - 08:58:19 CST


On Thu, Feb 15, 2001 at 11:47:23PM -0800, Paul Rohr wrote:
> At 06:29 PM 2/15/01 -0600, Sam TH wrote:
> >On Thu, Feb 15, 2001 at 11:03:11AM -0800, Paul Rohr wrote:

> >9. Allows the elimination of lines 176-188 of ut_types.h.
> >
> >That's the section where ICONV_CONST is defined. It's the worst hack
> >I've ever written, and it's going to break some system sometime. But
> >worse than all of those is this:
> > It's a solved problem. People have dealt with just that sort
> >of issue before, and figured out how to handle it. Dynamic tests
> >can be constructed, so that we don't have to keep adding defines. And
> >that's available with autoconf.
>
> Oh dear. Sam, you've got too much time on your hands. If that hack alone
> is enough to get you to switch build environments...
>
> I personally would never work that hard to save so little. That's exactly
> the right kind of hack -- it makes clear that just about every "reasonable"
> platform concurs about the constness of that library call. ;-)

I think you need to go read that hack again, if it makes that clear
to you.

What it says is that there are some platforms where that argument to
iconv is a const char *, and some where it's just a char *. And that
there isn't any rhyme or reason to which one is which. [1]

The one that you think of as "reasonable" is just the one that's
right on my platform, which is why it's the default. But for every
new platform, that has to be checked, and if it isn't like the
default, another ifdef has to be used.

If that's not broken, I don't know what is.
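
To make it concrete, here's roughly the configure test I have in
mind. This is only a sketch (the real thing ships as iconv.m4 with
gettext/libiconv); the trick is that redeclaring iconv() with a
conflicting prototype is a compile error, so we can simply ask the
compiler which prototype the system headers use:

  dnl Sketch only -- see gettext's iconv.m4 for the real macro.
  AC_MSG_CHECKING([whether iconv() wants const char **])
  AC_TRY_COMPILE([
  #include <stdlib.h>
  #include <iconv.h>
  extern size_t iconv (iconv_t cd, char **inbuf, size_t *inleft,
                       char **outbuf, size_t *outleft);
  ], [],
    [iconv_const=""], [iconv_const="const"])
  AC_DEFINE_UNQUOTED(ICONV_CONST, $iconv_const)
  AC_MSG_RESULT(${iconv_const:-no})

No more per-platform ifdefs; every new port gets the right answer
for free.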

>
> Not that I'm objecting, mind you. More power to you.
>
> >This point is related to 1, but I'll make it here. The other major
> >requirement for the Makefile system is the Bourne Shell. This is a
> >requirement for running sed and make, and also for a number of the
> >tests. It's why we don't work on MPW (since that isn't a compatible
> >shell, although I'm not sure why it can't be ported. Leonard?)
> >
> >The shell is present or available on basically every non-(classic Mac)
> >computer in the world today.
>
> Thanks for the clarification. I forgot about the bash/sh distinction here.
>
> To be honest, I just tend to install as little of Cygwin as I can get away
> with and then ignore everything else about that shell except my few favorite
> commands, all of which Just Work:
>
> cd
> make
> make clean
> make tidy
> make canonical
> make realclean
>
> All my real work happens elsewhere.
>

Really, you're missing out. :-) Actually, since I am trying to get
these sorts of things working on Win32 platforms, I hope I can
persuade one of our Win32 developers to install some more
sophisticated Unix tools. But that doesn't have to be you, Paul.

One question: what does make tidy do? I've never used it, and there
isn't a top-level make tidy available.

> >However, the reason that it's not usually necessary is that this
> >problem is solved in a very different way by autotools build systems.
> >The standard way of building with a configure script is to do it in a
> >different directory than the source tree.
> >
> > [...]
> >
> >Then, when you want to build with gnome enabled, or pspell, or
> >debugging, or whatever, you create a different build directory, and
> >repeat the process.
> >
> >Making clean in one of the build directories cleans that directory,
> >and so on.
>
> Gotcha. I knew there had to be a way. So do all the build targets get
> sprinkled throughout that shadow copy of the tree?

Yes. You end up with object files laid out just as they would be in
the source tree, only without the source.

>
> >This poses a few complexities having to do with our peer directories,
> >but I think they are solvable.
>
> Yeah, I guess they'd need to be built in a per-config set of directories,
> too, huh? Could we just have a script that set up an appropriate subtree and
> fired off configure runs in each? For example:
>
> buildabi/gnome_debug/abi
> buildabi/gnome_debug/expat
> buildabi/gnome_debug/psiconv
> ...
> buildabi/win32_rel/abi
> etc.
>

That just might work. Thanks, I hadn't thought of that.
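
Something like this, maybe (a sketch only -- the peer list and the
configure arguments would have to match what we actually ship):

  #!/bin/sh
  # setup-build.sh: create buildabi/<config>/<peer> for each peer
  # library and run that peer's configure script there.
  config=$1; shift
  for peer in abi expat psiconv wv; do
      mkdir -p buildabi/$config/$peer || exit 1
      (cd buildabi/$config/$peer &&
       ../../../$peer/configure "$@") || exit 1
  done

Then "sh setup-build.sh gnome_debug --enable-gnome --enable-debug"
(or whatever the flags end up being) would lay out exactly the tree
you describe.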

> >It should be noted that the current system involves exactly one
> >Makefile.abi, the one in wv. Both psiconv and expat use their native
> >build systems, as do the various other libraries (I believe).
>
> It looks like there's still a Makefile.abi in libiconv, and I'm pretty sure
> we modified the Makefiles in our copies of libpng and zlib as well.
>

Ok, I must have missed those. (And I don't think about those
libraries much anyway, since they never build on my box.)

> >> 7. rebuild speed
> >> -----------------
> >> (strength) Because this is a diving make system, rebuilds can be
> >> localized by diving to the appropriate level of the tree and doing
> >> the appropriate make variants (tidy, clean, realclean) there.
> >>
> >> The scale factors are nice here, because this mirrors and reinforces the
> >> modularity of the code. Localized API changes which only affect a small
> >> part of the tree can be rebuilt quickly. API changes which affect the
> >> entire tree require massive clean rebuilds of the tree (and usually get
> >> mentioned as such during commits).
> >
> >Fundamentally, it works like this: if you change any significant
> >header file, you have to rebuild the whole abi tree. If you don't,
> >and it segfaults on something, then you rebuild the whole tree just to
> >make sure. At least that's how it ends up for me.
>
> What? You mean the rest of you don't keep the dependencies in your head?
>
> Silly me. Jeff and I were the ones who originally decided which headers
> should be exposed outside of the relevant modules, which is where the
> capital vs. lowercase prefix originated.

I always wondered what that was about. :)

>
> Thus, I got used to being aware of the potential impact of any header file
> change. I'd just go rebuild whatever portions of the tree needed it. Most
> of the time, it wasn't much, and we always sent commit mails to each other
> with that info. Full rebuilds of, say, af/xap or text/fmt were pretty
> common, but that goes pretty quickly. Full rebuilds of the entire tree were
> actually pretty rare.
>
> However, there hasn't been anyone policing the spread of #includes for quite
> a while now, so I wouldn't be surprised if rebuilds are needed more often.
>

Does anyone know of a nice program for graphing include dependencies?
It would be nice to live in the world Paul is describing.
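
In the meantime, something quick and dirty with sed piped into
graphviz's dot would get us most of the way. A sketch (assumes the
headers live under src/ and that dot is installed):

  #!/bin/sh
  # Emit the #include graph in dot format; render with:
  #   sh include-graph.sh | dot -Tps -o includes.ps
  echo 'digraph includes {'
  find src -name '*.h' | while read f; do
      b=`basename $f`
      sed -n 's/^#include "\(.*\)"/    "'"$b"'" -> "\1";/p' "$f"
  done
  echo '}'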

> >Actually, they can be used to handle the presence or absence of
> >virtually any feature of the target system. For example, instead of
> >defining ABI_OPT_LIBXML2, an autoconf macro could detect what xml
> >libraries you had on your system, define the appropriate symbols, and
> >build with that library. You could, of course, override that choice.
> >For a real look at lots of different macros, for lots of different
> >things, check out
> > http://cryp.to/autoconf-archive/
>
> Yeah. The Unix world has a tradition of multiple slight incompatibilities
> of this sort, which is what autoconf is all about. From outside that world,
> even the most XP savvy of us tend to see that kind of "maybe this or maybe
> that" phenomenon as just plain broken.
>

Well, the proliferation of different and incompatible changes in basic
libraries on Unix was unfortunate. But I still think you're
short-changing the autotools. They can be used to conditionally
compile extra libraries that you provide in case the system doesn't
have what you need. They can be used to test what kind of XML library
you have on your system, or whether your C++ compiler accepts the
-Werror option.
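
The -Werror check, for instance, is only a few lines of configure
input. A sketch (not our actual configure.in):

  AC_LANG_SAVE
  AC_LANG_CPLUSPLUS
  AC_MSG_CHECKING([whether $CXX accepts -Werror])
  save_CXXFLAGS="$CXXFLAGS"
  CXXFLAGS="$CXXFLAGS -Werror"
  AC_TRY_COMPILE([], [return 0;],
      [cxx_werror=yes], [cxx_werror=no])
  AC_MSG_RESULT($cxx_werror)
  test "$cxx_werror" = yes || CXXFLAGS="$save_CXXFLAGS"
  AC_LANG_RESTORE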

I think that you're used to a world where there's only one toolchain
vendor, and only one library vendor, and where the basic system never
really differs that much. And from that perspective, autoconf doesn't
make sense. It would be crazy for a single vendor to require that
amount of work to compile on different system versions.

But autoconf was originally designed (in part) for gcc, which today
compiles on your machine, on the BeOS/PPC machine at SourceGear, and
on SunOS/M68k machines. That couldn't be done without autoconf.

> >Well, personally I think that our hardest-to-run-makefiles-on platform
> >is Windows,
>
> Really? If you're willing to run command-line make at all, then there's
> very little functional difference (to us Win32 heathen) between opening a
> DOS box to type nmake and opening a Cygwin shell to type make. They both do
> the same job, and you get out of there as soon as the compiler has rendered
> its verdict on your code. ;-)

Big mistake. :-)

My observation was only based on the fact that the Win32 build breaks
more often for makefile-related reasons than other builds.

>
> All the annoyance comes from the fact that Cygwin keeps moving around its
> Unix-ish view of a working filesystem and sending new weirdness to uname,
> but we wind up begging Unix folks for help on that anyhow.
>
> >but definitely a close second is BeOS/PPC. And while it
> >still had telnet running (which for some unknown reason was closed,
> >causing us to lose a shipping platform), I compiled GNU Make, GNU
> >Autoconf, GNU Automake, GNU Libtool, and several projects using all of
> >the above.
> >
> >Furthermore, people make significant effort to get the autotools to
> >work with Cygwin, both with gcc and with MSVC.
>
> That's promising.
>
> >Well, I have no idea what dependency tracking looks like for
> >cl.exe. If it has facilities to support this, they could be worked
> >in.
>
> I'll let the compiler gurus confirm this, but AFAIK, dependencies are
> tracked externally via project files. I've never looked to see whether
> cl.exe is used to extract the dependency information, though.
>

It wouldn't surprise me if it couldn't be done. But MS was never that
big a fan of the command line anyway.
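
(One possible lead, for anyone who wants to dig: cl.exe has a
/showIncludes switch that prints a "Note: including file: ..." line
for every header it opens, e.g.

  cl /nologo /showIncludes /c ut_types.cpp

so a dependency generator could presumably be built on top of that.
I don't know whether the autotools can use it out of the box.)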

> >Since I'm the one who seems to do most of the Makefile hacking
> >recently, that sounds good to me. And the generated makefiles are
> >quite simple. Here's src/wp/ap/GNUmakefile.am:
> >
> >LIBTOOL = @LIBTOOL@ --silent
> >
> >SUBDIRS= xp unix
> >
> >noinst_LTLIBRARIES= libAp.la
> >
> >libAp.la: xp/libWpAp_xp.la @PLATFORM@/libWpAp_@PLATFORM@.la
> > $(LIBTOOL) --mode=link $(CXX) -o libAp.la xp/libWpAp_xp.la \
> > @PLATFORM@/libWpAp_@PLATFORM@.la
> >
> >Pretty easy.
>
> Yeah, the equivalent src/wp/ap/Makefile is more verbose, because each of the
> OBJs gets mentioned explicitly. It's brain-dead simple work, though.
>
> Is the difference as pronounced when you go down another level?
>
> abi/src/wp/ap/xp/Makefile
> abi/src/wp/ap/<platform>/Makefile
>

Well, these files are basically just:

include $(top_srcdir)/includes.mk

INCLUDES= $(AF_INCLUDES) $(WP_INCLUDES) $(TEXT_INCLUDES) \
        $(OTHER_INCLUDES)

noinst_LTLIBRARIES = libWpAp_xp.la

libWpAp_xp_la_SOURCES= <lots of files here>

and then the cdump stuff at the end. Still significantly less,
although the difference is less noticeable because of all the .cpp
lines.

> >Passing those environment variables is definitely busted. We want to
> >pass options to configure that say "please use cl.exe as the
> >compiler", among other things. I'm sure this can be done.
>
> Cool.
>
> >The real speed increase, however, comes because auto* generates
> >Makefiles which can be run in parallel. This means that you can use
> >the -j option to make, which allows it to run multiple processes at
> >once. On a fast machine, where lots of time is spent in I/O, this can
> >be a significant speedup.
>
> Hmm. Is that what all those lock files are about?
>

Beats me. I just know it works, and it's faster. :)
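
(For the curious: nothing special is required. From the top of the
build tree you just say

  make -j3

and make keeps up to three compiles in flight at once.)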

> >Well, that was long. I hope it was informative, too.
>
> Very. Thanks.

I'm glad.
           
        sam th
        sam@uchicago.edu
        http://www.abisource.com/~sam/
        GnuPG Key:
        http://www.abisource.com/~sam/key



