> Simple rule: include files should never include include files. If
> instead they state (in comments or implicitly) what files they need to
> have included first, the problem of deciding which files to include is
> pushed to the user (programmer) but in a way that's easy to handle and
> that, by construction, avoids multiple inclusions. Multiple inclusions
> are a bane of systems programming. It's not rare to have files included
> five or more times to compile a single C source file. The Unix
> /usr/include/sys stuff is terrible this way.
>
> There's a little dance involving #ifdef's that can prevent a file
> being read twice, but it's usually done wrong in practice - the #ifdef's
> are in the file itself, not the file that includes it. The result is
> often thousands of needless lines of code passing through the lexical
> analyzer, which is (in good compilers) the most expensive phase.
>
> Just follow the simple rule.
>
> ----------------------------------------------------------------------
>
> This is a little surprising to me as I'm used to putting includes in
> include files all the time. I do use #ifdef header guards, and I've
> never really had any problems violating this rule. So my first question
> is, has anybody actually run into problems due to violating this rule?
> And secondly, does this rule apply to C++? For example, if I'm defining
> a class that has std::vector members, I ordinarily add #include <vector>
> in the header.
You won't run into any problems if you do it right, when you do as
you say. The time it takes to parse the header is probably negligible
compared to the rest of compilation in a modern compiler and a modern
system with disk cache etc. At least I've never felt it to be a
problem. Back on my Amiga, it could actually take some noticeable
time, when the source was on floppies.
// pipe
Received on Sat Jan 16 2010 - 15:55:06 UTC