Nokia nmake Product Builder
Customer Support Newsletter

http://www.bell-labs.com/project/nmake/newsletters/

Issue No. 7 - January 31, 2001
Highlights
  1. nmake Ports
  2. Dependency-based Java Build Support
  3. Newsletter Feedback
Technical Notes
  1. Sun Forte C++ - Database is Locked
  2. HP Investigates Prefixinclude
Tidbits
  1. `eval ... end' --- some Applications
  2. Building Shared Libraries
  3. Linking with Shared Libraries
Contacts
email: nmake@alcatel-lucent.com (tech support)
email: software@alcatel-lucent.com (licenses/orders)
web: http://www.bell-labs.com/project/nmake/


Highlights

Alcatel-Lucent nmake Ports

Alcatel-Lucent nmake lu3.3 has been ported to the Amdahl UTS 2.1.2 platform. It can be found on the download page. The UTS port is missing fix 990072 as outlined in the lu3.3 release notes:

  1. 990072 - cpp problem with macro replacement using # operator. cpp now handles the "#" operator when preprocessing in the ANSI dialect.


Dependency-based Java Build Support

We have been investigating the problem of supporting implicit dependencies for the Java programming language. This investigation has proceeded through several phases. We first ask whether implicit dependency support for Java would be useful. If so, we must determine what is involved in providing this support, and whether there are any special requirements or complications imposed by the Java language that must be addressed in a solution. Finally, we propose a solution that meets the needs of projects using Java. In this issue of the newsletter, we present some of our findings on the first two questions. Information on our proposed solution, and how to use it, will follow in subsequent newsletters.

The usefulness of implicit dependencies for the C and C++ languages is well known. In C/C++, there is a compile-time dependency upon the contents of external header files and external definitions of preprocessor variables. When header files or preprocessor variable definitions change, C or C++ files which are dependent upon these files or definitions must be recompiled even if the source files themselves did not change; otherwise, the previously compiled object files may contain inconsistencies that result in link-time or run-time errors. These errors may be especially difficult to debug since they arise from mixing multiple versions of the same source files in a single executable.

We first ask whether a similar problem arises in Java development. We believe that the answer is ``yes'' (at least for development in a traditional edit-compile-test environment rather than an IDE). The fundamental problem is version skew, as discussed in The Java(TM) Virtual Machine Specification in the chapter ``Verification of Class Files.'' Further discussion may be found in chapter 13 of the Java Language Specification. Essentially, during compilation a Java compiler refers to previously compiled .class files for definitions of types and other information referenced in a compilation unit. It is possible, and indeed likely during Java development, to introduce binary inconsistencies between Java classes in the course of arbitrary edits to the source .java files. These inconsistencies may result in compile-time and run-time errors. It is therefore necessary to recompile prerequisite .java files when needed, and to order Java compilations such that dependent .java files are compiled after or along with the files upon which they depend. This should be done incrementally: in addition to maintaining consistency, we always want to minimize the time required to rebuild after arbitrary source file edits in order to speed up the development cycle.

Here is an example illustrating the problem. This example was run using JDK 1.2.2 and the Java compiler found at /opt/exp/java/jdk1.2.2/bin/javac. Suppose we have source files A.java, B.java, C.java, and D.java:

$ cat A.java
public class A {
    public A() {
        B b;
        System.out.println("A()");
        b = new B();
    }
    public static void main(String args[]) {
        A a = new A();
    }
}
$ cat B.java
public class B {
    public B() {
        C c;
        c = new C();

        D d;
        d = new D();

        System.out.println("B()"); 
    }
}
$ cat C.java
public class C {
    public C() {
        System.out.println("C()"); 
    }
}
$ cat D.java
public class D {
    public D() {
        System.out.println("D()"); 
    }
}

These files have a simple dependency relationship: A.java depends on B.java, and B.java depends on C.java and D.java. Graphically, we have:

[Figure: dependency graph; A depends on B, and B depends on C and D]

If we now bring everything up-to-date by compiling all the source files and then run A.class, we get:

$ javac A.java B.java C.java D.java
$ java A
A()
C()
D()
B()

Now suppose we update C.java, recompile the root class A.java only, and rerun:

$ cp Cnew.java C.java
$ cat C.java
public class C {
    public C(int x) {
        x = 3;
        System.out.println("C(int x)"); 
    }
}
$ ls -l --full-time A.class B.class C.class
-rw-r--r--   1 gms      c++           444 Thu Feb 01 16:09:06 2001 A.class
-rw-r--r--   1 gms      c++           380 Thu Feb 01 16:09:06 2001 B.class
-rw-r--r--   1 gms      c++           332 Thu Feb 01 16:09:06 2001 C.class
$ sleep 2
$ javac A.java
$ ls -l --full-time A.class B.class C.class
-rw-r--r--   1 gms      c++           444 Thu Feb 01 16:09:15 2001 A.class
-rw-r--r--   1 gms      c++           380 Thu Feb 01 16:09:06 2001 B.class
-rw-r--r--   1 gms      c++           332 Thu Feb 01 16:09:06 2001 C.class
$ java A
A()
C()
D()
B()

C.java was not automatically recompiled, leading to an incorrect run-time result. If we now recompile C.java, we get a run-time error:

$ javac C.java
$ java A
A()
Exception in thread "main" java.lang.NoSuchMethodError: C: method <init>()V not found
        at A.main(A.java, Compiled Code)

To make this work, we really needed to update and recompile B.java due to its dependency on C.java. Inadvertently removing B.class would also have led to a run-time error in this situation. The javac -Xdepend option will automatically regenerate needed .class files, but it is typically time-consuming to run.

If a makefile is used, these dependencies have to be specified explicitly; make then keeps all .class files up to date automatically. However, in a large project (and even in small projects) maintaining these dependencies by hand is time-consuming and error-prone. Automatic derivation and maintenance of class dependencies is a better approach.
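For illustration, the explicit dependencies for the four-file example could be written as follows. This is only a sketch, in traditional make syntax rather than nmake's; the javac invocations are unchanged from the example above:

```make
# Editing C.java now causes C.class, B.class, and A.class to be rebuilt.
all : A.class

A.class : A.java B.class
	javac A.java
B.class : B.java C.class D.class
	javac B.java
C.class : C.java
	javac C.java
D.class : D.java
	javac D.java
```

Note that every .class-to-.class edge must be typed in by hand, which is exactly the maintenance burden described above.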

The possibility of circular dependencies introduces an additional complication. Suppose D.java is modified to refer to an entity in A.java. We then have the following dependency graph:

[Figure: dependency graph with a cycle involving A, B, and D]

We have introduced a dependency cycle involving A, B, and D. In this situation, if any of A.java, B.java, or D.java is modified, all 3 must be recompiled. Furthermore, to avoid inconsistencies all 3 source .java files must be passed to the same invocation of javac. Any automatic dependency generation and management scheme must handle cycles. Of course, dependency cycles may cross package boundaries, further complicating the problem.

For completeness, we mention that the dependency graphs we have presented above are actually a bit oversimplified. These dependency graphs actually represent the relationships among .class files only. This is because the Java compiler needs to reference dependent .class files of referenced entities during compilation. Each .class file is also dependent upon its corresponding .java file. So, a more complete representation of the graph containing the cycle presented above is:

[Figure: combined dependency graph of .java and .class files]

These relationships must be represented in the Makefile (or implicitly derived) to properly maintain these 4 Java files. In addition, the build engine must ensure that .java files in the same cycle are compiled in a single invocation of javac.

As with traditional make programs, nmake is fully capable of representing the required dependency relationships in a straightforward manner and ordering Java compilations in accordance with these relationships, while recompiling the minimum number of files required to bring the specified target file up-to-date. However, nmake goes a step beyond this basic capability by providing automatic dependency generation and maintenance for the languages it supports. This removes a significant source of error in program development: builds that fail to bring all files up to date because of outdated dependency information.

Unfortunately, automatic dependency management for Java programs is substantially different from, and quite a bit more complex than, its C/C++ counterpart. Following are known complicating characteristics of Java file dependencies that need to be taken into account in designing a solution.

  1. Extraction of Java file dependencies requires a complete parse of the Java source file. This is unlike C/C++ where a simple lexical scan suffices. Therefore, the built-in nmake scanner cannot be used to extract file dependencies. Dependencies may originate in any expression in the program, and may not even be explicitly mentioned in the program! According to the javac manual page:

    When compiling a source file, the compiler often needs information about a type it does not yet recognize. The compiler needs type information for every class or interface used, extended, or implemented in the source file. This includes classes and interfaces not explicitly mentioned in the source file but which provide information through inheritance.

    For example, when you subclass java.applet.Applet, you are also using Applet's ancestor classes: java.awt.Panel, java.awt.Container, java.awt.Component, and java.lang.Object.

  2. Implicitly dependent files are themselves compiled files (.class files) that must be compiled from separate program source files. This is unlike the typical case for C/C++, where implicit file dependencies (header files) are not generated files. Handling generated implicit dependencies should work in principle, although we have had problems in the past with support of metarule dependency chaining from implicit dependencies.
  3. Java dependencies may form cycles of any length. The only way to compile .java files that are part of a dependency cycle is to pass all the .java files in the cycle together to a single invocation of the Java compiler. Dependency cycles cannot occur in the C/C++ case. Support for cyclic dependencies implies the need for global analysis of graph structures that has never before been required by nmake.
  4. The currently most widely used Java compiler (Sun's javac(1)) has a very high startup overhead, making compiler invocation for individual .java files undesirable. Unfortunately, nmake (like other make tools) is fundamentally designed to compile files one by one, as the engine traverses its dependency graph.
  5. Certain constructs in a .java source file give rise to ``implicit targets,'' which are not normally encountered in C/C++ development. This occurs when a .java file defines multiple classes at package scope (at most one of which may be public), and in the case of inner classes. In both cases, each class generates a separate .class file, resulting in multiple .class files generated from a single .java file. If an ``implicit target'' is deleted, run-time errors can result, so it is desirable to check the existence of these .class files and regenerate them if required. In addition, these ``implicit targets'' may contain a `$' character in the file name, leading to a complication in the representation of these names in Makefiles.
  6. The widely used javac compiler performs implicit compilation of some dependent files, even if the -Xdepend option is not used. This complicates build tool design, and especially complicates support of distributed/concurrent builds.
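To illustrate item 5 above, here is a minimal sketch (the class names Outer, Inner, and Helper are hypothetical): compiling the single file Outer.java produces three .class files, including one whose name contains a `$'.

```java
public class Outer {
    // Compiling Outer.java produces Outer.class, Outer$Inner.class,
    // and Helper.class -- the last two are ``implicit targets.''
    static class Inner {
        static int answer() { return 42; }
    }
    public static void main(String[] args) {
        System.out.println(Inner.answer());
    }
}

// A second package-scope class in the same source file; at most one
// class in the file may be public.
class Helper { }
```

Deleting Outer$Inner.class or Helper.class by hand would not be detected by a timestamp comparison against Outer.class alone, so the build tool must check for these files explicitly.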

Note that dependency generation cannot be totally automated. Dependencies may be induced by programmatic constructs (such as java.lang.Class.forName()), which result in dynamically loaded classes whose names the compiler cannot determine from the source. The author of code using such classes must make sure that their dependency relations are correctly specified to the build tool. Consequently, support for manual addition of dependencies is required.
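As a sketch of why this is so, the fragment below loads a class whose name is only a run-time string, so no compiler or dependency scanner can extract the dependency from the source. The class name DynamicLoad is hypothetical, and java.util.ArrayList stands in for a dynamically loaded project class:

```java
public class DynamicLoad {
    // The class name could come from a property file or user input;
    // javac never sees it as a compile-time reference.
    static String load(String className) {
        try {
            Object o = Class.forName(className).newInstance();
            return o.getClass().getName();
        } catch (Exception e) {
            return null;
        }
    }
    public static void main(String[] args) {
        System.out.println(load("java.util.ArrayList"));
    }
}
```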

Dependencies on jar file members also need support.

We are planning to implement fully-integrated support for Java implicit dependencies in nmake. However, due to the complexities of the problem, this fully integrated solution will not be available for some time. In order to provide a near-term solution, and to give us additional opportunity to more fully explore the problem and to define the required functionality, we are planning an interim solution using an external Java dependency generator program. Using this simpler approach, dependency management will not be quite as automatic as the eventual fully integrated solution, but will be available much sooner. We will describe this interim solution and how to use it in subsequent newsletters. If you are interested in trying out this solution when it becomes available (probably in the near future), please let us know.



Technical Notes

Sun Forte C++ 6.x - Database is Locked

When using the Sun Forte C++ 6.0 or 6.1 compiler in a viewpathing environment the compiler may get stuck, repeatedly issuing the following message:

+ /opt/SUNWspro/bin/CC -O -ptrSunWS_cache -ptr/build/nightly/v3/src/SunWS_cache -I- -o test /build/nightly/v3/src/a.o b.o
SunWS_cache: Information: Database is locked, waiting....
SunWS_cache: Information: Database is locked, waiting....
SunWS_cache: Information: Database is locked, waiting....

This is caused by linking a target with object files that exist in a node of the viewpath that is not writable. This is a very typical scenario. For example, a developer will generally have their build node first in the viewpath, followed by an official or nightly build node which is updated periodically by the project load builder. In this scenario only the developer's local build node is writable; all other nodes are read-only since they are not owned by the developer. In fact, it is a basic policy that only the first node of the viewpath be considered writable.

With release 6.0, Sun's C++ compiler no longer uses the -ptr command-line flags to find the template repositories. Instead, the compiler ignores the -ptr flags and looks for a template repository (SunWS_cache) in the same directory as each object file being linked. If you are linking with object files down the viewpath, then the compiler tries to use the SunWS_cache directory down the viewpath. This seems reasonable, but the compiler insists on locking the repository before referencing it. Of course, if the repository is not writable, the compiler will fail to create the lock file, which causes it to assume the repository is already locked. That is when the above message shows up.

We have contacted Sun and they have accepted an RFE against the compiler to introduce an option to use a different locking protocol. This feature has been scheduled for a future release. According to Sun the modifications are too extensive to provide a patch for the current release.

A Work-around

The work-around we have at this time is to prevent the SunWS_cache directories down the viewpath from being accessed. Remember, the -ptr flags are ignored and the SunWS_cache directories are found via the paths of the object files being linked. In order to bypass the SunWS_cache directories down the viewpath, all the object files need to be in the top node. If your project keeps source and object files in different viewpath nodes, then the best work-around is to viewpath only through the source nodes. If source and objects are combined in the nodes, then you can do one forced (nmake -F) build so all the .o's are rebuilt in the top node. Subsequent builds can be incremental without the -F. The result of either technique is that source code will be picked up from down the viewpath but objects will be in the top node, so only the local SunWS_cache directory will be used.


HP Investigates Prefixinclude

We have been in contact with HP regarding the addition of the prefixinclude feature to HP's aCC compiler. aCC already supports -I-, but under certain conditions problems may arise which the prefixinclude feature can handle correctly. The problems occur when a source file is included using a path prefix (i.e. #include "prefix/f1.h") and that file quote-includes a second file without specifying the prefix but which actually exists in the same directory (i.e. #include "f2.h"). When -I- is used, the directory of the includee (f1.h) is not automatically searched, so the second included file (f2.h) will not be picked up. More details are provided in the prefixinclude section of our cpp documentation. The work-around for this is to modify the code to reference the second include file using the prefix (i.e. #include "prefix/f2.h") or to add the prefix directory to .SOURCE.h so it will be searched (though this latter method may not handle complex situations where several different prefix directories contain a header file of the same name).

HP is open to the idea of prefixinclude and is investigating the feature. Hopefully this will make it into a future release of aCC and will close any remaining gaps in using aCC with nmake. We would like to thank HP for discussing this issue with us!



Tidbits

`eval ... end' --- some Applications

What `eval ... end' does:
  1. It causes the statements between `eval' and `end' to be expanded an additional time.
  2. `eval ... end' pairs can nest to cause additional expansions.

Application: Constructing Variable Names from Automatic Variables

NOTE: without `eval ... end' the left-hand side would be an invalid nmake variable name.

without eval-end:

":JOINT:" : .MAKE .OPERATOR
	$(>).$(>) = a.z

make: debug-8: :JOINT::1:data: `$(>).$(>) = a.z'
make: debug-7: assignment: lhs=`$(>).$(>)' rhs=`a.z'
make: ":JOINT:", line 1: $(>).$(>): invalid variable name

with eval-end:

":JOINT:" : .MAKE .OPERATOR
	eval
	$(>).$(>) = a.z
	end

make: debug-8: :JOINT::1:test: `eval'
make: debug-8: :JOINT::2:data: `$(>).$(>) = a.z'
make: debug-10:expand(var=>,ops=(null),lev=1): `a.z'
make: debug-10:expand(var=>,ops=(null),lev=1): `a.z'
make: debug-7: assignment: lhs=`a.z.a.z' rhs=`a.z'
make: debug-8: :JOINT::3:test: `end'


Building Shared Libraries

nmake uses the :LIBRARY: operator to make shared libraries. There is no built-in support for making a shared library with the double colon (::) operator. By default the :LIBRARY: operator makes an archive library. The shared library is also made if the value of $(CC.PIC) is in $(CCFLAGS), and if :ALL: is in the makefile or nmake is run with the install common action.

The $(CC.PIC) variable contains the compiler flag to produce position-independent code (pic). It is set automatically from the probe file. Some compilers support more than one pic flag, so it is important to use $(CC.PIC) when setting $(CCFLAGS). The default makerules check $(CCFLAGS) for $(CC.PIC); if a different flag is hard-coded in $(CCFLAGS), the check will fail and no shared library will be made.

Here is a simple example makefile:

CC = cc
CCFLAGS += $$(CC.PIC)

:ALL:

test :LIBRARY: a.c b.c c.c

The left-hand side of :LIBRARY: specifies the simple name of the library. You should never specify the .a or .so suffix or the leading "lib" string.

Notice that CC.PIC is referenced with two $-signs. It is necessary to delay the expansion of CC.PIC when "+=" is used. If using "=", then use a single $-sign, such as: CCFLAGS=$(CC.PIC). This is due to nmake's variable expansion rules and is a whole topic in itself.
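As a sketch, the two assignment forms look like this (use one or the other):

```make
CCFLAGS += $$(CC.PIC)    # "+=" : the double $ delays expansion of CC.PIC
CCFLAGS = $(CC.PIC)      # "="  : a single $ is sufficient
```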

The build output may look different depending on what compiler is being used. The following shows Sun's C compiler on Solaris:

$ nmake
+ ppcc -i /tools/nmake/sparc5/lu3.3/lib/cpp cc -O -KPIC -I-D/tools/nmake/sparc5/lu3.3/lib/probe/C/pp/D586199Dobincc -I- -c a.c
+ ppcc -i /tools/nmake/sparc5/lu3.3/lib/cpp cc -O -KPIC -I-D/tools/nmake/sparc5/lu3.3/lib/probe/C/pp/D586199Dobincc -I- -c b.c
+ ppcc -i /tools/nmake/sparc5/lu3.3/lib/cpp cc -O -KPIC -I-D/tools/nmake/sparc5/lu3.3/lib/probe/C/pp/D586199Dobincc -I- -c c.c
+ ar r libtest.a a.o b.o c.o
ar: creating libtest.a
+ rm -f a.o b.o c.o
+ nm -p libtest.a
+ sed -e /[     ][TDBC][        ][      ]*[_A-Za-z]/!d -e s?.*[         ][TDBC][        ][      ]*?? -e /_STUB_/d -e s/^/-u /
+ ld -G -o libtest.so.1.0 -u amain -u bmain -u cmain libtest.a

You can see libtest.a is made first, then the shared library libtest.so.1.0 is created. On platforms that use something other than .so to denote shared libraries (e.g. HP-UX uses .sl), nmake will automatically use the appropriate string, so the makefile can be used cross-platform without adding a special case for this difference.

It is standard for nmake to add a major.minor version string of .1.0 to the shared library filename. The major.minor string can be specified with :LIBRARY:. The following example shows a change to the above example makefile:

test 2.1 :LIBRARY: a.c b.c c.c

$ nmake
...
+ ld -G -o libtest.so.2.1 -u amain -u bmain -u cmain libtest.a

Installing the Libraries

When the install common action is used, both the archive and shared libraries will be installed into $(LIBDIR). In addition, some symbolic links are set up so that the library can be referenced by a filename without the major.minor string.

$ nmake install
+ cp libtest.a ../lib/libtest.a
+ /usr/bin/rm -f ../lib/libtest.so
+ /usr/bin/ln -s libtest.so.1.0 ../lib/libtest.so
+ /usr/bin/cp libtest.so.1.0 ../lib/libtest.to.1.0
+ /usr/bin/rm -f ../lib/libtest.so.1.0
+ /usr/bin/ln -s libtest.to.1.0 ../lib/libtest.so.1.0
+ /usr/bin/ln ../lib/libtest.to.1.0 ../lib/libtest.no.1.0
+ /usr/bin/rm -f ../lib/libtest.so.1.0
+ /usr/bin/ln -s libtest.no.1.0 ../lib/libtest.so.1.0
+ /usr/bin/rm ../lib/libtest.to.1.0

A Little More Control

Release lu3.3 introduced a feature to give the user more control over the behavior of :LIBRARY: when making shared libraries. With this feature you can eliminate the major.minor string from the filename and/or turn off the symbolic links that are created during the install operation.

Turn off symbolic links by setting sharedliblinks=0, which changes the above install action to the following:

$ nmake install
+ cp libtest.a ../lib/libtest.a
+ cp libtest.so.1.0 ../lib/libtest.so.1.0

Turn off the major.minor string by setting sharedlibvers=0. This will also eliminate the symbolic links since it is no longer necessary to link to the major.minor filename:

$ nmake
...
+ ld -G -o libtest.so -u amain -u bmain -u cmain libtest.a

$ nmake install
+ cp libtest.a ../lib/libtest.a
+ cp libtest.so ../lib/libtest.so
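Both switches are ordinary nmake variables, so they can be set in the makefile. A sketch combining them with the earlier example (set only the variables you need):

```make
CCFLAGS += $$(CC.PIC)
sharedliblinks = 0    # keep the major.minor filename, skip install symlinks
sharedlibvers = 0     # drop the major.minor suffix (implies no symlinks)

:ALL:

test :LIBRARY: a.c b.c c.c
```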


Linking with Shared Libraries

To link with a shared library specify the library using the -lname format as a prerequisite and specify the library's location with the .SOURCE.a special atom. For example:

.SOURCE.a : /opt/somepackage/lib

target :: x.c y.c z.c -ltest

nmake will first search for a shared library of the specified name. If none is found, a corresponding archive library will be searched for. Then nmake will pass the appropriate arguments on the linker command line to create the executable. When a shared library is found, it will pass a combination of -Ldir and -lname flags to the linker to specify the library's directory and name, respectively. When only an archive library is found, nmake specifies the path to the archive on the linker command line.

Where Fore Art Thou

Specify all the library search paths using .SOURCE.a. nmake will search for both .a and .so/.sl files using .SOURCE.a. It is not necessary to set .SOURCE.so; it is not used.

It Doesn't Seem to Re-link

You may notice that when a shared library is updated, your executables do not re-link with the library. By the nature of shared libraries, executables do not necessarily need to be re-linked when the library changes, and nmake takes advantage of this by default. However, if you do want executables to re-link when shared libraries change, then set the variable force_shared=1.
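For example, a one-line sketch in the makefile:

```make
force_shared = 1    # re-link executables whenever a prerequisite shared library changes
```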

What's in a Name

Using the -lname format implies the library filename follows the standard convention of libname.so (.sl for HP-UX) or libname.a. If you have a shared library whose name does not start with the string "lib", then you must specify the full name of the library as a prerequisite. However, because nmake assumes the standard naming convention, it will not pass the shared library to the linker as you might expect (specifying the full name does work for archive libraries). In this case the best thing to do is change the name of the shared library to follow the standard naming convention. If this is not possible (say it is a third-party library), then you can get this to work by assigning the .ARCHIVE attribute to .so files and adding a .SOURCE.so search path. For example:

.ATTRIBUTE.%.so : .ARCHIVE
.SOURCE.so : $(LIBDIR)

target :: x.c y.c z.c silly.so

The .ARCHIVE attribute must only be used in the makefiles that need it; do not set it globally. Specifically, setting the .ARCHIVE attribute for .so files in a makefile that builds a shared library will cause errors when nmake tries to scan the library for .o files.

But I Really Want the Archive

If you prefer archive libraries over shared libraries, then specify the library using +lname instead of -lname. With +lname, nmake will search first for an archive library and then for a shared library if no archive is found:

target :: x.c y.c z.c +ltest

Limitations

UPDATE: The following was resolved in release lu3.5 (see release notes) and is no longer an issue.

There is currently a limitation in nmake that prevents linking with a shared library in the same makefile that makes the library. For example, with the following makefile nmake will pick up the archive library, libtest.a, even though a shared library is also made:

CCFLAGS += $$(CC.PIC)
:ALL:

test :LIBRARY: a.c

mycommand :: b.c -ltest

In this case we recommend building the library and executable in separate makefiles.



Newsletter Feedback

We are always interested in feedback! Let us know what you think of the newsletter, how we may improve, or ideas you have for future issues. Send us a note at nmake@alcatel-lucent.com. All ideas, suggestions, and comments are welcome.

