We apologize for missing the April issue of the newsletter. We intend to publish quarterly, but we were consumed with the lu3.2 release plus some consulting work and were unable to put this issue together in time. We will keep trying to make timely quarterly releases in the future.
nmake release lu3.2 (lu stands for Lucent, to distinguish our nmake from AT&T's research version of nmake 3.2) is now available for Tier 1 and Tier 2 platforms plus a few additional ones. This includes Solaris, SunOS, HP-UX, NCR, SGI, AIX, Unixware, and Linux® (i386). Check our download and availability pages for details. nmake lu3.2 offers the following new features:
- Output Serialization for Parallel Jobs
- Atom Dependency Reporting Tool
- Local Probe File Support
- Support of Instrumentation Tools
- New recurse_begin_message and recurse_end_message
- New .ACTIONWRAP Special Atom
The full release notes detail the new features and all the bug fixes and enhancements.
The manuals have also been updated for release lu3.2 and are available for download. The manuals are in PDF format and are indexed and searchable, which is very handy for online reference with Adobe Acrobat Reader. Of course, the PDF files may also be printed to make hard copies.
Just what you wanted, yet another year 2000 reminder.
Only two versions of nmake are officially supported for Y2K-compliance. They are the Y2K build of 3.1.2 dated 07/01/1998, and the latest release, lu3.2.
There are older 3.1.2 builds, dated prior to 07/01/1998, which were compiled on non-y2k-compliant operating systems. The only 3.1.2 releases officially supported for year 2000 are those compiled on y2k-compliant operating systems and dated 07/01/1998 or later. All lu3.2 releases are compiled on y2k-compliant operating systems and are y2k-compliant.
To find the nmake version and date strings, run the following:

$ what $(whence nmake) | grep make
/tools/nmake/sparc5/lu3.2/bin/nmake:
        nmake (Lucent Technologies Bell Laboratories) lu3.2 04/15/99
If you see a release earlier than 3.1.2, or release 3.1.2 dated before 07/01/1998, then you do not have a y2k-compliant version of nmake. You can download release lu3.2 or the y2k build of 3.1.2. All nmake packages currently available for download are y2k-compliant.
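As an illustration only (this is not an official tool), the check described above can be sketched in shell. It parses the version and date fields from a line in the format of the what output shown earlier; the example line is taken from that output, and the comparison assumes all dates of interest fall in 19xx:

```shell
# Sketch: classify an nmake version line as y2k-compliant or not.
# The line format follows the "what" output example above.
line='nmake (Lucent Technologies Bell Laboratories) lu3.2 04/15/99'
date=${line##* }                       # last field, e.g. 04/15/99 (MM/DD/YY)
ver=${line% *}; ver=${ver##* }         # second-to-last field, e.g. lu3.2
mm=${date%%/*}; yy=${date##*/}
dd=${date#*/}; dd=${dd%%/*}
key=$yy$mm$dd                          # YYMMDD, sortable within 19xx only
case $ver in
    lu*)   status=compliant ;;         # all lu3.2 builds are y2k-compliant
    3.1.2) if [ "$key" -ge 980701 ]; then status=compliant
           else status=not-compliant; fi ;;
    *)     status=not-compliant ;;     # anything older than 3.1.2
esac
echo "$status"
```

Running it against the example line prints "compliant", since any lu release qualifies regardless of date.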
For more information on our year 2000 policy see the nmake year 2000 faq.
With the addition of the Bell Labs China developers to the team we have established a helpdesk for our Asian users. nmake users located in Asia can send email to firstname.lastname@example.org to contact our BLC team directly. This will help alleviate time zone differences between our users in Asia and the US helpdesk.
To aid in migrating from older releases of nmake, such as 2.2, to current releases, we have added the nmake 3.0 release notes to our web site. The 3.0 release notes identify the changes made from 2.2 to 3.0 and show the makefile changes necessary to move from 2.2 to 3.0. Their availability provides a missing link for folks upgrading from 2.1 or 2.2 to current releases. All the release notes (3.0, 3.1, 3.1.1, 3.1.2, and lu3.2) are available on the documentation page to aid a project in upgrading nmake releases. Of course, if you run into problems while upgrading, feel free to contact us with details at email@example.com so we can help.
For the next release we would like to improve probe's reliability in probing various compiler environments. We would like to identify key compilers, verify proper configuration with probe, and make any required modifications for the next release.
We need help from our users. In order to compile a list of key compilers and probe issues we are asking our users to email us the following information if any problems have been experienced with probe:
- Hardware/OS platform
- nmake version
- Compiler being used, including the vendor and the version
- Probe problems experienced with the compiler, corrections made to the probe files, and any other details available
If you have anything to contribute please send the above information to us at firstname.lastname@example.org. Thank you!
For HP-UX 10.01, 10.10, and 10.20, HP provides the file /usr/lib/year2000.o, which you link into your application to make it year 2000 compliant. The following text is excerpted from HP's /usr/share/doc/libc_y2k.txt, which contains all the details. We recommend you read through this file to understand the impact on your application:
When you install the libc 2000 rollover patches for any of the patched releases (10.01, 10.10, or 10.20), your applications will get the old libc behavior by default. This allows existing applications to maintain binary compatibility. The libc patch installs the /usr/lib/year2000.o file in addition to the standard libc variants. You need to create your software with this patch installed; you may also need year 2000 safe patches for products which yours depends upon.
If you want to get the new behavior for your software, you will be required to link against the file /usr/lib/year2000.o, in addition to compiling and linking against the patched libraries. If you don't link against the year2000.o file, the patches will assume you wish to get the old behavior, and your application will not be year 2000 safe with respect to these routines.
If you need to link all your HP-UX executables with /usr/lib/year2000.o, you may use the following in your project's global makefile:
LDLIBRARIES += /usr/lib/year2000.o
If you would like to link only some of your executables with year2000.o, you may put the above LDLIBRARIES definition in local makefiles as necessary, or specify /usr/lib/year2000.o as a prerequisite of the targets that need it.
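For example, a single program can pick up year2000.o by listing it as a prerequisite in that program's assertion. A minimal sketch, with hypothetical program and source names:

```
/* Link only "billing" with year2000.o; other executables are unaffected. */
billing :: billing.c dates.c /usr/lib/year2000.o
```

Targets whose assertions do not mention year2000.o continue to link with the default (old) libc behavior.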
Two sets of viewgraphs used by the team have been made available on our web site. They can be found in the viewgraph section of the documentation page. Other viewgraphs may be added as they become available. Currently we have the following:
- An Overview of the nmake Product Builder in PDF format. This gives an overview of main features for people who may not be familiar with the product.
- nmake Release lu3.2 in both HTML and PDF formats. This one focuses on the new features in release lu3.2.
In the previous issue we talked about using both a C and a C++ compiler in a single makefile. If you have a compiler that already compiles both C and C++ code you may think you have nothing to worry about, but unfortunately it isn't that easy. Remember, nmake comes with a C preprocessor to facilitate viewpathing for include files. Since this preprocessor is interfaced with compilers from many vendors and handles both C and C++ code, the preprocessor must know how to do the proper job in different environments. To facilitate this, nmake probes the compilation environment, and the preprocessor uses the results to preprocess source code in a specific manner for a given compiler.
Part of the probe process determines if source code should be preprocessed as C++ code or as C code. If your compiler compiles both C and C++ code, it will most likely look like a C compiler during the probe process. This means the preprocessor will end up preprocessing code for a C compiler even when building C++ code.
In order for nmake to understand that you want to compile C++ code, the compiler needs to be treated like two different compilers. Most compilers of this type compile for C or C++ depending on the source filename suffix, so file.c would be considered C code, and file.C or file.cc might be considered C++. However, there is usually an option to either force the compiler into C++ mode or to tell the compiler to treat a .c file as C++. The trick is to make this option part of the $(CC) nmake variable when using the compiler for C++. For example, let's say the compiler is cc and it has an option -C++ to force it into C++ mode. You would use the following setting when building C++ code:
CC = cc -C++
By making the option part of $(CC) the compiler will act as a C++ compiler during the probe process. This will allow the nmake preprocessor to preprocess for C++ code when using this CC setting.
When you want to compile C code, then you would use the normal CC setting with no options:
CC = cc
Now what happens when you want to build both C and C++ code in the same makefile? Use the following settings, and use :cc: to identify the C code as described in the previous newsletter:

CC = cc -C++
cc = cc

:cc: one.c three.c
Note that some of these compilers use an option in which the file suffixes to treat as C++ must be identified explicitly, rather than a simple option that forces the compiler into C++ mode. In this case you must at least specify .c files, but in some cases (such as with NCR's High Performance C/C++ compiler) it is also necessary to specify .i files. If ppcc is used for your compiles (you would see ppcc in the compile line output), then ppcc is actually calling the nmake preprocessor to create a .i file, which is then given to the compiler. When this happens, if the compiler is not told to treat a .i file as C++ code, you can get compile errors when building C++. For this reason both .c and .i files must be identified as C++ in the compiler option.
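As a sketch of this case, suppose the compiler's option for treating a given suffix as C++ were -Kc++=suffix (a hypothetical spelling; consult your compiler's manual for the real flag). The C++ setting would then name both suffixes:

```
/* Hypothetical option syntax: treat both .c and .i files as C++. */
CC = cc -Kc++=.c -Kc++=.i
cc = cc
```

With both suffixes named, the preprocessor's .i output is compiled as C++ just like the original .c source.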
- &= is used to add an auxiliary value to a variable to which a primary value has already been assigned with the =, :=, or += assignment operator. The auxiliary value is not saved in the statefile.
- &= is useful for writing Custom Scan Rules:
Let's say we have defined a custom scan rule for .bc-suffixed files, where the option used to specify the location of included files is "-d dir":
.ATTRIBUTE.%.bc : .SCAN.bc
.SOURCE.%.SCAN.bc : .FORCE $$(*.SOURCE.bc) $$(*.SOURCE)
.SCAN.bc : .SCAN
        . . .
Assuming we are using a flag, BCCFLAGS, for the tool that processes .bc files, we would add the "-d dir" options using the following:
BCCFLAGS &= $$(.INCLUDE. bc -d)
Note the use of "&=" instead of "+=".
The value of BCCFLAGS (which would be defined as a state variable in the assertion that operates on .bc files) obtained from scanning the files is really auxiliary. The real determinant of whether the files are out of date is not the value of BCCFLAGS obtained from scanning (i.e. the "-d dir" arguments) but rather the time stamps of the files found by scanning versus the time stamp of the file scanned.
Using &= instead of += matters because, in a viewpathed environment where nothing has changed, the value of BCCFLAGS may change when the same file is referenced by its relative directory path versus its absolute path. Since values assigned with &= are not stored in the statefile, this scenario does not cause unnecessary rebuilds.
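Putting the pieces together, a makefile fragment using this pattern might look like the following sketch. The bcc command and the %.o metarule are hypothetical stand-ins for whatever tool actually processes .bc files; the scan-rule lines follow the fragment shown above.

```
/* Custom scan rule for .bc files; ". . ." elides the scan definition. */
.ATTRIBUTE.%.bc : .SCAN.bc
.SOURCE.%.SCAN.bc : .FORCE $$(*.SOURCE.bc) $$(*.SOURCE)
.SCAN.bc : .SCAN
        . . .

/* Auxiliary assignment: "-d dir" options from the scan, not statefiled. */
BCCFLAGS &= $$(.INCLUDE. bc -d)

/* Hypothetical metarule; (BCCFLAGS) makes BCCFLAGS a state variable. */
%.o : %.bc (BCCFLAGS)
        bcc $(BCCFLAGS) -c $(>)
```

Because the scan-derived "-d dir" arguments enter BCCFLAGS through &=, a path-only change in a viewpathed build does not perturb the statefile and so does not trigger a rebuild.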
We are looking for feedback! Let us know what you think of the newsletter, how we may improve, or ideas you have for future issues. Send us a note at email@example.com. All ideas, suggestions, and comments are welcome (so far we have gotten zero, are you out there?).