The definitive set of testing instructions is given at . The material below is cut and pasted in markdown format from as of 2014-10-08. We encourage those who build PLplot to test both their build-tree version and their installed-examples-tree version and report any problems back to either the [plplot-general](https://lists.sourceforge.net/lists/listinfo/plplot-general) or [plplot-devel](https://lists.sourceforge.net/lists/listinfo/plplot-devel) mailing lists.

### Testing Prerequisites

The legacy test for the build tree requires ctest (available as part of CMake) and bash, either on Windows platforms such as [Cygwin](https://www.cygwin.com/) or [MSYS](http://www.mingw.org/wiki/msys) or on Unix platforms. In contrast to our legacy testing system, the new testing framework is essentially identical for both the build tree and the installed examples tree. It requires only CMake (using any generator that is suitable for the Unix or Windows platform being used for the test) and bash on Windows or Unix platforms.

### Build-tree tests

Build-tree tests can only be performed if cmake is invoked with the -DBUILD_TEST=ON option. Such tests are done from the top-level directory of the build tree (the directory where you invoke the cmake command that configures the build of PLplot). The methods for invoking these tests are given below for both the legacy and new testing methods.

These tests include executing all our 31 standard examples for all language interfaces and non-interactive device drivers that we currently support, which makes them a comprehensive test of the PLplot build. For example, our standard examples exercise virtually all of the PLplot API. Furthermore, this series of tests generates more than 2GB of plot files in various formats.

The tests also include a comparison of the PostScript (-dev psc) results and stdout from each language interface to PLplot with the corresponding results for C. In general, these results are identical, which is a stringent test of our language bindings. (Note it is the user's responsibility to ensure the locales are consistent for all languages, since inconsistent locales can produce inconsistent stdout results that have nothing to do with PLplot bindings or examples issues.)

You should search test results for obvious errors such as segfaults. In addition, you can test for rendering errors by viewing the file results (in the ctest_examples_output_dir subdirectory of the build tree for the ctest case, and the examples/test_examples_output_dir subdirectory of the build tree for the test_noninteractive target case) using an appropriate viewer.

Results for -dev psc are a special case. To illustrate this, here are typical -dev psc results for example 1: x01a.psc, x01c.psc, x01cxx.psc, x01d.psc, x01f.psc, x01j.psc, x01lua.psc, x01o.psc, x01ocaml.psc, x01p.psc, x01pdl.psc, and x01t.psc. These correspond to the Ada, C, C++, D, Fortran, Java, Lua, Octave, OCaml, Python, Perl/PDL, and Tcl -dev psc (colour PostScript) results for our standard example 1. The test referred to above compares everything in this list except x01c.psc against that file, so rendering errors only need to be looked for in x01c.psc (with your favorite PostScript viewing application) and in any of these files which the report shows are different from x01c.psc. And similarly for the -dev psc results for the rest of our 31 standard examples.
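For example, on a Unix platform one possible way to inspect the C reference result for rendering errors is to open it from the ctest output directory mentioned above with whatever PostScript viewer you prefer (gv below is only an illustration; ImageMagick's display works equally well):

`cd ctest_examples_output_dir`

`gv x01c.psc`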
Here are typical plot file results from our standard example 1 for devices other than -dev psc: x01c.pdfcairo, x01c.ps, x01c.psttf, x01c.psttfc, x01c01.bmpqt, x01c01.epsqt, x01c01.jpgqt, x01c01.pdfqt, x01c01.pngcairo, x01c01.pngqt, x01c01.ppmqt, x01c01.svg, x01c01.svgcairo, x01c01.svgqt, x01c01.tiffqt, and x01c01.xfig. Since a different device is involved in each case, these should be looked at individually for rendering errors (and similarly for the remaining 31 standard examples). Note such visual inspection is a huge job, so we certainly don't expect it of our testers very often, but doing it once every year or so, and especially for newer examples that haven't been checked before, is worthwhile. On Unix platforms a general all-purpose viewer for all these file formats is the [ImageMagick](http://www.imagemagick.org/) display application.

#### Invocation of legacy tests in the build tree

After running "make all" from the top level of the build tree, run

`ctest --verbose -j4 >& ctest.out`

This creates test plot file results in the ctest_examples_output_dir subdirectory of the build tree, and ctest.out should contain a table of comparisons of PostScript results from each of our standard examples and each of our language bindings against the corresponding C versions. ctest is capable of parallel execution, controlled by the -j option which decides how many jobs to run in parallel. A good rule of thumb is to specify a number which is 2 more than the number of CPUs on your computer. So -j3 would be suitable for a computer with one processor, -j4 for a computer with two processors, etc.

#### Invocation of tests in the build tree using the new testing framework

The new testing framework is implemented simply as normal targets for the build system that are invoked, e.g., for "Unix Makefiles" generators by

`make -j4 test_diff_psc >& test_diff_psc.out`

and similarly for other generators. Note for generators like "Unix Makefiles" which have parallel computation capability, the -j parameter controls how many jobs are run simultaneously, and this can save a large amount of time on computers with more than one CPU. For further remarks about setting the -j parameter see the discussion of that parameter above for the ctest case.

Three comprehensive test targets are test_diff_psc, test_noninteractive, and test_interactive. test_diff_psc generates all -dev psc results (which are stored in examples/test_examples_output_dir in the build tree) and compares them, producing the same report that is obtained from ctest. Note this target excludes anything but -dev psc results. test_noninteractive runs test_diff_psc as well as every other PLplot example that produces a file, so it is somewhat more comprehensive than legacy ctest and considerably more comprehensive than the test_diff_psc target. All file results from the test_noninteractive target are stored in examples/test_examples_output_dir in the build tree. test_interactive runs all interactive devices for the standard C examples as well as all special interactive examples. Very little user intervention is required to run these tests because, where possible, the PLplot -np (no-pause) command-line option is used for them.

The traditional ctest-based testing framework has one important advantage over the new testing framework, which is that ctest is a [Dart](http://public.kitware.com/dart/HTML/Index.shtml) client that can be used to automatically report nightly test results to a Dart server. (For further details see [the ctest man page](http://www.cmake.org/cmake/help/v3.0/manual/ctest.1.html).) Currently the PLplot development team does not use this automatic test reporting functionality of ctest, but we plan to in the future.
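For illustration only (this is standard ctest dashboard-client functionality rather than something the current PLplot test instructions require, and it assumes a dashboard server has been configured for the project), such a submission run would look something like

`ctest -D Experimental`

where "Experimental" could also be "Nightly" or "Continuous" depending on the kind of dashboard submission desired.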
The new testing framework also has advantages over the traditional ctest-based testing framework:

- It is implemented for both the PLplot build tree and installed examples tree CMake-based build systems. The implementation is done with common files for the two different build systems so that any changes in the new testing framework are immediately reflected for both build systems. It is in our future plans to implement ctest for the installed examples build system as well, but currently ctest is limited to the build system for the build tree.
- Full dependencies are implemented (unlike ctest, which requires the "all" target to be run before ctest is run).
- There are many individual test targets implemented which allow users to test PLplot with extremely fine-grained control.

To obtain a list of such targets, run the "help" target and select target names that have "test" in the name. For example, on Unix systems you can run `make help |grep test` to obtain the following results:

    ... test
    ... test-drv-info
    ... test_cairo_dyndriver
    ... test_dyndrivers
    ... test_mem_dyndriver
    ... test_ntk_dyndriver
    ... test_null_dyndriver
    ... test_pdf_dyndriver
    ... test_ps_dyndriver
    ... test_psttf_dyndriver
    ... test_qt_dyndriver
    ... test_svg_dyndriver
    ... test_tk_dyndriver
    ... test_tkwin_dyndriver
    ... test_wxwidgets_dyndriver
    ... test_xfig_dyndriver
    ... test_xwin_dyndriver
    ... test_ada_psc
    ... test_all_cairo
    ... test_all_qt
    ... test_c_bmpqt
    ... test_c_epscairo
    ... test_c_epsqt
    ... test_c_jpgqt
    ... test_c_ntk
    ... test_c_pdf
    ... test_c_pdfcairo
    ... test_c_pdfqt
    ... test_c_pngcairo
    ... test_c_pngqt
    ... test_c_ppmqt
    ... test_c_ps
    ... test_c_psc
    ... test_c_pscairo
    ... test_c_psttf
    ... test_c_psttfc
    ... test_c_qtwidget
    ... test_c_svg
    ... test_c_svgcairo
    ... test_c_svgqt
    ... test_c_tiffqt
    ... test_c_tk
    ... test_c_wxwidgets
    ... test_c_xcairo
    ... test_c_xfig
    ... test_c_xwin
    ... test_cxx_psc
    ... test_d_psc
    ... test_diff_psc
    ... test_extXdrawable
    ... test_extcairo
    ... test_fortran_psc
    ... test_interactive
    ... test_java_psc
    ... test_lua_psc
    ... test_noninteractive
    ... test_ocaml_psc
    ... test_octave_ntk
    ... test_octave_psc
    ... test_octave_qtwidget
    ... test_octave_tk
    ... test_octave_wxwidgets
    ... test_octave_xcairo
    ... test_octave_xwin
    ... test_plserver_runAllDemos
    ... test_plserver_standard_examples
    ... test_pltcl_standard_examples
    ... test_pyqt4_example
    ... test_python_psc
    ... test_qt_example
    ... test_tcl
    ... test_tcl_psc
    ... test_tclsh_standard_examples
    ... test_tk
    ... test_tk_01
    ... test_tk_02
    ... test_tk_03
    ... test_tk_04
    ... test_tk_plgrid
    ... test_wish_runAllDemos
    ... test_wish_standard_examples
    ... test_wxPLplotDemo
    ... ext-cairo-test
    ... test_plend
    ... clean_ctest_plot_files

The "test" target is supplied automatically by CMake and merely runs ctest. The rest of the target names are largely self-explanatory. For example, the "test_c_svg" target runs all our standard examples implemented in C using -dev svg to thoroughly test that device. Some general test targets, such as the already-mentioned test_diff_psc, test_noninteractive, and test_interactive targets as well as others such as test_all_cairo, test_all_qt, test_tcl, and test_tk, invoke other tests as dependencies.
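For example (assuming a "Unix Makefiles" generator and following the same output-capture convention used above), one of these fine-grained targets, such as test_c_svg, can be invoked with

`make -j4 test_c_svg >& test_c_svg.out`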
### Tests of the PLplot installation

After PLplot has been configured (with "cmake"), built (with the "all" target), and installed (with the "install" target), you can test the installation using a legacy test system (implemented with Make, pkg-config, and bash) or our new test framework (implemented with CMake and bash).

#### Legacy tests of the PLplot installation

You can test the PLplot installation on Unix systems with the following commands:

`cp -a $prefix/share/plplot$plplot_version/examples /tmp`

`cd /tmp/examples`

`make test_noninteractive >& make_test.out`

`make test_interactive`

where \$prefix is the installation prefix chosen at the configuration stage and \$plplot_version is the PLplot version (currently 5.10.0). The effect of the above "cp" and "cd" commands is to copy the examples subtree of the install tree to /tmp and to build and test the examples in the copied subtree, keeping the install tree clean. However, an alternative is to replace those two commands with

`cd $prefix/share/plplot$plplot_version/examples`

and build and test the install-tree examples right in the examples subtree of the install tree with the above "make" commands. Regardless of whether you build and test the examples in a copy of the examples subtree of the install tree or directly in that subtree, check all the \*.out files for any errors.

N.B. the above legacy "make test_noninteractive" command does essentially the same tests for the installed PLplot version as ctest does for the build-tree version of PLplot, and as the "test_noninteractive" target does for the CMake-based build systems of the build tree and installed examples tree. Note that the ctest method requires only the ctest command (bundled with CMake) and bash, and the CMake-based build systems should work for any CMake generator on any platform with access to bash. In comparison, the legacy installed-examples implementation specifically requires pkg-config, bash, and the "make" command, which is available on all Unix platforms but only on certain Windows platforms such as Cygwin and MSYS.

N.B. the above legacy "make test_interactive" command executes our interactive examples. The results are similar to those of the test_interactive target implemented with our new test framework, but the implementation is very different (depending on pkg-config and a special bash script rather than standard CMake cross-platform commands).

#### Cross-platform tests of the PLplot installation using the new test framework

Here is an example under Unix of how to test the PLplot installation using the new testing framework. (Those using Windows platforms should be able to infer the equivalent of these commands.)

`mkdir /tmp/build_dir`

`cd /tmp/build_dir`

`cmake $prefix/share/plplot$plplot_version/examples`

`make -j4 test_diff_psc >& make_test_diff_psc.out`

`make -j4 test_noninteractive >& make_test_noninteractive.out`

`make -j4 test_interactive >& make_test_interactive.out`

Note these targets are essentially identical to the targets described above for the new test framework for the build tree because the same bash scripts and essentially the same CMake logic are used to set up these targets. Similarly, all other fine-grained targets (which you can discover using the `make help |grep test` command) for the new test framework are available in this context as well. N.B. the test_noninteractive and test_interactive targets available here are more comprehensive than the same-named targets in the legacy installation test case and are implemented in a quite different (cross-platform) way with much better dependencies.
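As with the build-tree and legacy installation tests, it is worth scanning the captured output files for obvious problems; for example, on Unix a rough (illustrative, not exhaustive) check of the files produced by the commands above is

`grep -i -E 'error|segmentation' *.out`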
### Testing Reports (in reverse chronological order)

|||
---|---|---|---
Tester|Alan W. Irwin
Notes|(a), (b), (c), (d), (e), (A)
Date|2014-09-05
PLplot commit|d6c5b9
CMake version|2.8.12.1
Platform|Debian stable with epa_built libraries
Pango/Cairo version|1.35/1.12.14
Qt version|5.3.1
Shared libraries?|Dynamic drivers?
Yes|Yes
Yes|No
No|No
CMake-based build tree?|test_noninteractive?|test_interactive?|ctest?
Yes|Yes|Yes|Yes
CMake-based installed examples?|test_noninteractive?|test_interactive?
Yes|Yes|Yes
Traditional installed examples?|test_noninteractive?|test_interactive?
Yes|Yes|Yes

|||
---|---|---|---
Tester|Alan W. Irwin
Notes|(a), (b), (c), (d), (e), (f), (A)
Date|2014-03-12
PLplot commit|3e817e
CMake version|2.8.12.1
Platform|MinGW-4.7.2/MSYS/Wine-1.6.1 with epa_built libraries
Pango/Cairo version|none
Qt version|none
Shared libraries?|Dynamic drivers?
Yes|Yes
Yes|No
No|No
CMake-based build tree?|test_noninteractive?|test_interactive?|ctest?
Yes|Yes|Yes|Yes
CMake-based installed examples?|test_noninteractive?|test_interactive?
Yes|Yes|Yes
Traditional installed examples?|test_noninteractive?|test_interactive?
Yes|Yes|Yes

Testing notes (lower-case notes concern configuration and build options while upper-case notes concern errors):

* (a) Testing done on 64-bit (AMD64) hardware.
* (b) Used the parallel make option (-j4) for all builds, installs, and test targets.
* (c) Suitable dependent libraries have been installed on the system, so there are no device drivers from the default list for this platform that are missing from this test.
* (d) Suitable compilers and bindings-related development packages have been installed on the system, so there are no default bindings that are missing from this test.
* (e) The Java, Python, Octave, Lua, and OCaml bindings/examples require shared PLplot libraries in order to work, so they were not available for testing in the static PLplot libraries case.
* (f) For this MinGW/MSYS/Wine platform test the build_plplot_lite epa_build target was used, which simplifies testing by excluding the qt, cairo, and wxwidgets devices from the tests. This was a "lite" test in other respects as well; i.e., the D, Java, OCaml, and Octave languages were not available on this platform.
* (A) No obvious configure, build, or install errors. No run-time errors in tests other than those noted in additional "upper-case" notes (if any).