Complexity in computer systems architectures and programming environments has increased over the last decade. Not only has the number of cores grown considerably, but heterogeneous computation has also been widely adopted. Heterogeneous computation uses acceleration devices to speed up segments of computation that fit their capabilities. Despite the clear performance advantages that heterogeneous computation brings, it requires additional orchestration between the different hardware architectures and resources, which places a considerable burden on the programmer. As a result, directive-based programming models and frameworks have been created and adapted to support heterogeneous computation. In particular, OpenMP, one of the most widely used parallel programming frameworks, introduced heterogeneous computation in version 4.5 of its specification.
However, increased complexity in hardware and programming models is expected to introduce new challenges for software testing and reliability. Fortunately, in the case of programming models, programming languages, and programming frameworks, it is common to find specifications that serve as a set of “rules to follow” for implementation developers. These rules not only allow users to move easily from implementation to implementation, but they are also a great starting point for testing, verification, and validation of compliance with the specifications. In the case of OpenMP, there is a need for a common testing framework that allows compiler implementers and users to measure their level of compliance with the specification.
As part of the Exascale Computing Project, the SOLLVE project (Scaling OpenMP Via LLVM for Exascale Performance and Portability) is developing a framework that uses simple test cases (e.g., unit tests, functional tests, and micro-applications) to assess the compliance of compiler implementations and system architectures. This website summarizes this effort. It presents the OpenMP Validation and Verification (OMPVV) suite, which comprises not only the set of tests but also the community that develops and ensures the quality of these tests. This website contains publications and documentation, as well as the results we have obtained throughout the development of this project.
For project-related questions, contact us: Thomas Huber (email@example.com), Swaroop Pophale (firstname.lastname@example.org), Nolan Baker (email@example.com), Jaydon Reap, Kristina Holsapple, Michael Carr, Nikhil Rao, Seyong Lee, David Bernholdt, and Sunita Chandrasekaran (firstname.lastname@example.org).
Past members: Jose Diaz, Josh Davis, Oscar Hernandez
See the Acknowledgement and Citation information for this project.