An object (dataset, group, attribute, or named datatype) in an HDF5 file can be opened, and it can be opened more than once. When an object is opened, the HDF5 library returns a unique identifier to the application. Every identifier that is returned must eventually be closed. If an object was opened more than once, each identifier must be closed: for example, if a dataset was opened twice, both dataset identifiers must be released (closed) before the dataset is considered closed. Suppose an application has opened a file, a group in the file, and two datasets in the group. For the file to be completely closed, the file, the group, and both datasets must each be closed. Closing the file before the group or the datasets does not affect their state: the group and datasets remain open.
There are several exceptions to the above general rule:
One is the H5close function. H5close causes a general shutdown of the library: all data is written to disk, all open identifiers are closed, and all memory used by the library is freed.
Another exception occurs on parallel processing systems. Suppose that, on a parallel system, an application has opened a file, a group in the file, and two datasets in the group. If the application calls H5Fclose to close the file while the other objects are still open, the call will fail with an error. The open group and datasets must be closed before the file can be closed.
A third exception is when the file access property list specifies the file close degree H5F_CLOSE_STRONG. This property is set before the file is opened, and it causes all objects remaining open in the file to be closed when the file is closed with H5Fclose. For more information, see the H5Pset_fclose_degree function in the HDF5 Reference Manual.