
H5D_GET_STORAGE_SIZE

Returns the amount of storage allocated for a dataset.

Procedure:

H5D_GET_STORAGE_SIZE(dataset_id)

Signature:

hsize_t H5Dget_storage_size( hid_t dataset_id )

SUBROUTINE h5dget_storage_size_f(dset_id, size, hdferr)
  IMPLICIT NONE
  INTEGER(HID_T), INTENT(IN) :: dset_id  ! Dataset identifier  
  INTEGER(HSIZE_T), INTENT(OUT)  :: size ! Amount of storage required 
                                         ! for dataset
  INTEGER, INTENT(OUT) :: hdferr         ! Error code  
                                         ! 0 on success and -1 on failure
END SUBROUTINE h5dget_storage_size_f

Parameters:
hid_t dataset_id    IN: Identifier of the dataset to query.

Description:

H5Dget_storage_size returns the amount of storage, in bytes, that is allocated in the file for the raw data of the dataset specified by dataset_id.

Note that the amount of storage in this case is the storage allocated in the written file, which will typically differ from the space required to hold a dataset in working memory.

  • For contiguous datasets, the returned size equals the current allocated size of the raw data.
  • For unfiltered chunked datasets, the returned size is the number of allocated chunks times the chunk size.
  • For filtered chunked datasets, the returned size is the space required to store the filtered data. For example, if a compression filter is in use, H5Dget_storage_size returns the total space required to store the compressed chunks; the sketch after this list illustrates this case.
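
To illustrate the filtered case, the following sketch (the file name "filtered.h5" and dataset name "DS1" are placeholders, and it assumes an HDF5 library built with the deflate/zlib filter) creates a chunked, gzip-compressed dataset, writes highly compressible data, and reopens the file before querying H5Dget_storage_size so that all chunks have been written to the file; the reported size reflects the compressed chunks rather than the uncompressed element data.

#include "hdf5.h"
#include <stdio.h>

#define DIM0 64
#define DIM1 64

int
main (void)
{
    hsize_t     dims[2]  = {DIM0, DIM1};
    hsize_t     chunk[2] = {16, 16};
    int         data[DIM0][DIM1];
    int         i, j;

    /* Highly compressible data: every element holds the same value. */
    for (i = 0; i < DIM0; i++)
        for (j = 0; j < DIM1; j++)
            data[i][j] = 7;

    /* "filtered.h5" and "DS1" are placeholder names. */
    hid_t file_id  = H5Fcreate ("filtered.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t space_id = H5Screate_simple (2, dims, NULL);
    hid_t dcpl_id  = H5Pcreate (H5P_DATASET_CREATE);

    H5Pset_chunk (dcpl_id, 2, chunk);      /* chunked layout       */
    H5Pset_deflate (dcpl_id, 6);           /* gzip filter, level 6 */

    hid_t dset_id = H5Dcreate2 (file_id, "DS1", H5T_NATIVE_INT, space_id,
                                H5P_DEFAULT, dcpl_id, H5P_DEFAULT);
    H5Dwrite (dset_id, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data);

    H5Dclose (dset_id);
    H5Pclose (dcpl_id);
    H5Sclose (space_id);
    H5Fclose (file_id);

    /* Reopen and query: the reported size is the space occupied by the
       compressed chunks, not DIM0 * DIM1 * sizeof(int). */
    file_id = H5Fopen ("filtered.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
    dset_id = H5Dopen2 (file_id, "DS1", H5P_DEFAULT);

    hsize_t storage = H5Dget_storage_size (dset_id);
    printf ("Filtered storage: %llu bytes (element data: %zu bytes)\n",
            (unsigned long long) storage, sizeof (data));

    H5Dclose (dset_id);
    H5Fclose (file_id);
    return 0;
}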

H5Dget_storage_size reports only the space required to store the data; the report does not include any metadata.

The return value may be zero if no data has been stored.

Note that H5Dget_storage_size is not generally an appropriate function to use when determining the amount of memory required to work with a dataset. In such circumstances, you must determine the number of data points in a dataset and the size of an individual data element. H5Sget_simple_extent_npoints and H5Tget_size can be used to get that information.
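
For example, a minimal helper along the following lines combines those calls to compute the in-memory size of a dataset's raw data (dataset_memory_size is a hypothetical name, not part of the HDF5 API):

#include "hdf5.h"

/* Sketch: bytes needed in memory for the raw data of an open dataset.
   dataset_memory_size is a hypothetical helper, not part of the HDF5 API.
   Assumes dset_id is a valid dataset identifier; returns 0 on failure. */
static hsize_t
dataset_memory_size (hid_t dset_id)
{
    hid_t    space_id = H5Dget_space (dset_id);   /* dataspace of the dataset */
    hid_t    type_id  = H5Dget_type (dset_id);    /* datatype of the dataset  */
    hssize_t npoints  = H5Sget_simple_extent_npoints (space_id);
    size_t   elem     = H5Tget_size (type_id);    /* size of one element      */

    H5Tclose (type_id);
    H5Sclose (space_id);

    if (npoints < 0 || elem == 0)
        return 0;
    return (hsize_t) npoints * elem;
}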

Returns:

Returns the amount of storage space, in bytes, allocated for the dataset's raw data, not counting metadata, if successful; otherwise returns 0 (zero).

Note that H5Dget_storage_size does not distinguish between these two cases: a return value of 0 (zero) can mean either that no data has been stored in the dataset or that an error occurred.

Example:
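
A minimal sketch of typical usage (the file name "example.h5" and dataset name "IntArray" are placeholders): the program creates and writes a small contiguous integer dataset, then prints the value returned by H5Dget_storage_size, which for a contiguous, unfiltered dataset is the current allocated size of the raw data.

#include "hdf5.h"
#include <stdio.h>

#define NX 4
#define NY 6

int
main (void)
{
    hsize_t     dims[2] = {NX, NY};
    int         data[NX][NY];
    int         i, j;

    for (i = 0; i < NX; i++)
        for (j = 0; j < NY; j++)
            data[i][j] = i * NY + j;

    /* "example.h5" and "IntArray" are placeholder names. */
    hid_t file_id  = H5Fcreate ("example.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t space_id = H5Screate_simple (2, dims, NULL);
    hid_t dset_id  = H5Dcreate2 (file_id, "IntArray", H5T_NATIVE_INT, space_id,
                                 H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    H5Dwrite (dset_id, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data);

    /* Storage allocated in the file for the raw data, in bytes.
       For this contiguous, unfiltered dataset it is NX * NY * sizeof(int)
       once the data has been written. */
    hsize_t storage = H5Dget_storage_size (dset_id);
    printf ("Storage allocated for IntArray: %llu bytes\n",
            (unsigned long long) storage);

    H5Dclose (dset_id);
    H5Sclose (space_id);
    H5Fclose (file_id);
    return 0;
}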

History:
Release    Fortran90
1.4.5      Function introduced in this release.
