HDF5 throws errors unless I inline a subroutine - Fortran

I recently started exploring the HDF5 library for Fortran and came across some curious behaviour that I don't understand. The code below compiles just fine, but when I run it I get a very long list of errors originating from the subroutine HDF_init_ot. When I inline this subroutine into its caller HDF_init (as I did in HDF_init2), everything works perfectly fine. Why does this happen?
The code:
!===============================================================================
! This subroutine makes a call to an external subroutine HDF_init_ot,
! which throws an error
!===============================================================================
subroutine HDF_init()
use hdf5
character(len=11), parameter :: filename = "output.hdf5"
character(len=2), parameter :: group_ot = "ot"
integer(HID_T) :: file_handle, ot_handle
integer :: error
! Create a new HDF file
call h5open_f(error)
call h5fcreate_f(filename, H5F_ACC_TRUNC_F, file_handle, error)
! Create a new group
call h5gcreate_f(file_handle, group_ot, ot_handle, error)
! Call to subroutine
call HDF_init_ot(ot_handle)
! Close group
call h5gclose_f(ot_handle, error)
! Close file
call h5fclose_f(file_handle, error)
call h5close_f(error)
end subroutine HDF_init
!===============================================================================
! The culprit subroutine
!===============================================================================
subroutine HDF_init_ot(ot_handle)
use hdf5
character(len=1), parameter :: dset_t = "t"
integer(HID_T) :: ot_handle, ot_space_handle
integer(HID_T) :: data_handle
integer(HSIZE_T), dimension(1) :: ot_dims, ot_max_dims
integer :: error, rank
! Define rank and dimensions
rank = 1
ot_dims = (/0/)
ot_max_dims = (/H5S_UNLIMITED_F/)
! Create a data space
call h5screate_simple_f(rank, ot_dims, ot_space_handle, error, ot_max_dims)
! Create a data set within the space
call h5dcreate_f( ot_handle, dset_t, H5T_NATIVE_DOUBLE, ot_space_handle, &
data_handle, error)
! Close the data set
call h5dclose_f(data_handle, error)
! Close the data space
call h5sclose_f(ot_space_handle, error)
end subroutine HDF_init_ot
!===============================================================================
! When I inline HDF_init_ot into the subroutine below, everything works
!===============================================================================
subroutine HDF_init2()
use hdf5
character(len=11), parameter :: filename = "output.hdf5"
character(len=2), parameter :: group_ot = "ot"
character(len=1), parameter :: dset_t = "t"
integer(HID_T) :: file_handle, ot_handle, ot_space_handle, data_handle
integer(HSIZE_T), dimension(1) :: ot_dims, ot_max_dims
integer :: error, rank
rank = 1
ot_dims = (/0/)
ot_max_dims = (/H5S_UNLIMITED_F/)
call h5open_f(error)
call h5fcreate_f(filename, H5F_ACC_TRUNC_F, file_handle, error)
call h5gcreate_f(file_handle, group_ot, ot_handle, error)
call h5screate_simple_f(rank, ot_dims, ot_space_handle, error, ot_max_dims)
call h5dcreate_f( ot_handle, dset_t, H5T_NATIVE_DOUBLE, ot_space_handle, &
data_handle, error)
call h5dclose_f(data_handle, error)
call h5sclose_f(ot_space_handle, error)
call h5gclose_f(ot_handle, error)
call h5fclose_f(file_handle, error)
call h5close_f(error)
end subroutine HDF_init2
!===============================================================================
! Main program
!===============================================================================
program test
implicit none
! This one throws errors
call HDF_init()
! This one does not
call HDF_init2()
end program test
The error:
HDF5-DIAG: Error detected in HDF5 (1.8.16) thread 140274482874112:
#000: ../../../src/H5D.c line 194 in H5Dcreate2(): unable to create dataset
major: Dataset
minor: Unable to initialize object
#001: ../../../src/H5Dint.c line 453 in H5D__create_named(): unable to create and link to dataset
major: Dataset
minor: Unable to initialize object
#002: ../../../src/H5L.c line 1638 in H5L_link_object(): unable to create new link to object
major: Links
minor: Unable to initialize object
#003: ../../../src/H5L.c line 1882 in H5L_create_real(): can't insert link
major: Symbol table
minor: Unable to insert object
#004: ../../../src/H5Gtraverse.c line 861 in H5G_traverse(): internal path traversal failed
major: Symbol table
minor: Object not found
#005: ../../../src/H5Gtraverse.c line 641 in H5G_traverse_real(): traversal operator failed
major: Symbol table
minor: Callback failed
#006: ../../../src/H5L.c line 1685 in H5L_link_cb(): unable to create object
major: Object header
minor: Unable to initialize object
#007: ../../../src/H5O.c line 3016 in H5O_obj_create(): unable to open object
major: Object header
minor: Can't open object
#008: ../../../src/H5Doh.c line 293 in H5O__dset_create(): unable to create dataset
major: Dataset
minor: Unable to initialize object
#009: ../../../src/H5Dint.c line 1056 in H5D__create(): unable to construct layout information
major: Dataset
minor: Unable to initialize object
#010: ../../../src/H5Dcontig.c line 422 in H5D__contig_construct(): extendible contiguous non-external dataset
major: Dataset
minor: Feature is unsupported
HDF5-DIAG: Error detected in HDF5 (1.8.16) thread 140274482874112:
#000: ../../../src/H5D.c line 415 in H5Dclose(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
Edit:
I found that when I remove ot_max_dims from the h5screate_simple_f call, everything runs as expected. Curiously, if I instead define ot_max_dims within HDF_init and pass it to HDF_init_ot as an extra argument, things also work. This suggests to me that H5S_UNLIMITED_F behaves like a parameter that is private to HDF_init(), possibly tied to the API instance (i.e. to h5open_f)?
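For reference, the second workaround looks roughly like this (only a sketch of what I tried, with the subroutine renamed so it is not confused with the version above): ot_max_dims is declared and filled in HDF_init after h5open_f and handed down as a dummy argument, so the subroutine itself never reads H5S_UNLIMITED_F.
!===============================================================================
! Sketch of the working variant: ot_max_dims comes in as a dummy argument
!===============================================================================
subroutine HDF_init_ot_b(ot_handle, ot_max_dims)
use hdf5
character(len=1), parameter :: dset_t = "t"
integer(HID_T) :: ot_handle, ot_space_handle, data_handle
integer(HSIZE_T), dimension(1) :: ot_dims
integer(HSIZE_T), dimension(1) :: ot_max_dims   ! now a dummy argument
integer :: error, rank
rank = 1
ot_dims = (/0/)
call h5screate_simple_f(rank, ot_dims, ot_space_handle, error, ot_max_dims)
call h5dcreate_f(ot_handle, dset_t, H5T_NATIVE_DOUBLE, ot_space_handle, &
data_handle, error)
call h5dclose_f(data_handle, error)
call h5sclose_f(ot_space_handle, error)
end subroutine HDF_init_ot_b
In HDF_init I then declare integer(HSIZE_T), dimension(1) :: ot_max_dims, set ot_max_dims = (/H5S_UNLIMITED_F/) after h5open_f, and call HDF_init_ot_b(ot_handle, ot_max_dims) instead of HDF_init_ot(ot_handle).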

Related

HDF5 error: unable to lock file, Resource temporarily unavailable

I have an application which writes a new HDF file at every 100th iteration of a loop.
There is normal output at steps 0, 100, 200, 300, and 400, but at step 500 I get
HDF5-DIAG: Error detected in HDF5 (1.10.5) thread 0:
#000: H5F.c line 444 in H5Fcreate(): unable to create file
major: File accessibilty
minor: Unable to open file
#001: H5Fint.c line 1567 in H5F_open(): unable to lock the file
major: File accessibilty
minor: Unable to open file
#002: H5FD.c line 1640 in H5FD_lock(): driver lock request failed
major: Virtual File Layer
minor: Can't update object
#003: H5FDsec2.c line 959 in H5FD_sec2_lock(): unable to lock file, errno = 11, error message = 'Resource temporarily unavailable'
major: File accessibilty
minor: Bad file ID accessed
The error happens in the call to H5Fcreate.
No matter how many outputs I make before step 500 (e.g. output at every step, no output at all, or one output at step 499), all outputs are correct, but at step 500 I get the above error message.
I checked: all HDF files that get created are closed again immediately after writing.
(all HDF5 calls go through wrapper functions which write to a logfile the handles of the files being created, opened and closed)
Under what circumstances does this error message occur?
Is there a way to find out what exactly the problem is?
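For what it's worth, the wrapper functions mentioned above look roughly like this (a minimal sketch only; the subroutine name and the logfile unit are made up):
! Hypothetical wrapper around h5fcreate_f: it forwards the call and records the
! returned file handle and status in a logfile, so create/close pairs can be checked.
subroutine logged_h5fcreate(filename, file_id, error)
use hdf5
implicit none
character(len=*), intent(in) :: filename
integer(HID_T), intent(out)  :: file_id
integer, intent(out)         :: error
integer, parameter :: log_unit = 99   ! made-up unit, assumed to be opened elsewhere
call h5fcreate_f(filename, H5F_ACC_TRUNC_F, file_id, error)
write(log_unit,*) 'create ', trim(filename), ' handle=', file_id, ' status=', error
end subroutine logged_h5fcreate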

Reading a large HDF file using Fortran

I have some HDF data that has been created with PyTables. This data is very large, an array of 3973850000 x 8 double precision values, but with PyTables compression this can easily be stored.
I want to access this data using Fortran. I do,
PROGRAM HDF_READ
USE HDF5
IMPLICIT NONE
CHARACTER(LEN=100), PARAMETER :: filename = 'example.h5'
CHARACTER(LEN=100), PARAMETER :: dsetname = 'example_dset.h5'
INTEGER :: error
INTEGER(HID_T) :: file_id
INTEGER(HID_T) :: dset_id
INTEGER(HID_T) :: space_id
INTEGER(HSIZE_T), DIMENSION(2) :: data_dims, max_dims
DOUBLE PRECISION, DIMENSION(:,:), ALLOCATABLE :: dset_data
! Initialize the Fortran interface
CALL h5open_f(error)
! Open an existing file
CALL h5fopen_f(filename, H5F_ACC_RDONLY_F, file_id, error)
! Open a dataset
CALL h5dopen_f(file_id, dsetname, dset_id, error)
! Get the dataspace ID
CALL h5dget_space_f(dset_id, space_id, error)
! Get the dataspace dimensions
CALL h5sget_simple_extent_dims_f(space_id, data_dims, max_dims, error)
! Create an array to read into
ALLOCATE(dset_data(data_dims(1), data_dims(2)))
! Get the data
CALL h5dread_f(dset_id, H5T_NATIVE_DOUBLE, dset_data, data_dims, error)
END PROGRAM HDF_READ
However, this creates an obvious problem: an array of this size in double precision cannot be allocated, as it is larger than the system memory.
What is the best method for accessing this data? My current thought is to use some sort of chunking method, reading the data in pieces. Or is there a way to keep the array on disk? Does HDF5 have methods for dealing with data this large? I have read around but can find nothing pertaining to my case.
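One pattern that matches the "chunking method" idea is to read the dataset one hyperslab at a time, so that only a small slice is ever in memory. A rough sketch continuing from the program above (the declarations would sit with the others at the top; the slab size is made up, the leftover rows and all error checking are omitted, and which of the two dimensions is the long one depends on how PyTables wrote the file):
! Read the dataset slab by slab instead of all at once
INTEGER(HSIZE_T), DIMENSION(2) :: offset, counts
INTEGER(HID_T) :: memspace
INTEGER(HSIZE_T), PARAMETER :: slab_rows = 1000000   ! made-up slab size
DOUBLE PRECISION, DIMENSION(:,:), ALLOCATABLE :: slab
INTEGER(HSIZE_T) :: row
ALLOCATE(slab(slab_rows, 8))
counts = (/ slab_rows, 8_HSIZE_T /)
! Memory dataspace describing one slab
CALL h5screate_simple_f(2, counts, memspace, error)
DO row = 0, data_dims(1) - slab_rows, slab_rows
   offset = (/ row, 0_HSIZE_T /)
   ! Select the current slab in the file dataspace ...
   CALL h5sselect_hyperslab_f(space_id, H5S_SELECT_SET_F, offset, counts, error)
   ! ... and read only that slab into the small buffer
   CALL h5dread_f(dset_id, H5T_NATIVE_DOUBLE, slab, counts, error, &
                  mem_space_id=memspace, file_space_id=space_id)
   ! process slab here
END DO
CALL h5sclose_f(memspace, error)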

Multiple communicators in MPI

The background of this question comes from computational areas such as Computational Fluid Dynamics (CFD). We often need a finer mesh/grid in some critical regions while the background mesh can be coarser, for example adaptive mesh refinement to track shock waves, or nested domains in meteorology.
A Cartesian topology is used, and the domain decomposition is shown in the following sketch. In this case 4*2=8 processors are used. The single number is the processor's rank and (x,y) is its topological coordinate.
Assume the mesh is refined in the regions with ranks 2, 3, 4, 5 (in the middle), with a local refinement ratio of R=D_coarse/D_fine=2 in this case. Since the mesh is refined, the time advancement should be refined as well: in the refined region the time steps t, t+1/2*dt, and t+dt have to be computed, while only the time steps t and t+dt are computed in the global region. This requires a smaller communicator, containing only the ranks in the middle, for the extra computation. A sketch of the global ranks + coordinates and the corresponding local ones (in red) is shown below:
However, I get some errors in my implementation of this scheme; a (not complete) code snippet in Fortran is shown below:
integer :: global_comm, local_comm ! global and local communicators
integer :: global_rank, local_rank !
integer :: global_grp, local_grp ! global and local groups
integer :: ranks(4) ! ranks in the refined region
integer :: dim ! dimension
integer :: left(-2:2), right(-2:2) ! ranks of neighbouring processors in 2 directions
ranks=[2,3,4,5]
!---- Make global communicator and their topological relationship
call mpi_init(ierr)
call mpi_cart_create(MPI_COMM_WORLD, 2, [4,2], [.false., .false.], .true., global_comm, ierr)
call mpi_comm_rank(global_comm, global_rank, ierr)
do dim=1, 2
call mpi_cart_shift(global_comm, dim-1, 1, left(-dim), right(dim), ierr)
end do
!---- make local communicator and its topological relationship
! Here I use group and create communicator
! create global group
call mpi_comm_group(MPI_COMM_WORLD, global_grp, ierr)
! extract 4 ranks from global group to make a local group
call mpi_group_incl(global_grp, 4, ranks, local_grp, ierr)
! make new communicator based on local group
call mpi_comm_create(MPI_COMM_WORLD, local_grp, local_comm, ierr)
! make topology for local communicator
call mpi_cart_create(global_comm, 2, [2,2], [.false., .false.], .true., local_comm, ierr)
! **** get rank for local communicator
call mpi_comm_rank(local_comm, local_rank, ierr)
! Do the same thing to make topological relationship as before in local communicator.
...
When I run the program, the problem comes from the '**** get rank for local communicator' step. My idea is to build two communicators, a global one and a local one, with the local one embedded in the global one, and then to create the corresponding topological relationships in each of them. I do not know whether my concept is wrong or some syntax is wrong. Thank you very much if you can give me some suggestions.
The error message is
*** An error occurred in MPI_Comm_rank
*** reported by process [817692673,4]
*** on communicator MPI_COMM_WORLD
*** MPI_ERR_COMM: invalid communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
You are creating a 2x2 Cartesian topology from the group of the global communicator, which contains eight ranks. Therefore, in four of them the value of local_comm as returned by MPI_Cart_create will be MPI_COMM_NULL. Calling MPI_Comm_rank on the null communicator results in the error.
If I understand your logic correctly, you should instead do something like:
if (local_comm /= MPI_COMM_NULL) then
! make topology for local communicator
call mpi_cart_create(local_comm, 2, [2,2], [.false., .false.], .true., &
local_cart_comm, ierr)
! **** get rank for local communicator
call mpi_comm_rank(local_cart_comm, local_rank, ierr)
...
end if

Can't write HDF5 file with vector bigger than 2^13

I'm using C++ and HDF5 to write a file, but I run into problems with it. This is the code I use:
void fileRead::writeFile(string name, const vector<double>* data) {
int dimn = data->size();
hsize_t dim[1] = {data->size()}; //-> 2^13!!!
hid_t sid = H5Pcreate(H5P_DATASET_CREATE);
hid_t didProp = H5Screate_simple(1,dim,NULL);
H5Pset_layout(sid, H5D_COMPACT);
hid_t did = H5Dcreate(fid, name.c_str(),H5T_IEEE_F64LE, didProp, H5P_DEFAULT, sid,H5P_DEFAULT);
H5Dwrite (did, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, &(data->at(0)));
H5Dclose(did);
H5Sclose(didProp);
H5Pclose(sid);
}
But this gives me this error message:
HDF5-DIAG: Error detected in HDF5 (1.8.10) thread 0:
#000: /pub/devel/hdf5/hdf5-1.8.10-1/src/hdf5-1.8.10/src/H5D.c line 170 in H5Dcreate2(): unable to create dataset
major: Dataset
minor: Unable to initialize object
#001: /pub/devel/hdf5/hdf5-1.8.10-1/src/hdf5-1.8.10/src/H5Dint.c line 439 in H5D__create_named(): unable to create and link to dataset
major: Dataset
minor: Unable to initialize object
#002: /pub/devel/hdf5/hdf5-1.8.10-1/src/hdf5-1.8.10/src/H5L.c line 1638 in H5L_link_object(): unable to create new link to object
major: Links
minor: Unable to initialize object
#003: /pub/devel/hdf5/hdf5-1.8.10-1/src/hdf5-1.8.10/src/H5L.c line 1882 in H5L_create_real(): can't insert link
major: Symbol table
minor: Unable to insert object
#004: /pub/devel/hdf5/hdf5-1.8.10-1/src/hdf5-1.8.10/src/H5Gtraverse.c line 861 in H5G_traverse(): internal path traversal failed
major: Symbol table
minor: Object not found
#005: /pub/devel/hdf5/hdf5-1.8.10-1/src/hdf5-1.8.10/src/H5Gtraverse.c line 641 in H5G_traverse_real(): traversal operator failed
major: Symbol table
minor: Callback failed
#006: /pub/devel/hdf5/hdf5-1.8.10-1/src/hdf5-1.8.10/src/H5L.c line 1685 in H5L_link_cb(): unable to create object
major: Object header
minor: Unable to initialize object
#007: /pub/devel/hdf5/hdf5-1.8.10-1/src/hdf5-1.8.10/src/H5O.c line 3015 in H5O_obj_create(): unable to open object
major: Object header
minor: Can't open object
#008: /pub/devel/hdf5/hdf5-1.8.10-1/src/hdf5-1.8.10/src/H5Doh.c line 293 in H5O__dset_create(): unable to create dataset
major: Dataset
minor: Unable to initialize object
#009: /pub/devel/hdf5/hdf5-1.8.10-1/src/hdf5-1.8.10/src/H5Dint.c line 1044 in H5D__create(): unable to construct layout information
major: Dataset
minor: Unable to initialize object
#010: /pub/devel/hdf5/hdf5-1.8.10-1/src/hdf5-1.8.10/src/H5Dcompact.c line 212 in H5D__compact_construct(): compact dataset size is bigger than header message maximum size
major: Dataset
minor: Unable to initialize object
HDF5-DIAG: Error detected in HDF5 (1.8.10) thread 0:
#000: /pub/devel/hdf5/hdf5-1.8.10-1/src/hdf5-1.8.10/src/H5D.c line 391 in H5Dclose(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
This happens for all vector sizes >= 2^13 (8192), which puzzles me since reading bigger files causes no problems and 2^13 is still a rather small number, so something must be fishy with my code.
Any help would be appreciated.
From the documentation for the H5D_COMPACT parameter for H5Pset_layout:
Store raw data in the dataset object header in file. This should only
be used for datasets with small amounts of raw data. The raw data size
limit is 64K (65520 bytes). Attempting to create a dataset with raw
data larger than this limit will cause the H5Dcreate call to fail.
So if your doubles are 8 bytes, 2^13 values is 8192 x 8 = 65536 bytes, which is just over that limit.
You need to use one of the other storage options, contiguous or chunked.

Error: Two main PROGRAMs at (1) and (2)

I'm using the Simply Fortran compiler and when I try to compile I get the error:
prog.f95:35.13:
 Implicit None
             1
prog.f95:53.65:
 open (unit=1,file='in',status='OLD') ! opens file with parameters
                                                                 2
Error: Two main PROGRAMs at (1) and (2)
I have included only the parts of the code where the errors occur, since the whole thing is quite long. The excerpt below is the very beginning of the program. Let me know if I should include more.
Implicit None
Integer :: i,j,iter
real(8) :: Elow,Ehigh,chi,B_NS,Vbrprof,Neprof,taues
real(8) :: Xcyclave,a
character(8) systemdate
character(10) systemtime
character(5) timezone
integer dateandtime(8)
character(8) systemdate2
character(10) systemtime2
character(5) timezone2
integer dateandtime2(8)
character(len=40) :: infname,outfname,comm
include 'common.f95'
open (unit=1,file='in',status='OLD') ! opens file with parameters
read (1,1) ! comment line
read (1,1) outfname
read (1,*) Elow,Ehigh ! lower and higher energy
read (1,*) Eminf,Emind,Emaxf ! min and max energy for fedd
read (1,*) Rin, Rout ! inner and outer radii
read (1,*) profpar(1) ! for Ne
read (1,*) profpar(2) ! Te in keV
read (1,*) profpar(3) ! for absorption+emission
read (1,*) profpar(4) ! T_bb for neutron star in keV
read (1,*) profpar(5) ! for bulk velocity
read (1,*) profpar(6) ! other parameter for model
read (1,*) profpar(10) ! magnetic moment in 10^27 CGS
1 format (A10)
close (1)
The compiler is probably seeing an END xxx statement in the file common.f95. The file common.f95 is possibly not meant to be used as an INCLUDE file - it may be a program unit in its own right.
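To illustrate the mechanism (this is a guess at what common.f95 might contain, not the actual file): if the included file ends with an END statement, that END closes the main program that did the INCLUDE, and the OPEN/READ statements following the INCLUDE then begin a second main program, which is exactly what the error reports.
! Hypothetical common.f95 that would reproduce the error
real(8) :: Eminf, Emind, Emaxf, Rin, Rout, profpar(10)
common /params/ Eminf, Emind, Emaxf, Rin, Rout, profpar
end   ! <- this END terminates the including (unnamed) main program;
      !    everything after the INCLUDE line then starts a second main program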