HDF5 -> H5Ldelete, H5F_FSPACE_STRATEGY_FSM_AGGR and free space - c++

I am in quite a pickle trying to understand the space management of HDF5 files.
I have written code that creates a file containing a group, which in turn contains a series of datasets. The user can decide to remove one or several datasets after they have been added to the file.
This is how I create my HDF5 file (mixing the C property-list API with the C++ file API):
hid_t fcpl = H5Pcreate(H5P_FILE_CREATE);
hsize_t fsm_size = 0; // free-space section threshold
// FSM_AGGR strategy; free-space tracking is NOT persisted (persist = 0)
H5Pset_file_space_strategy(fcpl, H5F_FSPACE_STRATEGY_FSM_AGGR, 0, fsm_size);
hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
H5Pset_libver_bounds(fapl, H5F_LIBVER_V110, H5F_LIBVER_LATEST);
H5::H5File data_file(filename, H5F_ACC_TRUNC, fcpl, fapl);
Then, I create a group with:
H5::Group data_group = data_file.createGroup("MyData");
Finally, I add a series of datasets (holding identical data in this simple case) to this group using:
std::vector<float> array = {1, 2, 3, 4, 5, 6};
for (uint8_t idx = 0; idx < 10; idx++)
{
std::string name = std::string("mydata_") + std::to_string(idx);
hsize_t dims_pol[1] = {array.size()};
H5::DataSpace dataspace_x(1, dims_pol);
H5::FloatType datatype_x(H5::PredType::IEEE_F32LE); // FloatType: IEEE_F32LE is a floating-point type
H5::DataSet dataset_x;
dataset_x = data_group.createDataSet(name.c_str(), datatype_x, dataspace_x);
dataset_x.write(array.data(), H5::PredType::IEEE_F32LE);
dataset_x.close();
}
Doing so, I do get a file filled with the correct data, as expected. The superblock of the file (correct, as far as my limited understanding goes) is:
SUPER_BLOCK {
   SUPERBLOCK_VERSION 3
   FREELIST_VERSION 0
   SYMBOLTABLE_VERSION 0
   OBJECTHEADER_VERSION 0
   OFFSET_SIZE 8
   LENGTH_SIZE 8
   BTREE_RANK 16
   BTREE_LEAF 4
   ISTORE_K 32
   FILE_SPACE_STRATEGY H5F_FSPACE_STRATEGY_FSM_AGGR
   FREE_SPACE_PERSIST FALSE
   FREE_SPACE_SECTION_THRESHOLD 0
   FILE_SPACE_PAGE_SIZE 4096
   USER_BLOCK {
      USERBLOCK_SIZE 0
   }
}
The problem arises when I try to delete the datasets. To do that (while the file is still open), I use the following procedure:
for (uint8_t idx = 0; idx < 10; idx++)
{
std::string name = std::string("mydata_") + std::to_string(idx);
H5Ldelete(data_group.getId(), name.c_str(), H5P_DEFAULT);
}
data_file.close();
From my understanding, the space should be freed when the file is closed. However, even though a viewer shows no remaining reference to the data when the file is reopened, the file size does not decrease.
I also tried unlinking the datasets instead; that does not work either.
Any help would be greatly appreciated...
Thanks
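For the record, the FREE_SPACE_PERSIST FALSE line in the superblock above means the free-space manager's records are discarded when the file is closed, so later sessions cannot reuse the freed blocks; and HDF5 never shrinks a file in place when objects are deleted. Below is a minimal sketch of creating the file with free-space persistence enabled instead (the persist and threshold arguments follow the H5Pset_file_space_strategy signature; a threshold of 1 is an assumption), plus the usual way to actually reclaim space on disk:

hid_t fcpl = H5Pcreate(H5P_FILE_CREATE);
// persist = 1: keep free-space tracking info in the file across sessions,
// so blocks freed by H5Ldelete can be reused by later writes.
H5Pset_file_space_strategy(fcpl, H5F_FSPACE_STRATEGY_FSM_AGGR, 1, 1);
hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
H5Pset_libver_bounds(fapl, H5F_LIBVER_V110, H5F_LIBVER_LATEST);
H5::H5File data_file(filename, H5F_ACC_TRUNC, fcpl, fapl);

// Even then, the file does not shrink in place; to reduce its size on disk,
// rewrite it with the command-line tool: h5repack bloated.h5 compact.h5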

Related

A problem in saving "long double" variables to hdf5 file

I'm new to HDF5. I'm trying to save the results of a simulation in an HDF5 file. The variables are long double, so I mapped them to NATIVE_LDOUBLE. However, the saved values are completely wrong (fluctuating between very small and very large values).
When I save them with NATIVE_DOUBLE, everything is fine, but I need to save long double.
My question is how to properly save long double variables, and moreover quadruple-precision variables?
I deeply appreciate your help. Examples are also appreciated.
here is the code
void createHDF5_2DProjectionFile(char* file_name,
CarGrid1D3V<long double>& ph_grid,
std::string first_dim,
std::string second_dim,
long double *x1, int size_x1,
long double *x2, int size_x2)
{
try
{
/* define the size of the datasets containing the coordinates x1
and x2
*/
PredType h5Int = PredType::NATIVE_INT;
PredType h5DoubleL = PredType::NATIVE_LDOUBLE;
PredType h5Double = PredType::NATIVE_DOUBLE;
/* Define the parameters of grid space
DS --> Data Space
*/
hsize_t x1_dims[1], x2_dims[1];
x1_dims[0] = size_x1;
x2_dims[0] = size_x2;
H5File *file_id = new H5File(file_name, H5F_ACC_TRUNC);
/* Saving string attribute
Create dataspace with H5S_SCALAR
Create string datatype of specific length of characters
Create attribute and write to it
*/
DataSpace attr_stringDS = DataSpace(H5S_SCALAR);
StrType strdatatype(PredType::C_S1, 64);
Attribute original_DistFun = file_id->createAttribute("/OriginalDistFun",
strdatatype, attr_stringDS);
original_DistFun.write(strdatatype, "1D3V");
Attribute projection = file_id->createAttribute("/Projection",
strdatatype, attr_stringDS);
projection.write(strdatatype, first_dim + " - " + second_dim);
/* Create the data spaces for grid points along each direction */
DataSpace* first_dimDS_id = new DataSpace(1, x1_dims, NULL);
DataSpace* second_dimDS_id = new DataSpace(1, x2_dims, NULL);
/* Create and fill the datasets for grid points along each direction */
DataSet *data_dim1 = new DataSet(file_id->createDataSet(first_dim,
h5DoubleL, *first_dimDS_id));
data_dim1->write(x1, h5DoubleL);
DataSet *data_dim2 = new DataSet(file_id->createDataSet(second_dim,
h5DoubleL, *second_dimDS_id));
data_dim2->write(x2, h5DoubleL);
/* Important attributes added to the file */
long double x_minmax[2], px_minmax[2],
py_minmax[2], pz_minmax[2], mom_steps[3],
ph_vols[3], spatial_steps[1];
x_minmax[0] = ph_grid.x_min_;
x_minmax[1] = ph_grid.x_max_;
px_minmax[0] = ph_grid.px_min_;
px_minmax[1] = ph_grid.px_max_;
py_minmax[0] = ph_grid.py_min_;
py_minmax[1] = ph_grid.py_max_;
pz_minmax[0] = ph_grid.pz_min_;
pz_minmax[1] = ph_grid.pz_max_;
mom_steps[0] = ph_grid.dpx_;
mom_steps[1] = ph_grid.dpy_;
mom_steps[2] = ph_grid.dpz_;
ph_vols[0] = ph_grid.dvs_;
ph_vols[1] = ph_grid.dvp_;
ph_vols[2] = ph_grid.dv_;
spatial_steps[0] = ph_grid.dx_;
ph_grid.print_characteristics();
std::cout << x_minmax[0] << " , " << x_minmax[1] << "\n";
/* define attributes configuration */
hsize_t space_1[1];
space_1[0] = 1;
hsize_t space_2[1];
space_2[0] = 2;
hsize_t space_3[1];
space_3[0] = 3;
DataSpace attr_space_1 = DataSpace(1, space_1);
DataSpace attr_space_2 = DataSpace(1, space_2);
DataSpace attr_space_3 = DataSpace(1, space_3);
Attribute x_interval = file_id->createAttribute("[x_min,x_max]",
h5DoubleL, attr_space_2);
x_interval.write(h5DoubleL, x_minmax);
Attribute px_interval = file_id->createAttribute("[px_min,px_max]",
h5DoubleL, attr_space_2);
px_interval.write(h5DoubleL, px_minmax);
Attribute py_interval = file_id->createAttribute("[py_min,py_max]",
h5DoubleL, attr_space_2);
py_interval.write(h5DoubleL, py_minmax);
Attribute pz_interval = file_id->createAttribute("[pz_min,pz_max]",
h5DoubleL, attr_space_2);
pz_interval.write(h5DoubleL, pz_minmax);
Attribute MomVolumes = file_id->createAttribute("[dpx,dpy,dpz]",
h5DoubleL, attr_space_3);
MomVolumes.write(h5DoubleL, mom_steps);
Attribute PhVolumes = file_id->createAttribute("[dv_s, dv_m, dv_t]",
h5DoubleL, attr_space_3);
PhVolumes.write(h5DoubleL, ph_vols);
Attribute SpatialVolumes = file_id->createAttribute("[dx]", h5DoubleL,
attr_space_1);
SpatialVolumes.write(h5DoubleL, spatial_steps);
/* Free memory */
delete data_dim1;
delete data_dim2;
delete first_dimDS_id;
delete second_dimDS_id;
delete file_id;
}
catch(DataSetIException &error)
{
error.printErrorStack();
}
catch(DataSpaceIException &error)
{
error.printErrorStack();
}
catch(FileIException &error)
{
error.printErrorStack();
}
}
Update
A great discussion with detailed explanations is available on the HDF5 forum, where I posted the same question:
https://forum.hdfgroup.org/t/a-problem-when-saving-native-ldouble-variables/9504
Also, Steven Varga provided examples answering this question on his GitHub, built around a user-defined datatype (see this link).
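As a quick sanity check that the HDF5 library itself round-trips long double, here is a minimal, self-contained sketch (file and dataset names are invented for the example). If the values read back equal the values written, the wild numbers are more likely a display or conversion problem in the viewing tool than a write error:

#include "H5Cpp.h"
#include <iostream>

int main()
{
    long double out[3] = {1.0L / 3.0L, 2.5L, -1e-20L};
    long double in[3] = {};
    hsize_t dims[1] = {3};
    // Use the in-memory long double type for both the file and memory sides.
    H5::H5File file("ldouble_check.h5", H5F_ACC_TRUNC);
    H5::DataSpace space(1, dims);
    H5::DataSet ds = file.createDataSet("vals", H5::PredType::NATIVE_LDOUBLE, space);
    ds.write(out, H5::PredType::NATIVE_LDOUBLE);
    // Read back and compare bit-for-bit.
    ds.read(in, H5::PredType::NATIVE_LDOUBLE);
    for (int i = 0; i < 3; i++)
        std::cout << (in[i] == out[i] ? "ok " : "MISMATCH ") << double(in[i]) << "\n";
    return 0;
}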

How to correctly extend .text section in CLR compatible PE file

I want to correctly extend .text section in PE file according to a File Alignment or Section Alignment.
Details:
Let's start from scratch.
I have .net v4.8 project with several small classes like
public class A1
{
public int IntProperty {get;set;}
}
public class A2
{
public string StringProperty {get;set;}
}
It doesn't really matter what these classes are.
After compiling with MSBuild v16 I have a PE file (exe or dll, it doesn't matter either).
Then I expand the .text section by one file alignment or one section alignment (I tried both).
If the compiled PE file has a 512, 1024 or 2048 byte alignment, that works fine.
But when it has 4096 or 8192 bytes, it doesn't work.
This is the code example, how I do it:
BYTE* dos_header_ptr = reinterpret_cast<BYTE*>(dosHeader2);
IMAGE_NT_HEADERS64* nt_header2 = reinterpret_cast<IMAGE_NT_HEADERS64*>(dos_header_ptr + dosHeader2->e_lfanew);
const IMAGE_FILE_HEADER* file_header = &nt_header2->FileHeader;
IMAGE_OPTIONAL_HEADER64* optional_headers2 = &nt_header2->OptionalHeader;
DWORD file_alignment = optional_headers2->FileAlignment;
file_alignment = optional_headers2->SectionAlignment; // I tried both FileAlignment and SectionAlignment
std::vector<IMAGE_SECTION_HEADER*> sections = ExtractSections(nt_header2);
IMAGE_SECTION_HEADER* section_header_text2 = sections[0];
IMAGE_SECTION_HEADER* section_header_rsrc2 = sections[1];
file_info* old_file_info = file_info2;
file_info2 = RelocateBufferWithDisplacement(old_file_info, file_alignment, section_header_text2);
// Update pointers and VA
dosHeader2 = reinterpret_cast<IMAGE_DOS_HEADER*>(file_info2->buffer);
dos_header_ptr = reinterpret_cast<BYTE*>(dosHeader2);
nt_header2 = reinterpret_cast<IMAGE_NT_HEADERS64*>(dos_header_ptr + dosHeader2->e_lfanew);
file_header = &nt_header2->FileHeader;
sections.clear();
sections = ExtractSections(nt_header2);
section_header_text2 = sections[0];
section_header_rsrc2 = sections[1];
section_header_text2->SizeOfRawData += file_alignment;
section_header_text2->Misc.VirtualSize = section_header_text2->SizeOfRawData;
for(int i=1; i<sections.size(); i++)
sections[i]->PointerToRawData += file_alignment;
optional_headers2 = &nt_header2->OptionalHeader;
optional_headers2->SizeOfCode += file_alignment;
delete old_file_info;
FlushToFile(input_params.file_name_output, file_info2);
RelocateBufferWithDisplacement function
file_info* RelocateBufferWithDisplacement(file_info* file_info2, DWORD file_alignment, IMAGE_SECTION_HEADER* text_section_header)
{
// COPY BUFFERS
DWORD new_file_size = file_info2->file_size + file_alignment;
byte* new_buffer = new byte[new_file_size];
DWORD first_part_of_file_size = text_section_header->PointerToRawData + text_section_header->SizeOfRawData;
byte* end_first_part_old = file_info2->buffer + first_part_of_file_size;
byte* start_free_zone_of_file_ptr_new = new_buffer + first_part_of_file_size;
byte* second_part_of_file_ptr_new = start_free_zone_of_file_ptr_new + file_alignment;
std::copy(file_info2->buffer, end_first_part_old, new_buffer);
std::fill(start_free_zone_of_file_ptr_new, second_part_of_file_ptr_new, 0);
std::copy(end_first_part_old, file_info2->buffer + file_info2->file_size, second_part_of_file_ptr_new);
return new file_info(new_file_size, new_buffer);
}
ExtractSections method
std::vector<IMAGE_SECTION_HEADER*> ExtractSections(IMAGE_NT_HEADERS64* nt_headers)
{
std::vector<IMAGE_SECTION_HEADER*> result;
// Section headers start immediately after the NT headers.
for(int i=0; i < nt_headers->FileHeader.NumberOfSections; i++)
result.push_back(reinterpret_cast<IMAGE_SECTION_HEADER*>(reinterpret_cast<BYTE*>(nt_headers) + sizeof(IMAGE_NT_HEADERS64) + i * sizeof(IMAGE_SECTION_HEADER)));
return result; // by value, instead of leaking a heap-allocated vector
}
This works the same way for both x86 and x64 files, but only the x64 version is considered here for simplicity.
I also noticed:
If I use a larger project (e.g. up to 30000 lines of code) I can't do it even for a 2048 file alignment, only for 512 and 1024. But if the project is almost empty, only 8192 fails.
I expect that something else should also be updated, but I don't know what exactly.
So, what am I doing wrong?
Thanks.
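One hedged guess at the missing piece: the code above grows the VirtualSize of .text but never shifts the VirtualAddress of the sections that follow, and never recomputes SizeOfImage. While the growth still fits in the slack up to the next SectionAlignment boundary nothing overlaps, which would explain why small alignments (and small projects) work while large ones fail. A sketch of the extra bookkeeping, reusing the question's variable names (AlignUp is a made-up helper):

static DWORD AlignUp(DWORD value, DWORD alignment)
{
    return (value + alignment - 1) & ~(alignment - 1);
}

// Capture the old virtual span before growing the section.
DWORD old_virtual_size = section_header_text2->Misc.VirtualSize;
section_header_text2->SizeOfRawData += file_alignment;
section_header_text2->Misc.VirtualSize = section_header_text2->SizeOfRawData;

DWORD sa = optional_headers2->SectionAlignment;
DWORD virtual_shift = AlignUp(section_header_text2->Misc.VirtualSize, sa)
                    - AlignUp(old_virtual_size, sa); // 0 while the growth fits in the slack
for (size_t i = 1; i < sections.size(); i++)
{
    sections[i]->PointerToRawData += file_alignment; // file offset, as before
    sections[i]->VirtualAddress += virtual_shift;    // and the virtual address too
}
IMAGE_SECTION_HEADER* last = sections.back();
optional_headers2->SizeOfImage = AlignUp(last->VirtualAddress + last->Misc.VirtualSize, sa);
// Any data-directory RVA (resources, relocations, the CLR/COR20 header, ...)
// that points beyond the end of .text would need the same virtual_shift applied,
// as would RVAs inside the .NET metadata itself.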

Overlap hundreds of histograms macro question

I have a directory trial which contains hundreds of histogram files plus a macro. The files are named like hists09876_blinded.root or hists12365_blinded.root, but the numbering is not contiguous; some files are missing, e.g. hists10467_blinded.root, hists10468_blinded.root, hists10470_blinded.root. The ultimate goal is to get one canvas with all of those histograms drawn together. The tricky thing is that each hists*****_blinded.root holds around 15 1D histograms, and I need to pull out just one from each, called sc*****.
I have two ideas, but I guess I should combine them to get the final result.
My first idea was to open the files one by one, but since some numbers are missing from the sequence, that does not work well.
void overlap()
{
TCanvas *time = new TCanvas("c1", "overlap", 0, 0, 800, 600);
const char* histoname = "sc";
const int NFiles = 256;
for (int fileNumber = 9675; fileNumber < NFiles; fileNumber++) // 09675 would be an invalid octal literal in C++
{
TFile* myFile = TFile::Open(Form("hists%i_blinded.root", fileNumber));
if (!myFile)
{
printf("Nope, no such file!\n");
return;
}
TH1* h1 = (TH1*)myFile->Get(histoname);
if (!h1)
{
printf("Nope, no such histogram!\n");
return;
}
h1->SetDirectory(gROOT);
h1->Draw("same");
myFile->Close();
}
}
After having read multiple posts on pretty much the same question (1, 2, and this one), I figured out what was wrong with my first attempt: I did not know the file name may contain a leading zero when the number in it is < 10000. I had also failed to understand that the asterisks in the histogram name sc***** hide the same number as in the file name; I thought they were something completely different. So I suggest constructing the file name and the histogram name in the same loop:
void overlap_v2()
{
TCanvas *time = new TCanvas("c1", "overlap", 0, 0, 800, 600);
const int firstNumber = 9675;
const int NFiles = 100000;
for (int fileNumber = firstNumber; fileNumber < firstNumber+NFiles; fileNumber++)
{
const char* filename = Form("trial/hists%05i_blinded.root", fileNumber);
TFile* myFile = TFile::Open(filename);
if (!myFile)
{
printf("Can not find a file named \"%s\"!\n", filename);
continue;
}
const char* histoname = Form("sc%05i", fileNumber);
TH1* h1 = (TH1*)myFile->Get(histoname);
if (!h1)
{
printf("Can not find a histogram named \"%s\" in the file named \"%s\"!\n", histoname, filename);
continue;
}
h1->SetDirectory(gROOT);
h1->Draw("same");
myFile->Close();
}
}
Since some files are expected to be "missing", I suggest not trying to guess the names of the files that actually exist. Instead, use a function that lists all files in a given directory, and filter that list for the files matching the pattern you want to read. See for example these links on how to read the contents of a directory in C++:
How can I get the list of files in a directory using C or C++?
http://www.martinbroadhurst.com/list-the-files-in-a-directory-in-c.html
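A sketch of that approach, assuming C++17's std::filesystem is available to your ROOT macro/compiler (the pattern check is a simple prefix/suffix test, not a full regex):

#include <filesystem>
#include <string>
#include "TCanvas.h"
#include "TFile.h"
#include "TH1.h"
#include "TROOT.h"

void overlap_v3()
{
    TCanvas* time = new TCanvas("c1", "overlap", 0, 0, 800, 600);
    for (const auto& entry : std::filesystem::directory_iterator("trial"))
    {
        const std::string name = entry.path().filename().string();
        // Keep only names shaped like hists#####_blinded.root (5 + 5 + 13 = 23 chars).
        if (name.rfind("hists", 0) != 0 || name.size() != 23 ||
            name.substr(10) != "_blinded.root")
            continue;
        const std::string number = name.substr(5, 5); // the 5-digit run number
        TFile* myFile = TFile::Open(entry.path().string().c_str());
        if (!myFile)
            continue;
        TH1* h1 = (TH1*)myFile->Get(("sc" + number).c_str());
        if (h1)
        {
            h1->SetDirectory(gROOT); // keep it alive after the file is closed
            h1->Draw("same");
        }
        myFile->Close();
    }
}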

How to read a HDF5 scalar attribute whose value is an array of variable length char* (i.e. c_strings?)

I have successfully created a scalar-valued attribute whose value is a variable-length array of const char*. However, I do not understand how to read this attribute back!
This is the code I used to create the attribute:
void create_attribute_with_vector_of_strings_as_value()
{
using namespace H5;
// Create some test strings.
std::vector<std::string> strings;
for (int iii = 0; iii < 10; iii++)
{
strings.push_back("this is " + boost::lexical_cast<std::string>(iii));
}
// Part 1: grab pointers to the chars
std::vector<const char*> chars;
for (auto si = strings.begin(); si != strings.end(); ++si)
{
std::string &s = (*si);
chars.push_back(s.c_str());
}
BOOST_TEST_MESSAGE("Size of char* array is: " << chars.size());
// Part 2: create the variable length type
hvl_t hdf_buffer;
hdf_buffer.p = chars.data();
hdf_buffer.len = chars.size();
// Part 3: create the type
auto s_type = H5::StrType(H5::PredType::C_S1, H5T_VARIABLE);
auto svec_type = H5::VarLenType(&s_type);
try
{
// Open an existing file and dataset.
H5File file(m_file_name.c_str(), H5F_ACC_RDWR);
// Part 4: write the output to a scalar attribute
DataSet dataset = file.openDataSet(m_dataset_name.c_str());
std::string filter_names = "multi_filters";
Attribute attribute = dataset.createAttribute( filter_names.c_str(), svec_type, H5S_SCALAR);
attribute.write(svec_type, &hdf_buffer);
file.close();
}
catch (Exception &error)
{
error.printErrorStack();
}
}
Here is the dataset with attribute as seen from h5dump:
HDF5 "d:\tmp\hdf5_tutorial\h5tutr_dset.h5" {
GROUP "/" {
DATASET "dset" {
DATATYPE H5T_STD_I32BE
DATASPACE SIMPLE { ( 4, 6 ) / ( 4, 6 ) }
DATA {
(0,0): 1, 7, 13, 19, 25, 31,
(1,0): 2, 8, 14, 20, 26, 32,
(2,0): 3, 9, 15, 21, 27, 33,
(3,0): 4, 10, 16, 22, 28, 34
}
ATTRIBUTE "multi_filters" {
DATATYPE H5T_VLEN { H5T_STRING {
STRSIZE H5T_VARIABLE;
STRPAD H5T_STR_NULLTERM;
CSET H5T_CSET_ASCII;
CTYPE H5T_C_S1;
}}
DATASPACE SCALAR
DATA {
(0): ("this is 0", "this is 1", "this is 2", "this is 3", "this is 4", "this is 5", "this is 6", "this is 7", "this is 8", "this is 9")
}
}
}
}
}
I do not understand how to read this data back. The code I've experimented with so far is below. It compiles, but I've hardwired the array size to the known length, and the variable-length C strings come back empty. Does anyone have suggestions as to where I'm going wrong? In particular, how do I query the length of the array of const char*, and how do I read the actual C strings it contains?
void read_attribute_with_vector_of_strings_as_value()
{
using namespace H5;
std::vector<std::string> strings;
try
{
// Open an existing file and dataset readonly
H5File file(m_file_name.c_str(), H5F_ACC_RDONLY);
// Part 4: Open the dataset
DataSet dataset = file.openDataSet(m_dataset_name.c_str());
// Attribute name
std::string filter_names = "multi_filters";
Attribute attribute = dataset.openAttribute(filter_names.c_str());
size_t sz = attribute.getInMemDataSize();
size_t sz_1 = attribute.getStorageSize();
auto t1 = attribute.getDataType();
VarLenType t2 = attribute.getVarLenType();
H5T_class_t type_class = attribute.getTypeClass();
if (type_class == H5T_STRING)
BOOST_TEST_MESSAGE("H5T_STRING");
int length = 10;
std::vector<char*> tmp_vec(length);
auto s_type = H5::StrType(H5::PredType::C_S1, H5T_VARIABLE);
auto svec_type = H5::VarLenType(&s_type);
hvl_t hdf_buffer;
hdf_buffer.p = tmp_vec.data();
hdf_buffer.len = length;
attribute.read(svec_type, &hdf_buffer);
//attribute.read(s_type, &hdf_buffer);
//attribute.read(tmp_vec.data(), s_type);
for(size_t x = 0; x < tmp_vec.size(); ++x)
{
fprintf(stdout, "GOT STRING [%s]\n", tmp_vec[x]);
strings.push_back(tmp_vec[x]); // strings starts empty, so append rather than index
}
file.close();
}
catch (Exception &error)
{
error.printErrorStack();
}
}
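The usual pattern for reading variable-length data is the reverse of what the code above does: leave the hvl_t buffer empty and let the library allocate it, then read len and p back out and release the memory HDF5 allocated. A hedged sketch of the read step (the reclaim goes through the C call H5Dvlen_reclaim, since the C++ wrapper does not obviously expose one):

// Let HDF5 allocate the array of C strings; do not pre-size anything.
hvl_t read_buffer = {};
auto s_type = H5::StrType(H5::PredType::C_S1, H5T_VARIABLE);
auto svec_type = H5::VarLenType(&s_type);
attribute.read(svec_type, &read_buffer);

// read_buffer.len answers "how long is the array?",
// and read_buffer.p points at the char* entries.
char** c_strings = static_cast<char**>(read_buffer.p);
std::vector<std::string> result(c_strings, c_strings + read_buffer.len);

// Hand the library-allocated strings and buffer back to HDF5.
H5::DataSpace scalar_space(H5S_SCALAR);
H5Dvlen_reclaim(svec_type.getId(), scalar_space.getId(), H5P_DEFAULT, &read_buffer);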
If you are not required to use specific technologies, you may consider HDFql (http://www.hdfql.com), a high-level language to manage HDF5 files easily (think SQL). It relieves you of the low-level details of manipulating HDF5 files described above. Using HDFql in C++, reading an array of variable-length char is done like this:
// include HDFql C++ header file (make sure it can be found by the C++ compiler)
#include <iostream>
#include "HDFql.hpp"
int main(int argc, char *argv[])
{
// create an HDF file named "example.h5" and use (i.e. open) it
HDFql::execute("CREATE AND USE FILE example.h5");
// create an attribute named "multi_filters" of type varchar of one dimension (size 5)
HDFql::execute("CREATE ATTRIBUTE multi_filters AS VARCHAR(5)");
// insert (i.e. write) values "Red", "Green", "Blue", "Orange" and "Yellow" into attribute "multi_filters"
HDFql::execute("INSERT INTO multi_filters VALUES(Red, Green, Blue, Orange, Yellow)");
// select (i.e. read) attribute "multi_filters" into HDFql default cursor
HDFql::execute("SELECT FROM multi_filters");
// display content of HDFql default cursor
while(HDFql::cursorNext() == HDFql::Success)
{
std::cout << "Color " << HDFql::cursorGetChar() << " has a size of " << HDFql::cursorGetSize() << std::endl;
}
return 0;
}

Writing dataset of type H5T_ARRAY

I'm trying to write data in HDF5 using the C++ API.
I work on Windows XP 64-bit with Visual Studio 2010, using HDF5 1.8.9.
The target is set to x86, so I had to use the 32-bit version of HDF5 (to be honest, I'm very new to programming on Windows and VS and didn't configure the whole thing myself, so I'm really not sure it was the right choice).
My issue happens when trying to write a part of a dataset of type H5T_ARRAY.
The HDF5 file structure I want to achieve is a dataset of 4 dimensions (i1,i2,i3,i4), whose datatype is an array of double with 2 dimensions (a1,a2).
Here is the DDL to sum it up:
HDF5 "result.h5" {
GROUP "/" {
DATASET "mydata" {
DATATYPE H5T_ARRAY { [a1][a2] H5T_IEEE_F64LE }
DATASPACE SIMPLE { ( i1,i2,i3,i4) / ( i1,i2,i3,i4 ) }
DATA { <my data> }
}
}
Due to my program structure, I write this dataset element by element, i.e. one H5T_ARRAY value at a time.
I've defined a class OutputFile to manage all the HDF5 I/O. It contains these attributes :
H5::H5File *_H5fileHandle ; // HDF5 file
H5::DataSpace *_dataspaceHandle ; // Handle of the Dataspace of the datasets
int _dataspaceRank ; // Rank of the dataspace
H5::ArrayType *_datatypeHandle ; // Handle of the datatype of the datasets (= array of N dimensions)
int _datatypeRank ; // Rank of the datatype
H5::DataSet *_datasetHandle ; // Handle of the dataset
The file is opened right at the beginning of the program, and all the handles (dataspace, datatype and dataset) are set up then:
void OutputFile ::createFile(std::string filename,
std::vector<int> dsdims,
std::vector<int> adims,
std::vector<std::string> datasetName) {
_filename = filename ;
_H5fileHandle = new H5::H5File(_filename.c_str(), H5F_ACC_TRUNC);
// Defining the dataspace
_dataspaceRank = dsdims.size() ;
hsize_t *h5dsdims = new hsize_t[_dataspaceRank] ;
for (int iDim=0 ; iDim < _dataspaceRank ; iDim++) h5dsdims[iDim] = hsize_t(dsdims[iDim]) ;
_dataspaceHandle = new H5::DataSpace(_dataspaceRank, h5dsdims, NULL);
// Defining the datatype = array type
_datatypeRank = adims.size() ;
hsize_t *h5adims = new hsize_t[_datatypeRank] ;
for (int iDim=0 ; iDim < _datatypeRank ; iDim++) h5adims[iDim] = hsize_t(adims[iDim]) ;
_datatypeHandle = new H5::ArrayType(H5::PredType::IEEE_F64LE, _datatypeRank, h5adims);
// Creating the dataset
_datasetHandle = _H5fileHandle->createDataSet( _datasetName.c_str(),*_datatypeHandle, *_dataspaceHandle );
// Clean up
delete[] h5dsdims ;
delete[] h5adims ;
}
Then, I write the data each time I get an element ready (i.e a H5T_ARRAY) :
void OutputFile::writeMyData(double **Values, int *positionInDataset) {
// set the element position
hsize_t position[1][4] ;
position[0][0] = hsize_t(positionInDataset[0]);
position[0][1] = hsize_t(positionInDataset[1]);
position[0][2] = hsize_t(positionInDataset[2]);
position[0][3] = hsize_t(positionInDataset[3]);
_dataspaceHandle->selectElements( H5S_SELECT_SET, 1, (const hsize_t *)position);
//Set the memory dataspace
hsize_t memdims[] = {1} ;
H5::DataSpace memspace(1, memdims, NULL);
// set the memory datatype
hsize_t memTypeRank = 2 ;
hsize_t *memTypedims = new hsize_t[memTypeRank] ;
for (int iDim=0 ; iDim < memTypeRank ; iDim++) memTypedims[iDim] = hsize_t(dataDims[iDim]) ;
H5::ArrayType memtypeHandle(H5::PredType::IEEE_F64LE, memTypeRank, memTypedims);
_datasetHandle->write(Values, memtypeHandle, memspace, *_dataspaceHandle);
_H5fileHandle->flush(H5F_SCOPE_GLOBAL) ;
}
The Values argument is allocated in the calling function, with size [a1][a2].
Unfortunately, it doesn't work properly: I get invalid data in my HDF5 file, and all elements are equal (meaning all the H5T_ARRAY values contain the same garbage).
Example:
(0,0,0,0): [ 5.08271e-275, 5.08517e-275, -7.84591e+298, -2.53017e-098, 0, 2.18992e-303,
5.08094e-275, 0, 2.122e-314, -7.84591e+298, 5.08301e-275, 5.08652e-275,
-7.84591e+298, -2.53017e-098, 0, 2.18994e-303, 5.08116e-275, 0,
2.122e-314, -7.84591e+298, 5.08332e-275, 5.08683e-275, -7.84591e+298, -2.53017e-098,
0, 2.18995e-303, 5.08138e-275, 0, 2.122e-314, -7.84591e+298 ],
... and so on for every element.
For now, I have:
checked that the content of the Values array in writeMyData() is correct and contains valid data;
checked that, if I write only one element, then that element, and only that one, contains invalid data in the HDF5 file (the others contain only zeroes);
tried these additional type combinations, without success:
memType = NATIVE_DOUBLE, fileType = IEEE_F64LE
memType = NATIVE_DOUBLE, fileType = NATIVE_DOUBLE
memType = IEEE_F32LE, fileType = IEEE_F32LE
checked that double-valued attributes are written correctly, using the type IEEE_F64LE;
tried to close the file at the end of writeMyData() and reopen it at the beginning, to force the data to be written to disk (the results are the same);
passed &Values instead of Values in the call to DataSet::write() (the results are the same).
I'm a bit at my wits' end. I've found examples of partial I/O on a dataset, and others of array datatypes, but nothing about partially writing array-type datasets.
I guess it's a memory issue; my feeling is that I'm doing something wrong when passing the Values array to DataSet::write(), but I can't pinpoint the problem.
Thanks in advance for any pointers you have.
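A hedged guess that fits these symptoms: DataSet::write() expects each element's values in one contiguous block of memory, but a double** allocated row by row is an array of pointers to scattered rows, so the write picks up the pointer bit patterns themselves, which would look exactly like the nonsense doubles above. A sketch of flattening the element before the write, reusing the names from writeMyData() (a1 and a2 stand for the array-type dimensions):

// Copy the a1 x a2 element into contiguous storage.
std::vector<double> flat(a1 * a2);
for (int r = 0; r < a1; r++)
    for (int c = 0; c < a2; c++)
        flat[r * a2 + c] = Values[r][c];

// Write the single flattened element to the selected point in the file.
_datasetHandle->write(flat.data(), memtypeHandle, memspace, *_dataspaceHandle);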