I work on Fortran code that uses the HDF5 libraries for dumping output data. I have been running the code on a supercomputer without any issues. Recently, I tried the code on our local cluster, which has its own HDF5 installation. The code runs fine, except that the output part throws the following error.
I have searched for this issue on the Internet quite extensively, but most of the solutions I found are highly specific to the codes involved. I believe the error is due to some fundamental issue. Can someone explain why this error comes up?
15:34:05 - Dumping OD...
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5F.c line 522 in H5Fcreate(): unable to create file
major: File accessibilty
minor: Unable to open file
#001: H5Fint.c line 992 in H5F_open(): unable to open file: time = Mon Dec 12 15:34:05 2016
, name = './Production/od_out_t00000010-0067858932.h5', tent_flags = 13
major: File accessibilty
minor: Unable to open file
#002: H5FD.c line 993 in H5FD_open(): open failed
major: Virtual File Layer
minor: Unable to initialize object
#003: H5FDmpio.c line 1059 in H5FD_mpio_open(): MPI_File_open failed
major: Internal error (too specific to document in detail)
minor: Some MPI function failed
#004: H5FDmpio.c line 1059 in H5FD_mpio_open(): MPI_ERR_FILE: invalid file
major: Internal error (too specific to document in detail)
minor: MPI Error String
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5D.c line 165 in H5Dcreate2(): not a location ID
major: Invalid arguments to routine
minor: Inappropriate type
#001: H5Gloc.c line 253 in H5G_loc(): invalid object ID
major: Invalid arguments to routine
minor: Bad value
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5D.c line 460 in H5Dget_space(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5Dio.c line 228 in H5Dwrite(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5S.c line 392 in H5Sclose(): not a dataspace
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5D.c line 415 in H5Dclose(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5F.c line 774 in H5Fclose(): not a file ID
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5F.c line 604 in H5Fopen(): unable to open file
major: File accessibilty
minor: Unable to open file
#001: H5Fint.c line 992 in H5F_open(): unable to open file: time = Mon Dec 12 15:34:05 2016
, name = './Production/od_out_t00000010-0067858932.h5', tent_flags = 1
major: File accessibilty
minor: Unable to open file
#002: H5FD.c line 993 in H5FD_open(): open failed
major: Virtual File Layer
minor: Unable to initialize object
#003: H5FDsec2.c line 339 in H5FD_sec2_open(): unable to open file: name = './Production/od_out_t00000010-0067858932.h5', errno = 2, error message = 'No such file or directory', flags = 1, o_flags = 2
major: File accessibilty
minor: Unable to open file
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5D.c line 340 in H5Dopen2(): not a location
major: Invalid arguments to routine
minor: Inappropriate type
#001: H5Gloc.c line 253 in H5G_loc(): invalid object ID
major: Invalid arguments to routine
minor: Bad value
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5A.c line 247 in H5Acreate2(): not a location
major: Invalid arguments to routine
minor: Inappropriate type
#001: H5Gloc.c line 253 in H5G_loc(): invalid object ID
major: Invalid arguments to routine
minor: Bad value
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5A.c line 591 in H5Awrite(): not an attribute
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5A.c line 1602 in H5Aclose(): not an attribute
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5D.c line 415 in H5Dclose(): not a dataset
major: Invalid arguments to routine
minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.17) MPI-process 0:
#000: H5F.c line 774 in H5Fclose(): not a file ID
major: Invalid arguments to routine
minor: Inappropriate type
15:34:05 - Finished dumping HDF5 data.
My suggestion is to add
CALL h5dclose_f(dset_id, ierr) ! close the dataset
after each data-writing action. It works for me.
I had exactly the same problem.
The issue was triggered only when running in parallel, when one process was trying to write to the file but another was not. Unfortunately, the error message is not really helpful. Bottom line: check your synchronisation.
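To see why a mismatched collective call fails, here is a toy sketch in plain Python: threads stand in for MPI ranks, and a threading.Barrier stands in for the collective H5Fcreate/MPI_File_open. Nothing here is HDF5; it only illustrates that when one rank skips a collective operation, every rank that did make the call fails.

```python
import threading

def collective_open(n_ranks, participates):
    """Simulate a collective file open: every rank must call it.

    `participates(i)` decides whether rank i makes the call; ranks whose
    barrier wait times out (because a peer skipped the call) are returned.
    """
    barrier = threading.Barrier(n_ranks)
    failed = []

    def rank(i):
        if not participates(i):
            return  # this rank never calls the "collective" open
        try:
            barrier.wait(timeout=0.5)  # stand-in for MPI_File_open
        except threading.BrokenBarrierError:
            failed.append(i)

    threads = [threading.Thread(target=rank, args=(i,)) for i in range(n_ranks)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sorted(failed)

print(collective_open(4, lambda i: True))    # all ranks call it: []
print(collective_open(4, lambda i: i != 0))  # rank 0 skips: [1, 2, 3]
```

In real code, the analogue is making sure every rank in the communicator reaches the H5Fcreate/H5Dwrite calls, since parallel HDF5 treats them as collective, even for ranks that have no data of their own to contribute.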
I am trying to set up SonarQube for all my C++ projects. It works fine for every project except one.
This project is the only one that has a git submodule in the first place.
I have an src folder to analyze with this script:
#!/bin/bash
curl -sSLo build-wrapper-linux-x86.zip https://openshift.lumiplan.com/sonarqube/static/cpp/build-wrapper-linux-x86.zip
unzip -o build-wrapper-linux-x86.zip
chmod +x build-wrapper-linux-x86/build-wrapper-linux-x86-64
build-wrapper-linux-x86/build-wrapper-linux-x86-64 --out-dir bw-output make
ls
cat bw-output/build-wrapper-dump.json
cd ../../src
wget https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-4.2.0.1873-linux.zip
mkdir sonar-scanner
unzip sonar-scanner-cli-4.2.0.1873-linux.zip
ls
mv sonar-scanner-4.2.0.1873-linux/* sonar-scanner/
rm sonar-scanner/conf/sonar-scanner.properties
cp ../sonar-project.properties sonar-scanner/conf/sonar-scanner.properties
sonar-scanner/bin/sonar-scanner -Dsonar.scm.exclusions.disabled=true -Dsonar.cfamily.build-wrapper-output=../projects/linux-arm/bw-output -Dsonar.projectKey=products_panel_lumiplay-engine_AYKR3Op6FUISUZul32ck -Dsonar.sources=. -Dsonar.login=14214a83196d3ae8f3507cff910c5da818e280a2 -Dsonar.host.url=https://openshift.lumiplan.com/sonarqube -Dsonar.branch.name=$1
But the src folder has the following content:
an "api_asrcore" folder, which is a git submodule
an "api_boost" folder, which is a git submodule
and all my code in various other folders, which are not git submodules
The logs:
INFO: Using 1 thread for analysis according to value of "sonar.cfamily.threads" property.
15:59:20 WARN: Invalid character encountered in file /usr/src/lumiplan/src/specific/configuration_specific/source/configuration_specific.cpp at line 121 for encoding US-ASCII. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
15:59:20 WARN: Invalid character encountered in file /usr/src/lumiplan/src/specific/protocol/public/msg_plot_idbrndef.h at line 157 for encoding US-ASCII. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
15:59:20 WARN: Invalid character encountered in file /usr/src/lumiplan/src/specific/review/source/review_application_control.cpp at line 152 for encoding US-ASCII. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
15:59:20 WARN: Invalid character encountered in file /usr/src/lumiplan/src/specific/protocol/source/msg_plot_idbrndef.cpp at line 579 for encoding US-ASCII. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
15:59:20 WARN: Invalid character encountered in file /usr/src/lumiplan/src/specific/decodage/source/decodage_plot.cpp at line 294 for encoding US-ASCII. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
15:59:20 WARN: Invalid character encountered in file /usr/src/lumiplan/src/specific/decodage/public/decodage_plot.h at line 164 for encoding US-ASCII. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
INFO: PCH: unique=0 use=0 (forceInclude=0,throughHeader=0,firstInclude=0) out of 0 (forceInclude=0,throughHeader=0)
15:59:20 INFO: SE: 0 out of 0
15:59:20 INFO: Z3 refutation rate: 0 out of 0
15:59:20 INFO: Subprocess(es) done in 46ms
15:59:20 INFO: 0 compilation units analyzed
15:59:20 INFO: ------------------------------------------------------------------------
15:59:20 INFO: EXECUTION FAILURE
15:59:20 INFO: ------------------------------------------------------------------------
15:59:20 INFO: Total time: 48.891s
15:59:20 INFO: Final Memory: 63M/220M
15:59:20 INFO: ------------------------------------------------------------------------
15:59:20 ERROR: Error during SonarQube Scanner execution
15:59:20 java.lang.IllegalStateException: The "build-wrapper-dump.json" file was found but 0 C/C++/Objective-C files were analyzed.
In the other projects there is at least one folder that is not a git submodule before "api_asrcore". In the following example, there are "action_manager" and "annoncesonore_html" folders before the submodules.
The logs:
INFO: Using 1 thread for analysis according to value of "sonar.cfamily.threads" property.
14:49:17 WARN: Invalid character encountered in file /usr/src/lumiplan/src/specific/decodage/source/decodage_plot.cpp at line 310 for encoding US-ASCII. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
14:49:17 WARN: Invalid character encountered in file /usr/src/lumiplan/src/specific/protocol/source/msg_plot_idbrndef.cpp at line 632 for encoding US-ASCII. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
14:49:17 WARN: Invalid character encountered in file /usr/src/lumiplan/src/specific/protocol/public/msg_plot_idbrndef.h at line 165 for encoding US-ASCII. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
14:49:17 WARN: Invalid character encountered in file /usr/src/lumiplan/src/specific/decodage/public/decodage_plot.h at line 166 for encoding US-ASCII. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
14:49:17 WARN: Invalid character encountered in file /usr/src/lumiplan/src/specific/synchronisation/source/synchronisation.cpp at line 68 for encoding UTF-8. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
14:49:17 WARN: Invalid character encountered in file /usr/src/lumiplan/src/specific/configuration_specific/source/configuration_specific.cpp at line 216 for encoding US-ASCII. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
14:49:17 WARN:
14:49:17 CFamily plugin supports incremental analysis with the use of a cache:
14:49:17
14:49:17 * if you do not want to enable cache
14:49:17 please explicitly disable it
14:49:17 by setting the following property to your analysis:
14:49:17 sonar.cfamily.cache.enabled=false
14:49:17
14:49:17 * to enable cache please specify the following 2 options:
14:49:17 sonar.cfamily.cache.enabled=true
14:49:17 sonar.cfamily.cache.path=relative_or_absolute_path_to_cache_location
14:49:17
14:49:17 * visit the documentation page for more information
14:49:17 https://openshift.lumiplan.com/sonarqube/documentation/analysis/languages/cfamily/
14:49:17
14:49:17 INFO: [pool-3-thread-1] /usr/src/lumiplan/projects/linux-arm/../../src/specific/action_manager/source/action_handler.cpp
14:49:17 INFO: [pool-3-thread-1] /usr/src/lumiplan/projects/linux-arm/../../src/specific/annoncesonore_html/sources/annoncesonore.cpp
14:49:17 INFO: PCH: unique=0 use=0 (forceInclude=0,throughHeader=0,firstInclude=0) out of 2 (forceInclude=0,throughHeader=0)
14:49:17 INFO: SE: 2 out of 2
14:49:17 INFO: Z3 refutation rate: 0 out of 3
In this case the Sonar analysis completes. Do you know why? (There is no project-specific sonar-project.properties file in either case.)
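One thing worth checking first: "0 C/C++/Objective-C files were analyzed" usually means the scanner could not match any compile command captured in build-wrapper-dump.json to the sources it indexed. A small sketch for inspecting the dump (I am assuming a top-level "captures" array, which is what recent build-wrapper versions emit; adjust the key if your dump differs):

```python
import json

def summarize_build_wrapper_dump(path):
    """Count the compile commands the build wrapper captured.

    Assumes the dump's top-level 'captures' array (an assumption about the
    build-wrapper format). If nothing was captured, the scanner has nothing
    to match against sonar.sources and reports 0 files analyzed.
    """
    with open(path) as f:
        dump = json.load(f)
    captures = dump.get("captures", [])
    return len(captures), [c.get("cwd") for c in captures]

# Usage (hypothetical path):
# n, cwds = summarize_build_wrapper_dump("bw-output/build-wrapper-dump.json")
# print(n, "compile commands captured; working dirs:", cwds[:5])
```

If the count is zero, the problem is on the make/build-wrapper side; if it is non-zero, compare the captured working directories against the path you pass as sonar.sources.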
I need to use Japanese characters with vector searches in Django / Postgres.
I am trying to install django-pgroonga and keep getting the same encoding error from cp1252.py:
PS C:\JGRAM\JLPT> pip install django-pgroonga
Collecting django-pgroonga
Using cached django-pgroonga-0.0.1.tar.gz (3.7 kB)
Preparing metadata (setup.py) ... error
error: subprocess-exited-with-error
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [10 lines of output]
Traceback (most recent call last):
File "<string>", line 2, in <module>
File "<pip-setuptools-caller>", line 34, in <module>
File "C:\Users\61458\AppData\Local\Temp\pip-install-21w4o7u8\django-pgroonga_87013717bf0e4bcca83db91a993082b4\setup.py", line 17, in <module>
long_description=read('README.rst'),
File "C:\Users\61458\AppData\Local\Temp\pip-install-21w4o7u8\django-pgroonga_87013717bf0e4bcca83db91a993082b4\setup.py", line 6, in read
return open(os.path.join(os.path.dirname(__file__), fname)).read()
File "C:\Users\61458\AppData\Local\Programs\Python\Python310\lib\encodings\cp1252.py", line 23, in decode
return codecs.charmap_decode(input,self.errors,decoding_table)[0]
**UnicodeDecodeError: 'charmap' codec can't decode byte 0x81 in position 112: character maps to <undefined>**
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
PS C:\JGRAM\JLPT>
Can you help? I cannot find a solution online that explains how to resolve this error when it occurs during installation. I have tried updating the cp1252.py file, copying and pasting new versions, and so on, but nothing works. I have also tried downloading unzipped pgroonga into Python's site-packages folder, with no luck. (All the other modules I have installed with pip before this ran successfully.)
Is the problem pgroonga?
If so, is there another module / tool that will solve the vector search with Japanese characters requirement?
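For what it's worth, the traceback itself pins down the mechanism: setup.py calls open() on README.rst without an encoding argument, so on Windows Python falls back to the ANSI code page (cp1252), and byte 0x81 has no mapping there. Patching cp1252.py cannot fix that, because the README bytes are UTF-8, not cp1252. A minimal reproduction of the decode failure:

```python
# Byte 0x81 is one of the few positions cp1252 leaves undefined, and it is
# exactly the byte the pip traceback complains about.
raw = b"\xe6\x97\xa5\x81"  # UTF-8 bytes for a kanji, plus a stray 0x81

try:
    raw.decode("cp1252")
    failed = False
except UnicodeDecodeError as exc:
    failed = True
    print(exc.reason)  # character maps to <undefined>

assert failed
print(b"\xe6\x97\xa5".decode("utf-8"))  # the same bytes are valid UTF-8
```

On Windows, setting the environment variable PYTHONUTF8=1 (PEP 540's UTF-8 mode, inherited by the pip build subprocess) makes open() default to UTF-8, which is often enough to let such a package's setup.py run: `set PYTHONUTF8=1` (or `$env:PYTHONUTF8 = 1` in PowerShell) before `pip install django-pgroonga`.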
I have an issue when using the OpenCV dnn module.
Here are my settings:
Built OpenCV 4.5.5 with the extra module opencv_contrib-4.x (cloned from GitHub)
Downloaded EDSR_x4.pb and EDSR_x3.pb from EDSR_tensorflow
Moved the .pb files to the root directory of my project
However, no matter whether I use a relative or an absolute path, readModel() fails for both of these:
DnnSuperResImpl sr;
sr.readModel ("EDSR_X4.pb");
sr.readModel ("C:\\Users\\user\\source\\repos\\ZBAR_OpenCV4.5\\ZBAR_OpenCV4.5\\EDSR_x3.pb");
Results from running the dir command:
C:\Users\user\source\repos\ZBAR_OpenCV4.5\ZBAR_OpenCV4.5
2022/06/13 afternoon 01:39 201,562 EDSR_x3.pb
2022/06/13 afternoon 10:40 206,908 EDSR_x4.pb
2022/06/07 afternoon 11:37 3,124 ZBAR_OpenCV4.5.cpp
2022/06/07 afternoon 11:37 544 ZBAR_OpenCV4.5.h
2022/06/13 afternoon 02:05 8,719 ZBAR_OpenCV4.5Dlg.cpp
2022/06/13 afternoon 11:37 1,627 ZBAR_OpenCV4.5Dlg.h
Exception printed out:
code= -2
err= FAILED: ReadProtoFromBinaryFile(param_file, param) Failed to parse GraphDef file: EDSR_x4.pb
func= cv::dnn::ReadTFNetParamsFromBinaryFileOrDie
line= 42
msg= OpenCV(4.5.5) C:\OpenCV4.5\opencv\sources\modules\dnn\src\tensorflow\tf_io.cpp:42: error: (-2:Unspecified error) FAILED: ReadProtoFromBinaryFile(param_file, param). Failed to parse GraphDef file: EDSR_x4.pb in function 'cv::dnn::ReadTFNetParamsFromBinaryFileOrDie'
what= OpenCV(4.5.5) C:\OpenCV4.5\opencv\sources\modules\dnn\src\tensorflow\tf_io.cpp:42: error: (-2:Unspecified error) FAILED: ReadProtoFromBinaryFile(param_file, param). Failed to parse GraphDef file: EDSR_x4.pb in function 'cv::dnn::ReadTFNetParamsFromBinaryFileOrDie'
The solution is really simple.
Just check whether EDSR_x4.pb is actually a .pb file or an HTML page; I had incorrectly downloaded the latter.
So I downloaded the file from GitHub again, and it worked.
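If you want to catch this without opening the file in an editor, sniffing the leading bytes is enough: an accidental "Save as" on the GitHub blob page stores an HTML document, while a genuine .pb (serialized GraphDef) starts with binary protobuf tag bytes. A small helper (the function name is mine):

```python
def looks_like_html(path):
    """Return True if the file begins like an HTML page rather than binary data."""
    with open(path, "rb") as f:
        head = f.read(256).lstrip()
    # A saved GitHub blob page starts with a doctype or <html> tag.
    return head.lower().startswith((b"<!doctype", b"<html"))

# Usage:
# if looks_like_html("EDSR_x4.pb"):
#     print("This is a web page, not the model; re-download the raw file.")
```

Re-downloading via the repository's "Raw" link (rather than saving the blob page) avoids the problem in the first place.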
I am trying to build Tensorboard (1.13.1) from source. I am using Bazel version 0.26.0 (built from source) and JDK version 11.0.3. I am getting the following error during the build:
# bazel build --incompatible_disallow_filetype=false --incompatible_bzl_disallow_load_after_statement=false tensorboard
Starting local Bazel server and connecting to it...
ERROR: /root/.cache/bazel/_bazel_root/4e113d18791d4c114d32fe59cdd54b1a/external/io_bazel_rules_closure/closure/compiler/closure_js_library.bzl:343:17: Traceback (most recent call last):
File "/root/.cache/bazel/_bazel_root/4e113d18791d4c114d32fe59cdd54b1a/external/io_bazel_rules_closure/closure/compiler/closure_js_library.bzl", line 335
rule(implementation = _closure_js_lib..., <2 more arguments>)
File "/root/.cache/bazel/_bazel_root/4e113d18791d4c114d32fe59cdd54b1a/external/io_bazel_rules_closure/closure/compiler/closure_js_library.bzl", line 343, in rule
attr.label_list(cfg = "data", allow_files = True)
cfg must be either 'host' or 'target'.
ERROR: error loading package '': in /root/.cache/bazel/_bazel_root/4e113d18791d4c114d32fe59cdd54b1a/external/io_bazel_rules_closure/closure/defs.bzl: Extension file 'closure/compiler/closure_js_library.bzl' has errors
ERROR: error loading package '': in /root/.cache/bazel/_bazel_root/4e113d18791d4c114d32fe59cdd54b1a/external/io_bazel_rules_closure/closure/defs.bzl: Extension file 'closure/compiler/closure_js_library.bzl' has errors
INFO: Elapsed time: 14.290s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
If I search and replace the word 'data' with 'host' in the .bzl files, I start getting another error:
# bazel build --incompatible_disallow_filetype=false --incompatible_bzl_disallow_load_after_statement=false tensorboard
ERROR: error loading package '': in /root/.cache/bazel/_bazel_root/4e113d18791d4c114d32fe59cdd54b1a/external/org_tensorflow/tensorflow/workspace.bzl: Label '@org_tensorflow//third_party:nccl/nccl_configure.bzl' crosses boundary of subpackage '@org_tensorflow//third_party/nccl' (perhaps you meant to put the colon here: '@org_tensorflow//third_party/nccl:nccl_configure.bzl'?)
ERROR: error loading package '': in /root/.cache/bazel/_bazel_root/4e113d18791d4c114d32fe59cdd54b1a/external/org_tensorflow/tensorflow/workspace.bzl: Label '@org_tensorflow//third_party:nccl/nccl_configure.bzl' crosses boundary of subpackage '@org_tensorflow//third_party/nccl' (perhaps you meant to put the colon here: '@org_tensorflow//third_party/nccl:nccl_configure.bzl'?)
INFO: Elapsed time: 15.377s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
Correcting the error by replacing
load("//third_party:nccl/nccl_configure.bzl", "nccl_configure")
with
load("//third_party/nccl:nccl_configure.bzl", "nccl_configure")
in the cache file
/root/.cache/bazel/_bazel_root/4e113d18791d4c114d32fe59cdd54b1a/external/org_tensorflow/tensorflow/workspace.bzl
solves that error, but now I get this one:
# bazel build --incompatible_disallow_filetype=false --incompatible_bzl_disallow_load_after_statement=false tensorboard
ERROR: /root/.cache/bazel/_bazel_root/4e113d18791d4c114d32fe59cdd54b1a/external/org_tensorflow/tensorflow/workspace.bzl:18:1: file '@io_bazel_rules_closure//closure:defs.bzl' does not contain symbol 'filegroup_external'
ERROR: error loading package '': Extension file 'tensorflow/workspace.bzl' has errors
ERROR: error loading package '': Extension file 'tensorflow/workspace.bzl' has errors
INFO: Elapsed time: 1.113s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
Now I am stuck. Any pointers would be highly appreciated!
I have installed Wikidata and Blazegraph locally, following the instructions here: https://github.com/wikimedia/wikidata-query-rdf/blob/master/docs/getting-started.md
The munge step produced 424 files such as "wikidump-000000001.ttl.gz" in data/split.
When I try to load one of these files into Blazegraph, I get an error:
ERROR:
uri=[file:/mnt/d/thomas/wikidata/dist/target/service-0.2.1/data/split/wikidump-000000001.ttl.gz],
context-uri=[] java.util.concurrent.ExecutionException:
org.openrdf.rio.RDFParseException: Expected an RDF value here, found
'' [line 1]
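One quick sanity check before blaming the loader: confirm that the split file actually decompresses to Turtle. "Expected an RDF value here, found '' [line 1]" is what the parser reports when line 1 is not Turtle at all (for example, an empty or truncated file, or compressed bytes handed to the parser without decompression). A small sketch, using the path from the error message:

```python
import gzip

def first_line(path):
    """Decompress a munged split file and return its first line, which for
    munger output should be a Turtle @prefix declaration."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return f.readline().strip()

# Usage (path taken from the error message):
# print(first_line("data/split/wikidump-000000001.ttl.gz"))
```

If the first line is empty or looks like binary garbage, the problem is in the munge output or in how the file is being handed to Blazegraph, not in the data itself.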