Public Releases

JHDF5 19.04.0 (2019-04-29)

CORE

The big change of this major release is the internal reworking of the JNI layer, which is now fully in sync with the JNI layer from The HDF Group.

Prerequisites and Versions

  • Upgrade the native HDF5 and the JNI library to 1.10.5.
  • Drop support for Intel 32bit platforms (Linux, MacOS X and Windows).
  • Requires JRE 8 (tested with OpenJDK 8 and 11).

New Features

  • Add HDF5DataSet and the method IHDF5ObjectReadOnlyInfoProviderHandler.openDataSet(), as well as methods for block-wise reading (readArrayBlock() and readArrayBlockWithOffset()) and writing (writeArrayBlock() and writeArrayBlockWithOffset()) of primitive values using HDF5DataSet to improve performance (see the sketch after this list).
  • Add methods tryGetExternalLinkFilename() and tryGetExternalLinkTarget() to IHDF5ObjectReadOnlyInfoProviderHandler.
  • Add methods for creating and using HDF5DataSetTemplate in writer.
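
The following sketch illustrates how the new HDF5DataSet handle might be used for repeated block-wise reads. The names openDataSet(), readArrayBlock() and readArrayBlockWithOffset() come from the entries above; the int32() accessor, the overload taking an HDF5DataSet, the assumption that HDF5DataSet is closeable, and the file and data set paths are illustrative assumptions rather than confirmed API.

    import ch.systemsx.cisd.hdf5.HDF5DataSet;
    import ch.systemsx.cisd.hdf5.HDF5Factory;
    import ch.systemsx.cisd.hdf5.IHDF5Reader;

    public class BlockwiseReadSketch
    {
        public static void main(String[] args)
        {
            // IHDF5Reader extends java.io.Closeable, so try-with-resources works on JRE 8.
            try (IHDF5Reader reader = HDF5Factory.openForReading("example.h5"))
            {
                // Open the data set once and reuse the handle for all block reads; avoiding a
                // re-open per block is the performance gain described above.
                // (Assumed: HDF5DataSet is closeable and readArrayBlock() has an overload
                // taking an HDF5DataSet, a block size and a block number.)
                try (HDF5DataSet dataSet = reader.object().openDataSet("/measurements"))
                {
                    final int blockSize = 1024;
                    for (long block = 0; block < 4; ++block)
                    {
                        final int[] data = reader.int32().readArrayBlock(dataSet, blockSize, block);
                        System.out.println("block " + block + ": " + data.length + " values");
                    }
                }
            }
        }
    }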

Changes

  • All previously deprecated methods have been removed.

JHDF5 18.09.0-pre1 (2018-08-19)

CORE

This is a pre-release of the new major version based on HDF5 1.10. It is intended for testing by the community.

JHDF5 14.12.6 (2016-04-30)

CORE

Bug fixes

  • Fix a segfault on writing a compound with variable-length strings (thanks to Tim Patterson for reporting this bug).

JHDF5 14.12.5 (2016-03-06)

TOOLS

Bug fixes

  • When reading an empty HDF5 dataset, HDF5DataSetRandomAccessFile was throwing an exception. This is now fixed.

JHDF5 14.12.4 (2016-02-27)

CORE

Bug fixes

  • Fix an error where the HDF5Reader or HDF5Writer could be garbage collected, closing the file, while specialized readers and writers were still open and in use (thanks to Stephan Saalfeld for reporting this bug).

JHDF5 14.12.3 (2016-02-21)

CORE

Prerequisites and Versions

  • Update the HDF5 library to 1.8.16 with a patch for bug HDFFV-9670. The patch prevents the library from segfaulting on HDF5 files whose internal B-tree structures are corrupted.

JHDF5 14.12.2 (2015-12-31)

CORE

Bug fixes

  • When initialized from several threads in parallel, JHDF5 would deadlock. This is fixed (thanks to Roman Chernyatchik for reporting this bug).

JHDF5 14.12.1 (2015-02-22)

CORE

Bug fixes

  • HDF5UnsignedXXXReaders: readMDArraySlice(): handle special case of no free indices correctly

  • HDF5UnsignedXXXReaders: readMDArrayBlockOfArrays(): handle negative indices correctly (set the full dimension)

  • Change visibility of static instances of DataTypeInfoOptions from default to public

  • Fix the value returned by HDF5CompoundMemberInformation.getOffsetInMemory(), which did not take padding into account in all cases.

ARCHIVER

Bug fixes

  • Fix CRC32 calculation for archive listings which differed from previous versions

  • Windows: Fix path problem in h5ar.bar

  • Windows: Fix error when specifying archive paths for some operations

JHDF5 14.12.0 (2014-12-22)

CORE

Prerequisites and Versions

  • Upgrade the native HDF5 library to 1.8.14.
  • Upgrade the JHI wrapper to HDF-Java 2.10.1.
  • Drop support for Oracle Solaris (SPARC / x86) platforms.
  • Add support for arm-Linux platform.

New features

  • Add methods for reading and writing slices of primitive hyper-arrays.
  • Support for reading unsigned integer values and using them as compound members (see the sketch after this list).
  • Support for variable-length strings as compound members.
  • Support for data set references as compound members.
  • Support for reading compounds with a compound type not equal to the type of the compound in the HDF5 file (by setting requireTypesToBeEqual=false when creating the HDF5CompoundType).
  • Let the user control whether attributes should use simple data spaces rather than primitive data types.
  • Create an API abstraction for enumeration types with no link to an HDF5 file (EnumerationType).
  • In primitive readers' IHDF5XXXReader.readMDArrayBlockWithOffset() method, allow a value of -1 as blockDimension which will be replaced with the full dimension along this axis.
  • Allow reading of data sets with simple (dimensional) data spaces and array types of primitive types as elements. This is supported by IHDF5XXXReader.readMDArray(), where space indices and array indices are concatenated (in this order). Note that JHDF5 cannot write such data sets; it can only read them if other programs wrote them.
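
As a rough illustration of the unsigned integer support, the sketch below writes and reads an unsigned 32-bit array through the new uintXXX() accessors listed under "New API classes and methods". The uint32() accessor follows that entry; the writeArray()/readArray() overloads, the file name and the manual widening to long are assumptions for illustration.

    import java.io.File;

    import ch.systemsx.cisd.hdf5.HDF5Factory;
    import ch.systemsx.cisd.hdf5.IHDF5Reader;
    import ch.systemsx.cisd.hdf5.IHDF5Writer;

    public class UnsignedSketch
    {
        public static void main(String[] args)
        {
            final File file = new File("unsigned.h5");
            try (IHDF5Writer writer = HDF5Factory.open(file))
            {
                // -1 in the signed Java int is stored as the unsigned value 4294967295.
                writer.uint32().writeArray("/counters", new int[] { 1, -1 });
            }
            try (IHDF5Reader reader = HDF5Factory.openForReading(file))
            {
                for (int value : reader.uint32().readArray("/counters"))
                {
                    // Java has no unsigned int, so widen manually to print the unsigned value.
                    System.out.println(value & 0xFFFFFFFFL);
                }
            }
        }
    }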

Changes

  • Remove all symbols which are not JNI symbols from the native Linux libraries. This enables JHDF5 to be used in the same process as another HDF5 version (e.g. MATLAB).
  • Add gradle build script.
  • HDF5CompoundInformationRetriever: Treat empty enum type names like null (used to be an error condition).
  • Enforce memory alignment for in-memory representations of HDF5 compounds.
  • Deprecate IHDF5UnsignedXXXWriter and make IHDF5Writer return IHDF5XXXWriter for the corresponding unsigned writers.

Bug fixes

  • Fix a bug in inferring the length of a string in compound().getInferredType().

New API classes and methods

  • Method HDF5CompoundMemberMapping.unsigned() and annotation CompoundElement.unsigned().
  • Methods for reading unsigned integer values (IHDF5Reader.uintXXX()).
  • Class EnumerationType.
  • Method HDF5EnumerationType.getEnumerationType().
  • Method IHDF5ObjectReadOnlyInfoProviderHandler.getElementSize().
  • Fluent configuration method IHDF5WriterConfiguration.useSimpleDataSpaceForAttributes() which enforces attributes to use simple data spaces with primitive data types rather than scalar data spaces with array data types.
  • Methods IHDF5XXXReader.readSlicedMDArray(), IHDF5XXXReader.readSlicedMDArrayBlock() and IHDF5XXXReader.readSlicedMDArrayBlockWithOffset() for reading slices of multi-dimensional arrays.
  • Add methods getSpaceRank(), getSpaceDimensions(), getArrayRank(), getArrayDimension(), getRank() and getDimensions() to IHDF5ObjectReadOnlyInfoProviderHandler.
  • Method HDF5DataTypeInformation.getRank().
  • IHDF5CompoundInformationRetriever.getInferredType() and HDF5CompoundMemberMapping.inferMapping() gain method variants that take an HDF5CompoundMappingHints argument.
  • IHDF5CompoundInformationRetriever: getType(), getInferredType(), getDataSetType(), getAttributeType(), getNamedType(): add methods with argument requireTypesToBeEqual to support reading of compounds which have a type not equal to the type created by those methods.
  • Method HDF5CompoundMappingHints.enumTypeMapping(Map<String, HDF5EnumerationType>).

Non-backward compatible API changes

  • Remove method calls deprecated in 11.05 (HDF5CompoundMemberMapping.mapping() multiple methods) and 11.09 (HDF5TimeDuration.getDuration(), IHDF5SimpleWriter.writeTimeDuration() and writeTimeDurationArray(), and HDF5DataSetRandomAccessFile.createXXX() multiple methods).
  • HDF5DataTypeInformation.getDimensions() now returns an empty int[] if the data type is scalar (to be consistent with HDF5DataSetInformation.getDimensions()). Code that calls getDimensions()[0] on scalar data types to determine the number of elements may fail and has to be changed to call getNumberOfElements().

ARCHIVER

No changes.

JHDF5 13.06.2 (2013-08-10)

CORE

Bug fixes

  • Fix for a thread-safety issue.

ARCHIVER

Bug fixes

  • Fix for a thread-safety issue.

JHDF5 13.06.1 (2013-07-17)

CORE

Bug fixes

  • Replace com.sun.xml.internal.ws.Closeable by java.io.Closeable in IHDF5SimpleReader.

JHDF5 13.06.0 (2013-06-22)

CORE

Prerequisites and Versions

  • Upgrade the native HDF5 library to 1.8.11.
  • Upgrade the JHI wrapper to HDF-Java 2.9.
  • The minimum supported Java version is now Java 6.
  • Drop support for RHEL 4 Linux. The minimum Linux version is now RHEL 5.
  • Drop support for Mac OS X 10.4 and 10.5. The minimum Mac OS X version is now 10.6.

New features

  • Support for conveniently working with unsigned integer values.
  • Support for reading raw strings, i.e. for not interpreting \0 as string termination character.
  • Support for shuffle filters.
  • Support for configurable house-keeping suffix; the naming pattern of house-keeping objects used to be hard-coded to: '__<NAME>__'.
  • Support for providing member to enum type mappings in compound mapping hints.

Changes

  • The biggest change in this release is that the library now uses the hierarchical "quasi-fluent" API for all functions. The flat namespace in IHDF5Reader and IHDF5Writer has been kept for backward compatibility but is deprecated and will be removed in an upcoming release. Exceptions are all methods in IHDF5SimpleReader and IHDF5SimpleWriter, which are not deprecated.
  • Do not call System.exit() if the native HDF5 library cannot be found, instead throw an UnsupportedOperationException.
  • Make IHDF5SimpleReader extend java.io.Closeable.

  • For the storage features, replace the flags deleteIfExists and keepIfExists with a DataSetReplacementPolicy.
  • Deprecate the string writer methods in IHDF5SimpleWriter which take an explicit length argument. This is an advanced call and should not be in the simple writer interface.

Bug fixes

  • Fix reading of string datasets with IHDF5OpaqueReader.readArray(). The block-wise methods still cannot read string data sets.
  • Various bug fixes for working with UTF8 strings. In particular the library now retrieves the character type of stored strings from the file rather than from the configuration.
  • Ensure we create strings of the given length as compound members (rather than given length + 1).
  • Make IBooleanWriter.createBitFieldArray() respect the given array size.

New API classes and methods

  • IHDF5StringReader: methods for "raw" reading of strings, like readRaw(), where "raw" means not taking \0 as the termination character (see the sketch after this list).
  • IHDF5Unsigned*Writer interfaces for all integer data types. They are available as uintNN() from IHDF5Writer.
  • UnsignedIntUtils for converting between signed and unsigned integers.
  • IHDF5TimeDurationReader / IHDF5TimeDurationWriter: add methods for multi-dimensional time duration array attributes and data sets.
  • IHDF5DateTimeReader / IHDF5DateTimeWriter: add methods for multi-dimensional time stamp array attributes and data sets.
  • Add HDF5*StorageFeatureBuilder providing a fluent API for constructing any combination of storage features.
  • Add IHDF5Reader.isClosed().
  • Add IHDF5Reader.isHouseKeepingObject().
  • Add HDF5CompoundType.getEnumTypeMap().

  • Add creation of storage features from templates using the build() method.
  • Add methods to IBooleanReader and IBooleanWriter for reading and writing bit field arrays, represented as BitSet[], both in total and block-wise.
  • To HDF5DataTypeInformation, add the methods getRawDataClass(), isEnum() and isBitField(), and change getDataClass() to take the type variant into account (e.g. a scaled bit field now returns HDF5DataClass.BITFIELD).

  • Add the methods rank() and size() to HDF5EnumerationValueArray and HDF5EnumerationValueMDArray.
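
A small sketch of the raw string reading: readRaw() is the method named above for IHDF5StringReader, while the string() accessor, the length-taking write() overload and the file name are assumptions used only for illustration.

    import java.io.File;

    import ch.systemsx.cisd.hdf5.HDF5Factory;
    import ch.systemsx.cisd.hdf5.IHDF5Reader;
    import ch.systemsx.cisd.hdf5.IHDF5Writer;

    public class RawStringSketch
    {
        public static void main(String[] args)
        {
            final File file = new File("strings.h5");
            try (IHDF5Writer writer = HDF5Factory.open(file))
            {
                // Assumed: a write() overload that creates a fixed-length string data set,
                // so the stored value is \0-padded up to the given length.
                writer.string().write("/greeting", "hi", 16);
            }
            try (IHDF5Reader reader = HDF5Factory.openForReading(file))
            {
                // The regular read stops at the first \0 ...
                System.out.println(reader.string().read("/greeting").length());
                // ... while readRaw() does not treat \0 as a termination character.
                System.out.println(reader.string().readRaw("/greeting").length());
            }
        }
    }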

Non-backward compatible API changes

  • Change all createMatrix() methods to behave like their createMDArray() counterparts.

ARCHIVER

New features

  • Add support for compression blacklists.
  • Add support for checking missing files in the VERIFY operation (Option '-m').

Changes

  • Set the housekeeping suffix for the archiver to \1\0, which can never be part of a filename.
  • Write directory indices as compressed fixed-length strings rather than variable-length strings, as variable-length string reading turned out to sometimes crash the JVM.
  • Change default for archiver to compress files that are not yet known to be compressed.
  • Provide a more informative error message on archive test failure.

Bug fixes

  • HDF5ArchiverFactory.open() which takes an IHDF5Writer now returns IHDF5Archiver.
  • Do not throw an exception if IHDF5Archiver.close() is called twice.
  • Fix a corner-case condition where archiving would hang indefinitely.
  • On archive extraction, the IHDF5ArchiveEntryVisitor is now also called for directories as they are extracted.
  • Fix several false alarms on verification of archives related to symbolic links.
  • When extracting a file from an archive without a directory index or with a broken index, older versions of the archiver raised a false alarm about a CRC32 checksum failure.
  • Make h5ar.sh also work when the script is called with a relative path.

New API classes and methods

  • IHDF5ArchiveReader.extractToFilesystemBelowDirectory() to extract paths below a given directory in the archive.

  • IHDF5ArchiveReader.extractToFilesystem() to extract the whole archive.
  • IHDF5ArchiveReader.isClosed().
  • IHDF5Archiver: methods to archive (and verify) directories from the filesystem with an arbitrary root directory in the archive.
  • IHDF5Archiver: methods archiveFromFilesystemBelowDirectory() to archive the directory content at an artificial root (see the sketch after this list).
  • Interface IHDF5ArchiveInfoProvider is now public.
  • ListParameters: resolveSymbolicLinks() and followSymbolicLinks().
  • ArchiveEntry: methods getRealPath(), getRealParentPath() and getRealName().
  • ArchiveEntry.hasChecksum()
  • ArchivingStrategy: methods compressAll(), seal(), isCompressAll() and isSealed().
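
The archiver additions above could be combined roughly as follows. HDF5ArchiverFactory, IHDF5Archiver.close() and archiveFromFilesystemBelowDirectory() appear elsewhere in these notes; the parameter order, the paths and the reading of the first argument as the root inside the archive are assumptions.

    import java.io.File;

    import ch.systemsx.cisd.hdf5.h5ar.HDF5ArchiverFactory;
    import ch.systemsx.cisd.hdf5.h5ar.IHDF5Archiver;

    public class ArchiverSketch
    {
        public static void main(String[] args)
        {
            final IHDF5Archiver archiver = HDF5ArchiverFactory.open(new File("backup.h5ar"));
            try
            {
                // Archive the content of the local directory "data" below an artificial
                // root inside the archive (parameter order is an assumption).
                archiver.archiveFromFilesystemBelowDirectory("/snapshots/2013-06", new File("data"));
            } finally
            {
                archiver.close();
            }
        }
    }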

Non-backward compatible API changes

  • IHDF5Archiver: remove all methods that take an IPathVisitor argument and add methods with IArchiveEntryVisitor where it makes sense.
  • IHDF5ArchiveInfoProvider.list(): exclude the top-level directory from the listing and add a ListParameter to ask for including it.
  • Rename IPathVisitor to IArchiveEntryVisitor.

JHDF5 12.02.3 (2013-01-03)

Bug fixes

  • Fix detection of JDK 1.7 on Mac OS X
  • Fix HDF-Java on Windows (add forgotten object file nativeData.obj)

JHDF5 12.02.2 (2012-02-20)

Changes

  • Deprecate methods in IHDF5CompoundBasicWriter which are not part of IHDF5SimpleWriter

JHDF5 12.02.1 (2012-02-19)

New features

  • Method for cloning a compound data type from a different file
  • Methods for creating anonymous (non-committed) compound data types
  • HDF5DataType now has methods to get the data type information and the data type path, if any

Bug fixes

  • Loosen type checks when creating a compound data type such that HDF5 numeric conversions can be applied again

JHDF5 12.02.0 (2012-02-14)

New features

  • Support for reading and writing compound, compound array and compound MD array attributes
  • New API for reading and writing compound values, available via IHDF5Reader.compounds() and IHDF5Writer.compounds()
  • Add methods for reading and writing enum MDArrays block-wise and for iterating over 'natural blocks'

  • Add methods for reading and writing enum MDArray attributes

  • New API for reading and writing enumeration values, available via IHDF5Reader.enums() and IHDF5Writer.enums()
  • Support for writing unsigned integers (see HDF5IntStorageFeatures)
  • Add new API for working with HDF5 archives (package ch.systemsx.cisd.hdf5.h5ar)

  • Support for flushing caches before closing an HDF5 file (IHDF5Writer.addFlushable(Flushable) and IHDF5Writer.removeFlushable(Flushable)); see the sketch after this list

  • Support automatic mapping of Java enumeration types in compound type classes
  • Support for reading object references without enforcing dereferencing the path of the object (performance improvement)
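
The Flushable hook mentioned above can be used, for example, to push an application-side cache into the file right before it is closed. The addFlushable(Flushable) signature is the one given above; the writeString() convenience call and the rest of the sketch are illustrative.

    import java.io.Flushable;
    import java.io.IOException;

    import ch.systemsx.cisd.hdf5.HDF5Factory;
    import ch.systemsx.cisd.hdf5.IHDF5Writer;

    public class FlushableSketch
    {
        public static void main(String[] args)
        {
            final IHDF5Writer writer = HDF5Factory.open(new java.io.File("cached.h5"));
            writer.addFlushable(new Flushable()
                {
                    @Override
                    public void flush() throws IOException
                    {
                        // Write any pending application-side state here; JHDF5 invokes
                        // registered Flushables before the HDF5 file is closed.
                        System.out.println("flushing application cache");
                    }
                });
            writer.writeString("/note", "hello");
            writer.close();
        }
    }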

Changes

  • Upgrade the native HDF5 library to 1.8.8
  • Upgrade the JHI wrapper to HDF-Java 2.8
  • Do not enforce any minimal chunk size (i.e. allow chunk size of 1)
  • Deprecate all old compound and enum reading and writing methods in favor of the new compounds() and enums() API

Bug fixes

  • Fix an issue with incorrectly reading elements from H5T_ARRAY types when HDF5 does automatic padding (as it does, e.g., for compound types)
  • Fix iterating over very large arrays with one dimension not fitting into a 32-bit signed integer
  • Fix an error about an incomplete mapping when inferring from a Java POJO with @CompoundType(mapAllFields=false)
  • Fix a segfault on JVM shutdown when using HDF5DataSetRandomAccessFile
  • Fix a bug that prevents the 1.8 file format from being used even when specified.

API changes

Non-backward compatible changes

  • Throw HDF5JavaException rather than NullPointerException when the compound definition is invalid

New classes and methods

  • Change IHDF5SimpleReader and IHDF5SimpleWriter: add methods for reading and writing enums
  • Add classes HDF5EnumerationValueArray and HDF5EnumerationValueMDArray and methods for reading and writing them
  • Add method IHDF5EnumTypeRetriever.getAttributeType()

  • Add method HDF5DataTypeInformation.getUsableLength()

  • Add methods HDF5CompoundMemberMapping.inferMapping(Object, Map<String, HDF5EnumerationType>) and HDF5CompoundMemberMapping.inferMapping(Object[], Map<String, HDF5EnumerationType>) that infer the length and dimension information from the template objects.
  • Add method IHDF5CompoundInformationRetriever.getDataSetType(String objectPath, Class<T> pojoClass, HDF5CompoundMemberMapping... members)

  • Add methods IHDF5BooleanReader.readBitFieldBlock(String objectPath, int blockSize, long blockNumber) and IHDF5BooleanReader.readBitFieldBlockWithOffset(String objectPath, int blockSize, long offset) and IHDF5BooleanReader.isBitSetInBitField(String objectPath, int bitIndex)
  • Add method @CompoundElement.typeName() to set the name of the data type
  • Add class HDF5CompoundMappingHints to hint in what form to return enumerations as members of a compound
  • Add class DataTypeInfoOptions to control how much information is read about a data type (performance improvement)

JHDF5 11.09.1 (2011-09-12)

New features

  • Add support for reading and writing compound array and MD array attributes.
  • Add method for reading an attribute as a byte array (IHDF5GenericReader.getAttributeAsByteArray).

Changes

  • Add automatic conversions (e.g. int -> long) for compound members when the compound is provided in a map, list or array

JHDF5 11.09.0 (2011-09-06)

New features

  • Many improvements for working with compound data sets
  • Support for using type variants in compounds and attributes
  • Add JHI5 low-level Java wrappers from the HDF Group's HDF-Java 2.7 library, now cisd-jhdf5.jar can be used as a drop-in replacement for jhdf5.jar from HDF-Java to run, e.g. HDFView

Changes

  • Add support for incomplete compound mapping, i.e. Java POJOs that do not have all corresponding fields for a compound type
  • Improve javadoc start page and add example programs
  • Call checkMappingComplete() whenever the mapping is inferred in the compound writer or computed implicitly in the compound reader
  • When an enum data type is committed and keepDataSetsIfTheyExist() is not configured for the writer, do not throw an exception, but rather move the old type to NAME__REPLACED_N and create a new enum type with NAME
  • Give better error message in HDF5DataSetRandomAccessFile when trying to write beyond the limit of a non-extendable dataset
  • In HDF5DataSetRandomAccessFile, do not read a block when initializing the buffer for writing and we have just extended the dataset (optimization)
  • In HDF5DataSetRandomAccessFile, consistently throw an IOExceptionUnchecked with an IOException that has the HDF5Exception as its cause (no more IllegalStateException)

Bug fixes

  • Fix a bug in the HDF5 core library that crashed the JRE with SIGSEGV in function H5FO_opened (known as HDF5 bug #7638)
  • Fix computation of string length for string compound member types when strings are unicode
  • Fix HDF5CompoundMemberMapping.inferMapping(), set length correctly for strings, bit sets and enumeration arrays
  • Fix HDF5CompoundMemberMapping.inferMapping(), set type variant correctly for time durations
  • Ensure a scalar data set is overwritten with the new type when keepDataSetsIfExist() is not given in configuring the writer
  • Ensure getNamedCompoundType() and getDataSetCompoundType() always use the data type id of the committed data type
  • Ensure compound data types are properly overwritten by getInferredCompoundType() if keepDataSetsIfExist() is not given in configuring the writer
  • Ensure compound data sets are properly overwritten if keepDataSetsIfExist() is not given in configuring the writer
  • Fix NPE when calling getInferredCompoundType(Class) if keepDataSetsIfTheyExist() is configured for the writer
  • Detect and reject inappropriate compound mappings when trying to read a compound
  • When inferring a compound type from a pojo in a writer configured with keepDataSetsIfTheyExist(), return the named compound type (as it is done for inferring a compound type from a pojo class)
  • Don't delete a committed data set when it is replaced, but rename it, so we do not lose the data type variants
  • When a Java POJO is annotated with @CompoundType(mapAllFields = false), then do not consider fields without a @CompoundElement annotation when checking for an incomplete mapping
  • Fix too large dataset size in HDF5DataSetRandomAccessFile when the internal buffer is larger than the operation buffer
  • Calling close() on the random access file no longer closes the underlying HDF5 reader / writer that was used to create it
  • Off-by-one error in HDF5DataSetRandomAccessFile when extending datasets

API changes

Non-backward compatible changes

  • IHDF5DateTimeReader.readTimeDuration(String) returns HDF5TimeDuration (was: long); see the sketch below
  • IHDF5DateTimeReader.readTimeDurationArray(String) returns HDF5TimeDurationArray (was: long[])

The low-level API has been replaced by HDF-Java 2.7 JHI5 and thus changed considerably.
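
Under the new return types, writing and reading back a duration looks roughly like the sketch below. readTimeDuration(String) returning HDF5TimeDuration and writeTimeDuration(String, HDF5TimeDuration) are listed in this section; the HDF5TimeDuration constructor, the HDF5TimeUnit constant and the assumption that these methods are reachable directly on IHDF5Writer / IHDF5Reader are illustrative.

    import java.io.File;

    import ch.systemsx.cisd.hdf5.HDF5Factory;
    import ch.systemsx.cisd.hdf5.HDF5TimeDuration;
    import ch.systemsx.cisd.hdf5.HDF5TimeUnit;
    import ch.systemsx.cisd.hdf5.IHDF5Reader;
    import ch.systemsx.cisd.hdf5.IHDF5Writer;

    public class TimeDurationSketch
    {
        public static void main(String[] args)
        {
            final File file = new File("durations.h5");
            final IHDF5Writer writer = HDF5Factory.open(file);
            // Assumed constructor: a value together with its time unit.
            writer.writeTimeDuration("/runtime", new HDF5TimeDuration(90, HDF5TimeUnit.SECONDS));
            writer.close();

            final IHDF5Reader reader = HDF5Factory.openForReading(file);
            // Since 11.09.0 the scalar read returns value plus unit instead of a bare long.
            final HDF5TimeDuration runtime = reader.readTimeDuration("/runtime");
            System.out.println(runtime);
            reader.close();
        }
    }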

New classes and methods

New classes
  • HDF5Factory
  • HDF5IOAdapterFactory
  • HDF5FileNotFoundException
  • HDF5TimeDurationArray
New methods
  • IHDF5Factory.isHDF5File(File)
  • IHDF5Reader.tryGetTypeVariant(String,String)
  • IHDF5Writer.setTypeVariant(String,String,HDF5DataTypeVariant)
  • IHDF5Writer.deleteTypeVariant(String,String)
  • IHDF5DateTimeReader.getDateAttribute(String,String)
  • IHDF5DateTimeReader.getTimeStampAttribute(String,String)
  • IHDF5DateTimeReader.isTimeStamp(String,String)
  • IHDF5DateTimeReader.getDateArrayAttribute(String,String)
  • IHDF5DateTimeReader.getTimeStampArrayAttribute(String,String)
  • IHDF5DateTimeReader.getTimeDurationAttribute(String,String)
  • IHDF5DateTimeReader.getTimeDurationArrayAttribute(String,String)
  • IHDF5DateTimeReader.isTimeDuration(String,String)
  • IHDF5DateTimeReader.tryGetTimeUnit(String,String)
  • IHDF5DateTimeReader.getTimeDurationArrayNaturalBlocks(String)
  • IHDF5DateTimeReader.readTimeDurationArrayBlock(String,int,long)
  • IHDF5DateTimeReader.readTimeDurationArrayBlockWithOffset(String,int,long)
  • IHDF5DateTimeWriter.setDateAttribute(String,String,Date)
  • IHDF5DateTimeWriter.setTimeStampAttribute(String,String,long)
  • IHDF5DateTimeWriter.setDateArrayAttribute(String,String,Date[])
  • IHDF5DateTimeWriter.setTimeStampArrayAttribute(String,String,long[])
  • IHDF5DateTimeWriter.setTimeDurationAttribute(String,String,long,HDF5TimeUnit)
  • IHDF5DateTimeWriter.setTimeDurationAttribute(String,String,HDF5TimeDuration)
  • IHDF5DateTimeWriter.setTimeDurationArrayAttribute(String,String,HDF5TimeDurationArray)
  • IHDF5DateTimeWriter.writeTimeDuration(String,HDF5TimeDuration)
  • IHDF5DateTimeWriter.writeTimeDurationArray(String,HDF5TimeDurationArray)
  • IHDF5DateTimeWriter.writeTimeDurationArrayBlock(String,HDF5TimeDurationArray,long)
  • IHDF5DateTimeWriter.writeTimeDurationArrayBlockWithOffset(String,HDF5TimeDurationArray,int,long)
  • IHDF5CompoundInformationRetriever.getInferredCompoundType(String,List,List)
  • IHDF5CompoundInformationRetriever.getInferredCompoundType(List,List)
  • IHDF5CompoundInformationRetriever.getInferredCompoundType(String,String[],Object[])
  • IHDF5CompoundInformationRetriever.getInferredCompoundType(String[],Object[])
  • IHDF5CompoundReader.getCompoundArrayNaturalBlocks(String,Class)
  • IHDF5CompoundReader.getCompoundMDArrayNaturalBlocks(String,Class)
  • IHDF5CompoundReader.readCompound(String,Class)
  • IHDF5CompoundReader.readCompoundArray(String,Class)
  • IHDF5CompoundReader.readCompoundMDArray(String,Class)
  • IHDF5CompoundWriter.writeCompound(String,T)
  • IHDF5CompoundWriter.writeCompoundArray(String, T[])
  • IHDF5CompoundWriter.writeCompoundArray(String, T[], HDF5GenericStorageFeatures)
  • IHDF5CompoundWriter.writeCompoundMDArray(String, MDArray)
  • IHDF5CompoundWriter.writeCompoundMDArray(String, MDArray, HDF5GenericStorageFeatures)
  • HDF5CompoundMemberMapping.constructCompoundTypeName(Collection, boolean)
  • HDF5CompoundMemberMapping.inferMapping(Map)
  • HDF5CompoundMemberMapping.inferMapping(List, List)
  • HDF5CompoundMemberMapping.inferMapping(String[], Object[])
  • HDF5CompoundType.checkMappingComplete()
  • HDF5CompoundType.getCompoundMemberInformation()
  • HDF5CompoundType.getUnmappedCompoundMemberInformation()
  • HDF5CompoundType.getUnmappedCompoundMemberNames()
  • HDF5CompoundType.getUnmappedFieldNames()
  • HDF5CompoundType.isDiskRepresentationIncomplete()
  • HDF5CompoundType.isMappingIncomplete()
  • HDF5CompoundType.isMemoryRepresentationIncomplete()
  • IHDF5EnumReader.getDataSetEnumType(String)
Deprecated methods
  • HDF5TimeDuration.getDuration()
  • IHDF5DateTimeWriter.writeTimeDuration(String, long timeDuration)
  • IHDF5DateTimeWriter.writeTimeDurationArray(String, long[])
  • IHDF5DateTimeWriter.writeTimeDurationArray(String, long[], HDF5TimeUnit)
  • IHDF5DateTimeWriter.writeTimeDurationArray(String, HDF5TimeDuration[])
  • IHDF5DateTimeWriter.writeTimeDurationArray(String, long[], HDF5TimeUnit, HDF5IntStorageFeatures)
  • IHDF5DateTimeWriter.writeTimeDurationArray(String, HDF5TimeDuration[], HDF5IntStorageFeatures)
  • IHDF5DateTimeWriter.writeTimeDurationArrayBlock(String, long[] data, long, HDF5TimeUnit)
  • IHDF5DateTimeWriter.writeTimeDurationArrayBlockWithOffset(String, long[] data, int, long, HDF5TimeUnit)
  • IHDF5DateTimeWriter.writeTimeDurationArrayBlock(String, HDF5TimeDuration[] data, long)
  • IHDF5DateTimeWriter.writeTimeDurationArrayBlockWithOffset(String, HDF5TimeDuration[] data, int, long)
  • IHDF5DateTimeReader.readTimeDurationAndUnit(String)
  • IHDF5DateTimeReader.readTimeDuration(String, HDF5TimeUnit)
  • IHDF5DateTimeReader.readTimeDurationAndUnitArray(String)
  • IHDF5DateTimeReader.readTimeDurationArray(String, HDF5TimeUnit)
  • IHDF5DateTimeReader.readTimeDurationArrayBlock(String, int, long, HDF5TimeUnit)
  • IHDF5DateTimeReader.readTimeDurationArrayBlockWithOffset(String, int, long, HDF5TimeUnit)
  • IHDF5DateTimeReader.readTimeDurationAndUnitArrayBlock(String, int, long)
  • IHDF5DateTimeReader.readTimeDurationAndUnitArrayBlockWithOffset(String, int, long)
  • IHDF5DateTimeReader.getTimeDurationArrayNaturalBlocks(String, HDF5TimeUnit)
  • IHDF5DateTimeReader.getTimeDurationAndUnitArrayNaturalBlocks(String)
  • IHDF5EnumReader.getEnumTypeForObject(String)
  • HDF5DataSetRandomAccessFile.createForReading(File, String)
  • HDF5DataSetRandomAccessFile.create(File, String)
  • HDF5DataSetRandomAccessFile.create(File, String, int)
  • HDF5DataSetRandomAccessFile.createCompress(File, String)
  • HDF5DataSetRandomAccessFile.createCompress(File, String, int)
  • HDF5DataSetRandomAccessFile.createOpaque(File, String, String)
  • HDF5DataSetRandomAccessFile.createOpaque(File, String, String, int)
  • HDF5DataSetRandomAccessFile.createOpaqueCompress(File, String, String)
  • HDF5DataSetRandomAccessFile.createOpaqueCompress(File, String, String, int)
  • HDF5DataSetRandomAccessFile.createFullControl(File, String, StringOrNull, int size, HDF5GenericStorageFeatures)

JHDF5 11.05.2 (2011-05-13)

New features

  • Command CAT for extracting a file from an h5ar archive directly to stdout

Changes

  • Better help text for archiver

Bug fixes

  • NullPointerException in archiver on attempting to archive an empty directory

JHDF5 11.05.1 (2011-05-12)

Changes

  • Upgrade: HDF5 from 1.8.7-pre1 to 1.8.7
  • Add flag -mtune=corei7 to Linux builds
  • Update: doctitle of javadoc to 11.05
  • Add packages ch.systemsx.cisd.hdf5.io and ch.systemsx.cisd.hdf5.tools to javadoc
  • Change visibility of helper class HDF5ArchiveLinkChecker from public to package protected

Bug fixes

  • Return IHDF5WriterConfigurator rather than the concrete class from useUTF8CharacterEncoding()

JHDF5 11.05.0 (2011-05-09)

New features

  • Support for UTF8 strings
  • Support reading and writing HDF5 object references
  • Support reading and writing compounds with dynamic structure, using lists, arrays and maps (previously only custom POJOs were supported, which restricted compound handling to a static structure)
  • Allow explicitly specifying, when writing / creating data sets, that an existing data set should be overwritten even when the writer has been configured to keep existing data sets
  • Implement a reliable way to get the name of a committed data type, also for compound members and attributes
  • Add support for data type variants of compound members

API changes

Non-backward compatible changes

  • Rename HDF5IntStorageFeature.createDeflationKepp() to HDF5IntStorageFeature.createDeflationKeep() (fix typo)
  • IHDF5CompoundInformationRetriever.getCompoundDataSetInformation() and IHDF5CompoundInformationRetriever.getCompoundMemberInformation() return their information in the order of the compound members rather than alphabetically sorted by member name; sorting now has to be performed explicitly
  • Make IHDF5Reader.getStringArrayAttribute() throw an HDF5JavaException rather than an IllegalArgumentException on wrong data set type

Additions

  • HDF5DataClass.REFERENCE
  • ..._DELETE variants of HDF5XXXStorageFeatures to override this behavior
  • Add configuration option keepDataSetsIfTheyExist() to IHDF5WriterConfigurator
  • Add: opaque tag to HDF5DataTypeInformation
  • Add HDF5DataSetRandomAccessFile to access HDF5 data sets like random access files in Java code
  • Add method createFromGeneric() to HDF5IntStorageFeatures and HDF5FloatStorageFeatures that translates a generic storage feature into an integer or float storage feature
  • add HDF5DataTypeInformation.tryGetJavaType() to get the "canonical" Java data type for an HDF5 data type
  • Classes HDF5CompoundDataMap and HDF5CompoundDataList
  • Add methods HDF5DataType.tryGetName(), HDF5DataType.getName(), HDF5DataTypeInformation.tryGetName() and HDF5DataTypeInformation.tryGetDataTypePath()
  • Add methods for writing and reading enumeration array attributes
  • HDF5CompoundMemberMapping and CompoundElement can transport data type variants to allow setting a data type variant for a compound member
  • HDF5DataTypeInformation now has information about the data type variant and new methods tryGetTypeVariant(), isTimeStamp(), isTimeDuration(), tryGetTimeUnit()
  • New field HDF5DataTypeVariant.NONE and methods isTypeVariant(), maskNull(), unmaskNone(), isCompatible()
  • Chaining setters HDF5CompoundMemberMapping.length(), dimensions(), fieldName(), memberClass(), enumType() and typeVariant() to simplify the creation of compound member mappings and keep them from becoming unwieldy as more options are added (see the sketch after this list)
  • Add methods for reading and writing HDF5 object references, see IHDF5ReferenceWriter and IHDF5ReferenceReader
  • Methods IHDF5StringReader.getStringMDArrayAttribute() and IHDF5StringWriter.setStringMDArrayAttribute()
  • Add: class HDF5TimeDuration to represent a time duration including the unit and methods in IHDF5Reader and IHDF5Writer to read and write time durations using this class
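
A short sketch of the new chaining setters on HDF5CompoundMemberMapping: mapping() and the setter names appear in these notes, while the member names, the length value and the mapped field name are made up for illustration.

    import ch.systemsx.cisd.hdf5.HDF5CompoundMemberMapping;

    public class MappingSketch
    {
        public static void main(String[] args)
        {
            // One mapping per compound member, configured via the chaining setters
            // instead of the deprecated multi-argument mapping() variants.
            final HDF5CompoundMemberMapping name =
                    HDF5CompoundMemberMapping.mapping("name").length(32);
            final HDF5CompoundMemberMapping temperature =
                    HDF5CompoundMemberMapping.mapping("temperature").fieldName("temperatureInCelsius");
            System.out.println(name + " / " + temperature);
        }
    }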

Changes

  • For simple storage features, create a scalar string data type rather than an array of size 1
  • Switch h5ar to always use UTF8 character encoding
  • Implement finalize() method in HDF5Reader to allow implicit closing of resources
  • HDF5LibraryException.getMessage() now contains the lowest level HDF5 library error information from the error stack (which is usually the relevant one for debugging)
  • Deprecate HDF5CompoundMemberMapping.mapping() methods with more than one String arguments in favor of the new chaining setters
  • Deprecate IHDF5CompoundInformationRetriever.getCompoundDataSetInformation(String, boolean) as alphabetical ordering is easier done by an explicit sorting call
  • Refactor: simplify enum reading and writing logic
  • Refactor: HDF5Archiver for using it as components in other software
  • Refactor: improve code structure of compound byteifyer factories
  • Refactor reading and writing various primitive data types, avoiding some copying of arrays in memory where it is not really needed
  • Improve test coverage
  • Upgrade HDF5 to 1.8.7-pre1
  • Compile HDF5 and native layer on Linux with gcc 4.6.0, enabling additional optimizations that become available with the new gcc version
  • Compile HDF5 and native layer on Mac OS X with gcc 4.2.1
  • Rename jar file cisd-jhdf5-batteries_included_lin_win_mac_sun.jar to cisd-jhdf5-batteries_included_lin_win_mac_sol.jar

Bug fixes

  • Overwriting a string with a larger string stored a truncated version of the string if HDF5GenericStorageFeature.isKeepDataSetIfExists() was not set
  • IHDF5Writer.writeString() didn't use contiguous layout even if explicitly requested
  • Show CRC32 checksums when listing an h5ar archive even when Unix permissions are not available in the archive
  • Handling of symbolic links on the top level of an h5ar archive
  • Don't fail on extracting files from archives which do not have index information stored
  • Buffer overflow of native layer on 32bit platforms
  • IHDF5Reader.readAsByteArray() used to return, by mistake, an array of length 1 and value 0 for an empty data set
  • Set dataset fill time to H5D_FILL_TIME_ALLOC explicitly so that IHDF5Writer.setDataSetSize() produces properly nullified datasets rather than datasets filled with some random chunk at the end
  • Fix small memory leak in the native layer of IHDF5Reader.exists() and getObjectType()
  • Fix endianness issue with IHDF5Reader.getEnumAttributeAsString() on big endian platforms
  • Fix: IHDF5DataTimeWriter.writeTimeDurationArrayBlockWithOffset() and IHDF5DataTimeWriter.writeTimeDurationArrayBlock() performed the wrong conversion when the data set had a different time unit than the one requested in the write call
  • Fix: IHDF5DataTimeWriter.writeTimeDurationArrayBlockWithOffset() and IHDF5DataTimeWriter.writeTimeDurationArrayBlock() used to change the data block data as a side-effect of the conversion without properly documenting this in the Javadoc

JHDF5 10.06.0 (2010-06-19)

New features

  • annotation-based mapping from Java fields to compound members (see the sketch after this list)
  • support for more data types as compound member types:
    • HDF5EnumerationValueArray
    • primitive matrices (like float[][])
    • char[]
  • string arrays as attributes
  • multi-dimensional string arrays
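
A POJO using the new annotation-based mapping might look like the sketch below. The @CompoundType and @CompoundElement annotations and getInferredCompoundType() are the ones introduced in this release; the package location of the annotations, the field names and the pairing with getInferredCompoundType() are assumptions for illustration.

    import ch.systemsx.cisd.hdf5.CompoundElement;
    import ch.systemsx.cisd.hdf5.CompoundType;

    // Annotated fields are mapped to members of the HDF5 compound type; an instance of
    // this class can then be used, e.g., with IHDF5Reader.getInferredCompoundType().
    @CompoundType
    public class Measurement
    {
        @CompoundElement
        int channel;

        @CompoundElement
        double value;
    }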

API changes

Non-backward compatible changes

  • remove HDF5StorageLayout.VARIABLE_LENGTH, as variable length is independent of the storage layout; it is a property of the element length
  • rename method IHDF5Reader.getCompoundTypeForDataSet() to IHDF5Reader.getDataSetCompoundType()
  • throw an IllegalArgumentException when a compound member that is a primitive matrix has the wrong shape (used to be an IllegalStateException)

Additions

  • HDF5DataTypeInformation.isVariableLengthType()
  • annotations CompoundType and CompoundElement to support annotation-based mapping from Java fields to compound members
  • IHDF5Reader:
    • getStringArrayAttribute()
    • getStringMDArrayNaturalBlocks()
    • readStringMDArray()
    • getCompoundMDArrayNaturalBlocks()
    • getInferredCompoundType()
    • getDataSetCompoundType()
    • getNamedCompoundType()
  • IHDF5Writer:
    • createStringMDArray()
    • createStringVariableLengthMDArray()
    • setStringArrayAttribute()
    • writeStringMDArray()
    • writeStringVariableLengthArray() with support for setting the storage features
    • writeStringVariableLengthMDArray()

Changes

  • update native library to HDF5 1.8.5
  • refactor code for smaller classes (put readers and writers for specific data types in classes of their own)
  • variable-length string arrays now use "" instead of null to indicate an uninitialized entry (like fixed-length strings)
  • variable-length string data sets used to hide dimension, maxDimension and storageLayout in the DataSetInformation and to report an element size of 1; now they correctly report dimension, maxDimension and storageLayout and report an element size of -1 (to indicate that the element size is variable)

Bug fixes

  • a possible SIGSEGV on reading a string array
  • uninitialized entries in fixed-length arrays contained random garbage
  • use the correct element size when deciding whether a data set should be chunked or not (used to always assume element size 1)
  • when setting a string attribute and the attribute already existed with another (non-String) type, the assignment failed as the type mismatch was not detected

JHDF5 10.01.1 (2010-03-20)

Bug fixes

  • upgrade the native HDF5 library to 1.8.4-patch1 for the Solaris Sparc platform; this patch fixes a file corruption issue on big-endian platforms
  • make primitive reader / writer interfaces public to allow using the library from Scala
  • small fixes in the javadoc

Changes

  • better detection of the location of the library in h5ar.bat

JHDF5 10.01.0 (2010-01-27)

API changes

Non-backward compatible changes

  • IHDF5Writer: rename the HDF5XXXCompression classes to HDF5XXXStorageFeatures as they now have additional capabilities for setting the storage layout and requesting that a data set not be re-created when writing to it
  • IHDF5Writer: remove the writeXXXCompact() methods as the storage layout is now set by specifying the appropriate HDF5XXXStorageFeatures constant

Additions

  • IHDF5Reader: getXXXMatrixAttribute() and getXXXMDArrayAttribute() for reading multi-dimensional attributes
  • IHDF5Reader: readEnumArrayBlock(), readEnumArrayBlockWithOffset() and getEnumArrayNaturalBlocks() for block-wise reading of enum arrays
  • IHDF5Reader: readStringArrayBlock(), readStringArrayBlockWithOffset() and getStringArrayNaturalBlocks() for block-wise reading of string arrays
  • IHDF5Reader: copy() and copyAll() methods for copying objects within an HDF5 file
  • IHDF5Writer: setXXXMatrixAttribute() and setXXXMDArrayAttribute() for writing multi-dimensional attributes
  • IHDF5Writer: createEnumArray() methods for creating an enum array without writing to it
  • IHDF5Writer: writeEnumArrayBlock() and writeEnumArrayBlockWithOffset() for block-wise writing of enum arrays
  • IHDF5Writer: createStringArray() and createStringVariableLengthArray() methods for creating a string array without writing to it
  • IHDF5Writer: writeStringArrayBlock() and writeStringArrayBlockWithOffset() for block-wise writing of string arrays
  • IHDF5Writer: writeStringVariableLengthArray() to write an array of variable-length strings
  • IHDF5Writer: setStringAttributeVariableLength() to write a variable-length string attribute
  • IHDF5Writer: setDataSetSize() and setDataSetDimensions() for explicitly changing the dimensions of an extendable data set
  • IHDF5Writer: move() for moving objects within an HDF5 file
  • IHDF5Writer: convenience methods createXXXArray() for creating one- and multi-dimensional arrays with only one size parameter, which will be interpreted as the total size for non-extendable data sets and as the block size for extendable data sets.

New features

  • support for multi-dimensional arrays in compound data types
  • support transparently reading data sets which are array types
  • support transparently reading attributes which are stored in data spaces rather than array types
  • support explicitly setting the storage layout by HDF5XXXStorageFeatures constants
  • allow requesting that a data set not be re-created on writing via HDF5XXXStorageFeatures constants

Changes

  • by default, delete an existing data set when a writeXXX() method is called (this can be changed by using the appropriate HDF5XXXStorageFeatures constant)
  • recompile native platforms to enable improved error reporting
  • update native library to HDF5 1.8.4

Bug fixes

  • in 9.04, the writeXXXCompact() methods didn't create compact data sets but chunked data sets; the equivalent way to achieve the same goal in 10.01, the appropriate HDF5XXXStorageFeatures constants, works correctly
  • in 9.04, a new-style group (which breaks 1.6 compatibility) could be created even if a strict 1.6 compatibility mode was requested; in 10.01, a call to create a new-style group will fail with an exception in this case

JHDF5 9.04.2 (2009-05-04)

Bug fixes

  • the fix of 9.04.1 for data sets not being extended turned out to be incomplete; now also compound, opaque byte arrays, time durations and time stamps are extended as expected

JHDF5 9.04.1 (2009-04-27)

Bug fixes

  • writeXXXBlockYYY() didn't extend data sets as needed
  • change return values of IHDF5ReaderConfigurator and IHDF5WriterConfigurator to return the interface rather than the implementation (as the implementation classes are only package visible)

JHDF5 9.04.0 (2009-04-06)

The API changes from 8.10 to 9.04 have been quite extensive. This turned out to be necessary as 8.10 was the first release of JHDF5 and we found room for improvement with the API. Future versions of the library will have a higher level of API backward compatibility.

API Changes

Non-backward compatible changes

  • move: MDXXXArray from package ch.systemsx.cisd.common.array to ch.systemsx.cisd.base.mdarray
  • add: interfaces IHDF5Reader and IHDF5Writer for hiding the concrete classes HDF5Reader and HDF5Writer
  • add: interfaces IHDF5ReaderConfigurator and IHDF5WriterConfigurator to distinguish reader / writer from the configuration object
  • change: rename HDF5SimpleReader to IHDF5SimpleReader and HDF5SimpleWriter to IHDF5SimpleWriter.
  • change: no constructors for readers, writers or configurators are publicly visible anymore, all creation has to be done by using factory methods of HDF5Factory
  • change: replace the boolean value deflate in the API by an HDF5XXXCompression object to support scaling compression and specifying the deflate level
  • change: replace HDF5WriterConfigurator.useLatestFileFormat() by HDF5WriterConfigurator.fileFormat(FileFormat) to distinguish between enforcing HDF5 1.8 file format and allowing HDF5 1.8 features.

Additions

  • add: method HDF5WriterConfigurator.syncMode() to support file syncing with the underlying storage
  • change: switch off numeric conversions by default (as they don't work on all platforms)
  • add: method HDF5Reader.performNumericConversions() to enable them on platforms that support it
  • add: methods for reading data sets and data set blocks into already existing MDXXXArray objects: HDF5Reader.readToXXXMDArrayWithOffset() and HDF5Reader.readToXXXMDArrayBlockWithOffset()
  • add: methods for iterating over "natural" data blocks (based on data set chunks)
  • add: methods HDF5Writer.writeMDXXXArrayBlockWithOffset() that allow picking a block in memory and writing it to (another) block on disk
  • support primitive arrays of fixed length in compound data types
  • support primitive arrays as attribute values
  • add: class HDF5TimeUnit
  • add support for time durations (methods HDF5Reader.readTimeDuration()/HDF5Writer.writeTimeDuration() and friends)
  • add: support for explicitly setting and getting type variants on data sets
  • add: class HDF5ObjectInformation, method IHDF5Reader.getObjectInformation() and methods for determining the object type that allow deciding whether symbolic links should be followed or not (e.g. IHDF5Reader.exists(objectPath, followLinks))
  • add: constructors on classes MDXXXArray from matrices (e.g. float[][])
  • add: method MDXXXArray.toMatrix() to get a matrix back
  • add: method HDF5LinkInformation.getParentPath()
  • add: method HDF5LinkInformation.getName()

Error handling / reporting

  • ensure HDF5 files are flushed and closed correctly even when the application does not close an HDF5 file (e.g. because an uncaught exception occurred)
  • check for non-existing directory before creating a new HDF5 file in order to avoid bad error message
  • check whether file is still open and give appropriate error message if not
  • introduce upper bound of 16386 characters for path and attribute names in HDF5 containers as we have experienced segfaults on excessively long path names without enforcing an upper bound

Bug fixes

  • fix enumerations with more than 65535 elements
  • endianness bugs on big-endian platforms
  • ensure HDF5Writer.writeStringVariableLength() can be called repeatedly on the same data set, i.e. that variable-length string data sets can be overwritten

Performance

  • great improvement of performance on reading large compound data set arrays
  • smaller improvement on reading / writing data sets

Storage Form

  • do not use dimension mangling for empty data sets any more, use chunked data sets with actual dimension of 0 instead
  • by default, allow HDF5 1.8 features (old default was to enforce HDF5 1.6 format)

Supported Platforms

  • support for Windows x64 (Windows XP, Windows Vista and Windows Server 2008, 2003 and 2000)
  • support for Sun Solaris 10 on the SPARC platform (32bit and 64bit)

HDF5Archiver

  • deal correctly with directories that can't be read, e.g. due to permission problems
  • create index for storing file owner and permissions and for speeding up the listing
  • remove option for ignoring empty directories
  • Add option for switching between recursive / non-recursive listing
  • slightly decrease the size of groups in HDF5 files (old-style) by tweaking a constant
  • better error messages in some corner cases
  • change default file extension from .h5 to .h5ar
  • replace option --latest-file-format with --file-format N
  • add CRC32 checksum calculation and checking to be able to detect errors

Other Changes

  • upgrade to HDF5 1.8.2
  • refactor layering structure for simplicity and performance

JHDF5 8.10 (2008-10-30)

Initial Release.
