Release 19.06.5

Release 19.06.4

Release 19.06.3


Failing to follow this procedure has no long-term repercussions, but it will lead to errors because the master data upgrades fail. These upgrades will then be performed the next time openBIS is restarted.

Release 19.06.1

Release 19.06.0

Version S276

Version S254

Version S241

Version S236

Version S232

Version S231

Version S228

Version S227

Version S224

Version S222

Version S221

Version S216

Version S214

Version S207

Version S177

Version S175

Version S164

Version S148

Version S146

Due to the reorganisation of the DSS API, some plugins may require an update.

Version S145


Version S141

Version S140

Version S139

Version S138

Version S137

Version S136

No pending configuration changes for this version.

Version S135

Version S134

Version S133

Version S132

Version S131

Version S130

Version S129

Version S128

Version S127

Version S126

Version S125

// link to sample registration page without filtering of available sample types
<a href="" onclick="window.location.hash='#action=SAMPLE_REGISTRATION&timestamp=' + (new Date().getTime()); return false;">Sample registration WITHOUT FILTERING</a>  
// link to sample registration page with filtering of available sample types to ones that match BIO.* regexp
<a href="" onclick="window.location.hash='#action=SAMPLE_REGISTRATION&sampleTypePattern=BIO.%2A&timestamp=' + (new Date().getTime()); return false;">Sample registration WITH FILTERING</a>

Please note that, apart from simple links leading to the registration page, it is also possible to create more complex links that define which sample types are available for selection on the registration page (in the "Sample Type" field). To create such a link, add a "sampleTypePattern" parameter whose value is a Java regular expression. The expression defines which sample type codes are accepted and displayed. Note that most of the special characters used in Java regular expressions are also reserved characters in URLs and therefore have to be encoded. For example, the regular expression BIO.* becomes BIO.%2A after encoding (for more information about this encoding see:  
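The encoding step and the filtering effect can be sketched in Python (the pattern is taken from the example above; the sample type codes below are illustrative assumptions, not part of openBIS):

```python
import re
from urllib.parse import quote

pattern = "BIO.*"                  # the Java regular expression
encoded = quote(pattern, safe="")  # percent-encode URL-reserved characters
print(encoded)                     # BIO.%2A

# Server-side filtering is equivalent to matching sample type codes
# against the pattern (the codes below are made-up examples):
codes = ["BIO_CELL", "BIO_TISSUE", "CHEM_COMPOUND"]
accepted = [c for c in codes if re.fullmatch(pattern, c)]
print(accepted)                    # ['BIO_CELL', 'BIO_TISSUE']
```

The encoded value is what goes after `sampleTypePattern=` in the link's hash fragment.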

Version S124

Version S123

plugin-services = screening-dss-api-json-exporter-servlet
screening-dss-api-json-exporter-servlet.class = ch.systemsx.cisd.openbis.dss.generic.server.DssScreeningApiJsonServlet
screening-dss-api-json-exporter-servlet.path = /rmi-datastore-server-screening-api-v1.json/*

Version S122

Version S121


Version S120 (22 November 2011)


welcome page file:


code to add:

<a href="resources/applications/sampleExplorer/sampleExplorer.html">Browse samples</a>
DSU, BaSynthec

To enable HDF5 compression as a post-registration task, make the following changes in the datastore_server/etc/

Version S119 (10 November 2011)


Version S118 (26 October 2011)


Version S117 (13 October 2011)


Version S116 (29 September 2011)


Version S113 (18 August 2011)

Application Server

This adds 'STARTING SERVER' and 'SERVER STARTED' entries to jetty.log, which can be used for controlling the startup procedure:

STARTING SERVER: org.eclipse.jetty.server.Server@58f39b3a
2011-10-20 11:33:06,674 INFO  [main] STATUS.CISDContextLoaderListener - Application: openbis
2011-10-20 11:33:06,676 INFO  [main] STATUS.CISDContextLoaderListener - Version: S117.2 (r23329)
2011-10-20 11:33:06,676 INFO  [main] STATUS.CISDContextLoaderListener - Java VM: Java HotSpot(TM) 64-Bit Server VM (v19.1-b02)
2011-10-20 11:33:06,676 INFO  [main] STATUS.CISDContextLoaderListener - CPU Architecture: amd64
2011-10-20 11:33:06,677 INFO  [main] STATUS.CISDContextLoaderListener - OS: Linux (v2.6.32-131.6.1.el6.x86_64)
SERVER STARTED: org.eclipse.jetty.server.Server@58f39b3a
Data Store Server

Version S110 (7 July 2011)


Version S109 (23 June 2011)


Version S107 (24 May 2011)

Application Server
Data Store Server

Version S106 (11 May 2011)

Application Server

Version S105 (27 April 2011)

Data Store Server

Version S104 (13 April 2011)

Data Store Server

Version S103 (30 March 2011)

Data Store Server

Version S102 (16 March 2011)

Application Server
Data Store Server

Version S101 (2 March 2011)

Application Server
Data Store Server



Data Store Server






AS (optional)



DSS - Deep Sequencing

Add this to be able to use the SOFT Export function:

processing-plugins = to-SOFT
to-SOFT.label = Flow Lane to SOFT Exporter
to-SOFT.dataset-types = SRF_PER_LANE
to-SOFT.class = ch.ethz.bsse.cisd.dsu.dss.plugins.DataSetToSOFT



To activate browsing of microscopy data, you have to:


DSS - PhosphonetX

Change the lines

inputs = raw-data, prot-ident, prot_quant
... = ch.systemsx.cisd.openbis.etlserver.phosphonetx.DataSetInfoExtractorForSearchExperiment

to

inputs = raw-data, prot-ident, prot-ident-quantification, prot_quant
... = ch.systemsx.cisd.openbis.etlserver.phosphonetx.DataSetInfoExtractorForProteinResults

and add

prot-ident-quantification. = prot-ident.
prot-ident-quantification.incoming-dir = /dssfs/openbis/drop-box_prot_ident-quantification = MS_QUANTIFICATION =

Note, folder /dssfs/openbis/drop-box_prot_ident-quantification has to be created before DSS is started up.

DSS - Screening

FeatureVectorStorageProcessor has a new optional property: columns-to-be-ignored. It is a comma-separated list of columns in the CSV file that will be ignored and not handled as features. The default value is 'barcode'. Columns not listed in this property and not specifying the well are handled as (non-numerical) features.
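The column classification can be sketched as follows (the well-column names and the CSV header are illustrative assumptions; the actual processor is implemented in Java inside openBIS):

```python
def classify_columns(header, columns_to_be_ignored=("barcode",),
                     well_columns=("WellName", "Row", "Col")):
    """Split CSV header columns the way the text above describes:
    ignored columns are skipped, well columns locate the well, and
    everything else is handled as a feature."""
    ignored = {c.lower() for c in columns_to_be_ignored}
    wells = {c.lower() for c in well_columns}
    return [c for c in header
            if c.lower() not in ignored and c.lower() not in wells]

print(classify_columns(["barcode", "WellName", "Intensity", "Comment"]))
# ['Intensity', 'Comment']
```

With a custom columns-to-be-ignored value, the first argument of the set difference would change accordingly.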


DSS - Screening (optional)

To store the images of a whole plate compressed in one file, update your image dropbox configuration:

# How should the original data be stored? Possible values:
#   unchanged       - nothing is changed, the default
#   hdf5            - all the data will be packaged into one hdf5 file
#   hdf5_compressed - like hdf5, but each file is stored in a compressed form
<your-images-dropbox-name>.storage-processor.original-data-storage-format = hdf5_compressed

This considerably reduces both the number of files stored in the file system and the required storage space. The files are stored in HDF5 format.


AS - all instances

Add the following to the

# Maximum number of search results = 100000

DSS - Screening (optional)

The ch.systemsx.cisd.openbis.dss.etl.PlateStorageProcessor can compress thumbnails. To turn this on, add:

compress-thumbnails = true


AS - all instances

AS - PhosphoNetX

DSS - all instances (optional)

The new reporting plugin that mimics the behavior of the smart view is ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.GenericDssLinkReportingPlugin. It takes one required parameter, download-url, and two optional parameters, data-set-regex and data-set-path.

Here is an example configuration:

hcs-viewer.label = HCS View
hcs-viewer.dataset-types = HCS_IMAGE
hcs-viewer.class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.GenericDssLinkReportingPlugin
hcs-viewer.download-url = ${download-url}
hcs-viewer.data-set-regex = .*/PNG/.*\.jpg
hcs-viewer.data-set-path = original


AS - all instances (optional)

Customize the text shown in the initial tab/page in both simple and application view modes by overriding welcomePage.html and welcomePageSimple.html (just as we do with loginHeader.html).

AS - all instances with query databases

Check whether the source definitions of the query databases set <database>.creator-minimal-role. If not, the upgrade will use the default, which changes with S90 from SPACE_POWER_USER to INSTANCE_OBSERVER. Check with the owner of the instance whether this is intended. To keep the old behavior, add:

<database>.creator-minimal-role = SPACE_POWER_USER

Screening AS

To hide unnecessary sections and make the UI simpler, add the line:

web-client-configuration-file = etc/


AS - all instances


After installation and startup of the new openBIS server:


To update, perform the following steps:

  1. Before the servers are updated, the following deletion has to be done in the openBIS GUI:
  2. DSS: replace the following lines

    maintenance-plugins = data-set-clean-up
    post-registration-upload.interval = 1440000

    with

    maintenance-plugins = data-set-clean-up, post-registration-upload
    post-registration-upload.execute-only-once = true

  3. DSS datastore_server.conf: Increase the maximum Java heap size to 1 GB.
  4. Update DSS and AS as usual, but don't start them!
  5. Drop the database basysbio_productive.
  6. Start AS and DSS.
  7. Watch the DSS log for the upload of existing data sets into the new basysbio_productive.


Screening DSS

If you have a server running version S83 or older, you have to migrate to version S87 before upgrading the server to S88: since version S88, BDS image data set migration is no longer supported.


internal CIFEX:

On all servers that run CIFEX internally, the JETTY_STOP_PORT of CIFEX has to differ from the one used by the jetty instance running openBIS. Otherwise, when the installation script is run, it will also shut down CIFEX (which we usually do not want).

Change the port in ~openbis/cifex/jetty/bin/






Plasmids DSS:


# The extractor class to use for type extraction
main-thread.type-extractor = ch.ethz.bsse.cisd.plasmid.dss.PlasmidTypeExtractor


# The extractor class to use for type extraction
main-thread.type-extractor = ch.ethz.bsse.cisd.plasmid.dss.PlasmidTypeExtractor
main-thread.type-extractor.dataset-types = SEQ_FILE: gb fasta xdna, RAW_DATA: ab1
main-thread.type-extractor.file-types = GB: gb, FASTA: fasta, XDNA: xdna, AB1: ab1
main-thread.type-extractor.default-file-type = PROPRIETARY = DIRECTORY

YeastX DSS:


# comma separated list of pairs: file-extension file-type
# It is assumed that for each file extension a dataset type with the same name is defined in openBIS.
# The corresponding file types have to be defined in openBIS as well.
# Files with unspecified extensions will have the file type and dataset type UNKNOWN in openBIS.
main-thread.type-extractor.file-types = pdf pdf, mat matlab, zip archive, eicml xml, fiaml xml, mzxml xml


# comma separated list of mappings from type to extensions, e.g.:
# file-type1: file-extension1 file-extension2, file-type2: file-extension3"
# It is assumed that for each file extension a dataset type with the same name is defined in openBIS.
# The corresponding file types have to be defined in openBIS as well.
# Files with unspecified extensions will have the file type and dataset type UNKNOWN in openBIS.
main-thread.type-extractor.file-types = PDF: pdf, MATLAB: mat, ARCHIVE: zip, XML: eicml fiaml mzxml
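The new mapping syntax ("file-type1: file-extension1 file-extension2, file-type2: file-extension3") can be parsed as sketched below; this is a plain Python illustration of the format, not openBIS code:

```python
def parse_file_types(spec: str) -> dict:
    """Parse 'TYPE1: ext1 ext2, TYPE2: ext3' into {extension: file type}."""
    mapping = {}
    for entry in spec.split(","):
        file_type, _, extensions = entry.strip().partition(":")
        for ext in extensions.split():
            mapping[ext] = file_type.strip()
    return mapping

spec = "PDF: pdf, MATLAB: mat, ARCHIVE: zip, XML: eicml fiaml mzxml"
print(parse_file_types(spec))
# {'pdf': 'PDF', 'mat': 'MATLAB', 'zip': 'ARCHIVE',
#  'eicml': 'XML', 'fiaml': 'XML', 'mzxml': 'XML'}
```

Note how one file type (XML) can now cover several extensions, which the old "extension type" pair format could not express as compactly.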



Screening AS:

Screening DSS:




maintenance-plugins = hierarchical-storage-updater, data-set-clean-up, migrator

# Removes data sets deleted from openBIS also from imaging database
data-set-clean-up.class = ch.systemsx.cisd.etlserver.plugins.DataSetDeletionMaintenanceTask
data-set-clean-up.interval = 300 = imaging-db

migrator.class = ch.systemsx.cisd.etlserver.plugins.ChainedDataSetMigrationTask
migrator.execute-only-once = true
migrator.storeRoot = ${storeroot-dir}
migrator.migrators = bds-image-db, bds-original-relocator, bds-remover
migrator.bds-image-db.class = ch.systemsx.cisd.openbis.dss.etl.bdsmigration.BDSImagingDatabaseMigrator = imaging-db = dapi, gfp
#migrator.bds-image-db.extract-single-image-channels = BLUE, GREEN
migrator.bds-original-relocator.class = ch.systemsx.cisd.openbis.dss.etl.bdsmigration.BDSOriginalDataRelocatorMigrator
migrator.bds-remover.class = ch.systemsx.cisd.openbis.dss.etl.bdsmigration.BDSDataRemoverMigrator




DSS YeastX

The service properties in the config directory of the yeastx server have already been updated: the high-water-mark value for the archiver has been set, the commented-out configuration for the automatic archiver has been added, and the maintenance plugins section has been moved to the end of the file.

# ---------------------------------------------------------------------------
# maintenance plugins configuration
# ---------------------------------------------------------------------------

# size of the disc free space in KB which must be available to unarchive one dataset
dataset-unarchiving-highwater-mark = 2000

# Comma separated names of maintenance plugins.
# Each plugin should have configuration properties prefixed with its name.
# Mandatory properties for each <plugin> include:
#   <plugin>.class - Fully qualified plugin class name
#   <plugin>.interval - The time between plugin executions (in seconds)
# Optional properties for each <plugin> include:
#   <plugin>.start - Time of the first execution (HH:mm)
maintenance-plugins=dataset-deletion-synchronizer, auto-archiver

# Maintenance task deleting from metabol database data sets which have been deleted from openbis
dataset-deletion-synchronizer.class = ch.systemsx.cisd.yeastx.etl.MetabolDatabaseUpdater
# how often the synchronization should happen in seconds: every day
dataset-deletion-synchronizer.interval = 86400
dataset-deletion-synchronizer.start = 22:30 = metabol-db
dataset-deletion-synchronizer.scriptFolder = sql

# Performs automatic archiving of 'ACTIVE' data sets based on their properties
#auto-archiver.class = ch.systemsx.cisd.etlserver.plugins.AutoArchiverTask
# The time between subsequent archiving runs (in seconds)
#  auto-archiver.interval = 86400
# size of the disc free space in KB which must be available to unarchive one dataset
#  auto-archiver.dataset-unarchiving-highwater-mark = ${dataset-unarchiving-highwater-mark}
# Time of the first execution (HH:mm)
#  auto-archiver.start =
# following properties are optional
# only data sets of specified type will be archived
# =
# only data sets that are older than specified number of days will be archived (default = 0)
#  auto-archiver.older-than =
# fully qualified class name of a policy that additionally filters data sets to be archived
#  auto-archiver.policy.class =

# --- ARCHIVER ------------------------------------------------------------------------

archiver.class = ch.systemsx.cisd.yeastx.etl.MLArchiverTask
archiver.unique-sample-name-property-code = ${sample-name-property-code}
archiver.unique-experiment-name-property-code = ${experiment-name-property-code} = metabol-db
# size of the disc free space in KB which must be available to unarchive one dataset
archiver.dataset-unarchiving-highwater-mark = ${dataset-unarchiving-highwater-mark}
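The high-water-mark check amounts to comparing the free disk space against the configured threshold in KB. A minimal sketch (the 2000 KB default is taken from the example above; the helper itself is illustrative, not openBIS code):

```python
import shutil

def can_unarchive(store_path: str, highwater_mark_kb: int = 2000) -> bool:
    """True if free space at store_path exceeds the high water mark (in KB)."""
    free_kb = shutil.disk_usage(store_path).free // 1024
    return free_kb > highwater_mark_kb
```

Unarchiving one data set is only attempted when this check passes for the store directory.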



Change the configuration of the Query modules in BaSysBio (both test and productive) and PhosphoNetX.

Was:

query-database.label = <label>
query-database.databaseEngineCode = postgresql
query-database.basicDatabaseName = <basicDatabaseName>
query-database.databaseKind = <databaseKind>
query-database.owner = <user>
query-database.password = <password>


Should be (choose any text as the key <db_key>, e.g. main or 1):

# Database Configurations for Query module
query-databases = <db_key>

<db_key>.label = <label>
<db_key>.database-driver = org.postgresql.Driver
<db_key>.database-url = jdbc:postgresql://localhost/<basicDatabaseName>_<databaseKind>
<db_key>.database-username = <user>
<db_key>.database-password = <password>
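The database-url follows the pattern jdbc:postgresql://localhost/<basicDatabaseName>_<databaseKind>. A small sketch of the convention, using the metabol/productive values that appear later in these notes (the helper function is illustrative):

```python
def jdbc_url(basic_database_name: str, database_kind: str,
             host: str = "localhost") -> str:
    """Build the PostgreSQL JDBC URL used in <db_key>.database-url."""
    return f"jdbc:postgresql://{host}/{basic_database_name}_{database_kind}"

print(jdbc_url("metabol", "productive"))
# jdbc:postgresql://localhost/metabol_productive
```

The same name_kind convention is used by the openBIS database upgrade machinery, so the two halves must match the old basicDatabaseName/databaseKind settings exactly.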


The server-url for connecting to openBIS no longer requires the "/openbis/openbis" suffix. Just specify the protocol, hostname, and port; the rest is discovered automatically.

Was:

server-url = https://localhost:8443/openbis/openbis

Should be:

server-url = https://localhost:8443






Add (after SMTP properties):

data-sources = metabol-db
metabol-db.version-holder-class = ch.systemsx.cisd.yeastx.db.MetabolDatabaseVersionHolder
metabol-db.databaseEngineCode = postgresql
metabol-db.basicDatabaseName = metabol
metabol-db.databaseKind = productive
metabol-db.readOnlyGroup = metabol_readonly
metabol-db.readWriteGroup = metabol_readwrite
metabol-db.scriptFolder = sql


yeastx-databaseEngineCode = postgresql
yeastx-basicDatabaseName = metabol
yeastx-databaseKind = productive
yeastx-readOnlyGroup = metabol_readonly
yeastx-readWriteGroup = metabol_readwrite
yeastx-scriptFolder = sql

Change all plugins accessing the metabol database (look for ${yeastx-) to use the 'data source', for example:

Was: = ${yeastx-databaseEngineCode} = ${yeastx-basicDatabaseName} = ${yeastx-databaseKind}

Should be: = metabol-db

Add archiver:

archiver.class = ch.systemsx.cisd.yeastx.etl.MLArchiverTask
archiver.unique-sample-name-property-code = ${sample-name-property-code}
archiver.unique-experiment-name-property-code = ${experiment-name-property-code} = metabol-db


BaSysBio (Productive and Test)



BaSysBio (Productive)