Pending Configuration Changes

Version 13.04.9 (? 2014)

Version 13.04.7 (9 January 2014)

Version 13.04.5 (8 October 2013)

Version S164

  • Debug logging of database connections, service calls, and long-running invocations requires new log4j configuration files. When upgrading an existing openBIS installation with the installer, these files are not updated automatically. For the new functionality to work, please replace openBIS-server/jetty/etc/log.xml and datastore_server/etc/log.xml with the new versions of log.xml.

Version S148

  • If more than one DSS is in use, a global cache (for files retrieved from another DSS) should be configured in the DSS via the property cache-workspace-folder.

Version S146

Due to a reorganisation of the DSS API, some plugins may require an update.

  • The classes representing openBIS entities that are available in dropboxes / ingestion services have been split into v1 and v2; scripts and classes that explicitly name those classes by package should be updated. If a script fails because a class cannot be found, search our documentation for the class with the same name but a different package (usually containing v2).
  • The return type of the methods IExperimentAdaptor.samples() and IExperimentAdaptor.dataSets() has been changed from List to Iterable. This means Python (validation) scripts need to be changed if they call any List-specific operations (value[index], value.size(), etc.) on these values.
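The List-to-Iterable change above can be handled by writing scripts that only iterate over the values. A minimal sketch (plain Python standing in for the Jython validation script; count_and_first is an illustrative helper, not part of the openBIS API):

```python
def count_and_first(items):
    """Count elements and grab the first one using only iteration,
    so the code works for both List and Iterable return values."""
    count, first = 0, None
    for item in items:
        if count == 0:
            first = item
        count += 1
    return count, first

# Instead of samples.size() or samples[0], which fail on an Iterable:
count, first = count_and_first(iter(['S1', 'S2', 'S3']))
```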

Version S145


  • Managed Properties: the function showRawValueInForms is no longer used by the managed properties framework. To get the same behavior, mark the checkbox Show Raw Value in Forms on the property assignment: a marked checkbox corresponds to True returned by the old function.

Version S141

  • AS and DSS: The properties enabled-technologies and disabled-core-plugins have been split off from etc/ into a single separate file, and enabled-technologies has been renamed to enabled-modules. The new file is read by AS and DSS at start-up and is located in the folder core-plugins. When upgrading, this has to be done manually for both AS and DSS:
    1. Copy and merge the properties enabled-technologies and disabled-core-plugins from datastore_server/etc/ and openBIS-server/jetty/etc/ into core-plugins/ and rename enabled-technologies to enabled-modules.
    2. Remove enabled-technologies and disabled-core-plugins from both original files.
    If the upgrade is done by the openBIS installer for standard technologies (i.e. screening, proteomics, and illumina-ngs), the installer might have done the first step already, but the removal step is necessary in any case.
  • Database configuration:
    • AS: It is no longer needed to specify database.engine because postgresql is the default value and currently the only database engine supported.
    • AS: The syntax of the property database.url-host-part has changed to <hostname> or <hostname>:<port>. The default value is localhost. Previously a leading '//' and a trailing '/' were required.
    • AS and DSS for screening:
      • The default name of the imaging database has been changed to imaging_prod. The second part of the name (i.e. prod for the default name) can be defined by the property imaging-database.kind in the DSS configuration.
      • The optional property screening-database-kind in the DSS configuration has been renamed to imaging-database.kind.

Version S140

Version S139

  • AS: In jetty.xml the web app provider org.eclipse.jetty.deploy.providers.WebAppProvider should be replaced by ch.systemsx.cisd.openbis.generic.server.util.OpenbisWebAppProvider. The corresponding section in jetty.xml should then read as follows:

              <Call name="addAppProvider">
                  <New class="ch.systemsx.cisd.openbis.generic.server.util.OpenbisWebAppProvider">
                    <Set name="monitoredDir"><Property name="jetty.home" default="." />/webapps</Set>
                    <Set name="scanInterval">0</Set>
                    <Set name="extractWars">true</Set>
                  </New>
              </Call>

    As a result, the URL for webapps changed from https://my-openbis.domain/<web app name> to https://my-openbis.domain/openbis/webapp/<web app name>.


Version S138

  • Screening:
    • Update the configuration file: add the new type to the line with dataset-types.

    • If you are using the well image analysis graphs, you also have to update the appropriate plugin in the DSS:

      well-image-analysis-graph.dataset-types = HCS_ANALYSIS_WELL_FEATURES, HCS_ANALYSIS_WELL_FEATURES_CONTAINER

Version S137

  • DSS: The property ftp.server.port has been renamed to ftp.server.ftp-port and is now mandatory in order to start the built-in FTP server. The old property name ftp.server.port is still evaluated for backward compatibility. This change makes it possible to switch the built-in FTP server and the built-in SFTP server on and off separately. If you relied on the default value of this property to start the FTP server, you need to add the line

    ftp.server.ftp-port = 2121

    to your DSS configuration file.

Version S136

No pending configuration changes for this version.

Version S135

  • AS: Entries starting with dss-based-data-source-provider. in the AS configuration define data sources that are also used by the DSS. Such data sources can be configured by the new core plugin type dss-data-sources. The old definitions should be removed if the actual DSS code is DSS1; otherwise they should stay. If the database names are not the default ones (i.e. imaging_productive for screening and proteomics_productive for proteomics), the properties screening-database-kind and/or proteomics-basic-database-name and proteomics-database-kind have to be defined. Note that in screening the property data-source-provider can be deleted in any case.
  • DSS: etc/datastore_server.conf has gained JAVA_MEM_OPTS. Instances that define memory settings in JAVA_OPTS should be changed to use JAVA_MEM_OPTS instead.
  • Both AS and DSS have been switched to the Concurrent Mark Sweep garbage collector (options -XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled). It is recommended to add these options also when upgrading.
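For example, the relevant part of etc/datastore_server.conf could then look like the sketch below (the memory values are illustrative, and appending the GC options to JAVA_OPTS is an assumption about the local setup):

```
JAVA_MEM_OPTS="-Xmx1024m -Xms256m"
JAVA_OPTS="${JAVA_OPTS} -XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled"
```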

Version S134

Version S133

  • Before upgrading an existing openBIS instance (i.e. AS and DSS) with the installer of openBIS for Standard Technologies, the following changes are needed:
    • AS configuration: Remove the property disabled-technologies and add the property enabled-technologies with a comma-separated list of enabled technologies. Example for an openBIS Screening instance:
      enabled-technologies = screening
    • DSS configuration: A change is only needed if the DSS does not use core plugins. This is the case for versions before S126 or if all core plugins are disabled by the property disabled-core-plugins. In this case the DSS is already completely configured, and all core plugins need to be turned off because they might cause conflicts. Normally the installer looks at the enabled-technologies setting in the AS to determine which core plugins to turn on and off, but this mechanism will not work here because the openBIS installation needs the technology to be enabled; it just doesn't need the DSS core plugins.

      To turn off all core plugins, the property disabled-core-plugins in the DSS should be set to a comma-separated list of the technologies that are enabled in the AS, each followed by a colon (':'). This ensures that the installer does not remove the enabled technologies from this list. Example for an openBIS Screening instance:
      disabled-core-plugins = screening:

Version S132

  • If upgrading an existing openBIS instance with the installer for standard technologies, all unwanted technologies should be switched off. Do this before upgrading by adding the following line to the AS configuration:
    disabled-technologies = proteomics, screening, illumina-ngs

    Technologies that are enabled should be removed from this list.

Version S131

  • Data sets are now deleted asynchronously from the DSS by a maintenance task. Deleting a data set in the AS no longer removes it from the DSS instantly. Deleted data sets are put in a queue and removed by the new

    ch.systemsx.cisd.etlserver.plugins.DeleteDataSetsAlreadyDeletedInApplicationServerMaintenanceTask maintenance task. If you don't specify this task in the DSS configuration, it will be added automatically during DSS start-up with a default configuration (i.e. it will be executed every 5 minutes). If you specify the maintenance task explicitly, the default task will not be registered and the explicit configuration will be used.
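An explicit configuration could look like the sketch below (the task name data-set-clean-up and the 300-second interval are illustrative; the interval property follows the maintenance-task pattern used elsewhere on this page):

```
maintenance-plugins = ..., data-set-clean-up
data-set-clean-up.class = ch.systemsx.cisd.etlserver.plugins.DeleteDataSetsAlreadyDeletedInApplicationServerMaintenanceTask
# run every 5 minutes (interval in seconds)
data-set-clean-up.interval = 300
```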

Version S130

  • Remove the property core-plugins-folder from the configuration of AS and DSS because the core plugins folder path is now hard-coded. This means that DSS core plugins are always included unless explicitly switched off via the DSS property disabled-core-plugins (for more details see Core Plugins for Data Store Server). For existing installations it is usually enough to switch off the drop boxes of the distribution as follows:

    disabled-core-plugins = screening:drop-boxes, proteomics:drop-boxes

    Please investigate the current DSS configuration carefully to find other core plugins that have to be switched off.

  • Deletion of data sets can be disabled/restricted for chosen data set types.
    • To disable deletion:
      • Go to 'Admin->Types->Data Set Type'
      • Pick a data set type and click 'Edit'
      • Check the 'Disallow deletion' checkbox
      • Now nobody will be able to delete data sets of that type
    • To restrict deletion to a chosen user role:
      • Follow the 'To disable deletion' steps
      • Add the FORCE_DELETE_DATA_SET and FORCE_PURGE capabilities for the chosen user role (see the Capabilities documentation for more details on editing capabilities)
      • Now only users with the chosen role (or a stronger one) will be able to delete data sets of that type. Moreover, the deletion will require an additional confirmation (checking the 'Force->Force disallowed types' option in the deletion confirmation dialog)
  • Indexes have to be recreated when upgrading from an older release because:
    • Hibernate Search was updated to 3.4.2.Final
    • dates are now stored in the index in a more precise format

Version S129

Version S128

Version S127

Version S126

  • Scripts for Managed Properties evaluation should be aware that the method updateFromBatchInput(bindings) may now be called with no value at its input; scripts should check whether bindings.getValue('') returns a non-null value.
  • Manual installation of the DSS plugin distribution: The internal file structure of datastore_server_plugin*.zip has changed in such a way that it has to be unzipped at the level of datastore_server, not datastore_server/lib.
  • Proteomics: The DSS becomes the master of the proteomics database. This leads to the following changes in AS and DSS:
    • AS: remove the proteomics section and replace it by
       = DSS1
      dss-based-data-source-provider.DSS1.database-driver = org.postgresql.Driver
      dss-based-data-source-provider.DSS1.database-url = jdbc:postgresql://localhost/phosphonetx_productive

      where DSS1 is the DSS code (property data-store-server-code of the DSS configuration) and phosphonetx_productive is the complete name of the database.

    • DSS: The following lines have to be added to the data source definition

      data-source.version-holder-class = ch.systemsx.cisd.openbis.etlserver.proteomics.DatabaseVersionHolder
      data-source.scriptFolder = sql/proteomics

      assuming that data-source is the name of the data source definition.
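The null check for updateFromBatchInput described under the Managed Properties item above can be sketched as follows (plain Python with a dict standing in for the real bindings object; the processing step is purely illustrative):

```python
def update_from_batch_input(bindings):
    # The batch input may carry no value for this property, so guard first.
    value = bindings.get('')  # real scripts call bindings.getValue('')
    if value is None:
        return None           # nothing to update
    return value.strip().upper()  # illustrative processing only
```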


Version S125

  • DSS properties
    • The property dataset-registration-prestaging-behavior now has only two values:
      • use_original - do not use the prestaging directory; use the original input file.
      • use_prestaging (default) - use the prestaging file during the registration process and delete the original input file on successful registration.
  • Linking to the sample registration page from HTML pages is now possible. Examples of such links are presented below:
// link to sample registration page without filtering of available sample types
<a href="" onclick="window.location.hash='#action=SAMPLE_REGISTRATION&timestamp=' + (new Date().getTime()); return false;">Sample registration WITHOUT FILTERING</a>  
// link to sample registration page with filtering of available sample types to ones that match BIO.* regexp
<a href="" onclick="window.location.hash='#action=SAMPLE_REGISTRATION&sampleTypePattern=BIO.%2A&timestamp=' + (new Date().getTime()); return false;">Sample registration WITH FILTERING</a>

Please note that, apart from simple links that lead to the registration page, it is also possible to create more complex links that define which sample types are available for choice on the registration page (in the "Sample Type" field). To create such a link, add a "sampleTypePattern" parameter whose value is a Java regular expression. The expression defines which sample type codes are accepted and displayed. Note that most of the special characters used in Java regular expressions are also reserved characters in URLs and have to be encoded. For example, the regular expression BIO.* becomes BIO.%2A after encoding (for more information about this encoding see:
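The encoding can be reproduced with any standard URL-encoding routine, e.g. in Python:

```python
from urllib.parse import quote

# '*' is a reserved character in URLs, so it must be percent-encoded;
# '.' is unreserved and stays as it is.
encoded = quote('BIO.*')
print(encoded)  # BIO.%2A
```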

Version S124

  • New DSS property
    • The property dataset-registration-prestaging-behavior can be used to define whether the registration process should make use of the prestaging directory. It can be set to the following values:
      • default or use_original - do not use the prestaging directory; use the original input file.
      • delete - use the prestaging file during the registration process and delete the original input file on successful registration.
      • leave-untouched - use the prestaging file during the registration process but leave the original input file untouched on successful registration.

Version S123

  • A new DSS property has been introduced
    • The property dss-registration-log-dir can be used to override the location of the DSS registration log files. By default these end up in a folder called log-registrations at the top level of the DSS.
  • By default ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.RsyncArchiver only marks data sets in the archive as deleted. To get back the previous behavior (physical deletion), the property archiver.only-mark-as-deleted has to be set to false.
  • IScreeningApiServer and IDssServiceRpcScreening are now available via the JSON-RPC protocol (see the "openBIS JSON API" chapter in the openBIS documentation for more information on that feature). While IScreeningApiServer can be accessed without any additional configuration, the IDssServiceRpcScreening service has to be explicitly enabled in the DSS configuration by defining an appropriate plugin servlet that exports the service. To do this, make the following changes in the DSS configuration:
plugin-services = screening-dss-api-json-exporter-servlet
screening-dss-api-json-exporter-servlet.class = ch.systemsx.cisd.openbis.dss.generic.server.DssScreeningApiJsonServlet
screening-dss-api-json-exporter-servlet.path = /rmi-datastore-server-screening-api-v1.json/*
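Once the servlet is enabled, the service is addressed with ordinary JSON-RPC 2.0 requests. A sketch of building such a request body (the method name and parameters are hypothetical; see the "openBIS JSON API" chapter for the real signatures):

```python
import json

def jsonrpc_request(method, params, request_id=1):
    """Build a JSON-RPC 2.0 request body as a string."""
    return json.dumps({
        'jsonrpc': '2.0',
        'method': method,
        'params': params,
        'id': request_id,
    })

# Hypothetical method and session token, for illustration only.
body = jsonrpc_request('listPlates', ['my-session-token'])
```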

Version S122

Version S121

  • If you have a dropbox which imports TIFF images using the BioFormats library and each TIFF file contains only one image, you can considerably improve the performance of data set registration by using the 'TiffDelegateReader##SINGLE_IMAGE' reader.
    Just remove the old calls to the setImageLibrary(...) method in the dropbox script and set the right reader:

    imageDataset.setImageLibrary('BioFormats', 'TiffDelegateReader##SINGLE_IMAGE')

Version S120 (22 November 2011)

  • An alpha version of the Sample Explorer has been developed. It allows quickly switching the view between experiments and samples. To access the explorer, add a link to your welcome page:

welcome page file:


code to add:

<a href="resources/applications/sampleExplorer/sampleExplorer.html">Browse samples</a>
  • An alpha version of the Plate Explorer has been developed. It is the Sample Explorer in a screening context and allows quickly switching the view between plates and assays. See the "Generic->SampleExplorer" section for more information on how to enable this feature.
DSU, BaSynthec
  • After the upgrade, check whether the DSU Downloader/BaSynthec Browser webapps are working. If they are not hosted on the same domain as openBIS, the property trusted-cross-origin-domains in openBIS-server/jetty/etc/ needs to be updated.

In order to enable HDF5 compression as a post-registration task, make the following changes in datastore_server/etc/:

  • Change the name of the root directory for image data sets from "original" to "original.h5". This will allow image URLs to remain unaffected by the HDF5 compression.

     = ch.systemsx.cisd.openbis.dss.etl.jython.JythonPlateDataSetHandler
    # 'dropbox-all-in-one-with-library' is the name of the configured Jython dropbox
    dropbox-all-in-one-with-library.image-datasets-original-dir-name = original.h5
  • Set up a post-registration maintenance task which executes the HDF5 compression:

    maintenance-plugins = ..., post-registration
    # General post registration maintenance task
    post-registration.class = ch.systemsx.cisd.etlserver.postregistration.PostRegistrationMaintenanceTask
    # start once a day at 23:00
    post-registration.start = 23:00
    post-registration.interval = 86400
    post-registration.cleanup-tasks-folder = cleanup-tasks
    post-registration.ignore-data-sets-before-date = 2011-11-22
    post-registration.last-seen-data-set-file = ../../last-seen-data-set
    # add compression to HDF5
     = hdf5-compression
    post-registration.hdf5-compression.class = ch.systemsx.cisd.openbis.dss.etl.postregistration.ScreeningHdf5PostRegistrationTask
     = HCS_IMAGE_RAW, HCS_IMAGE_SEGMENTATION

Version S119 (10 November 2011)

  • In the AS and DSS configuration rename 'phosphonetx.' -> 'proteomics.'. Don't forget the corresponding files in the folder config/. Important: Don't change the database name!
  • Rename 'phosphonetx' -> 'proteomics' there as well.
  • Important: To ensure that deleted data sets are removed from your database, edit the DSS configuration and change this line

    data-set-clean-up.class = ch.systemsx.cisd.etlserver.plugins.DeleteFromExternalDBMaintenanceTask

    to

    data-set-clean-up.class = ch.systemsx.cisd.openbis.dss.etl.DeleteFromImagingDBMaintenanceTask

Version S118 (26 October 2011)

  • The BioFormats library became the default for reading TIFF images because the standard Java library JAI proved to be unstable.
    To avoid polluting your DSS logs with BioFormats messages for each read image, we recommend adding a new 'logger' tag in the servers/datastore_server/etc/log.xml file:

    <log4j:configuration ...>
      <logger name="">
        <!-- Print only messages of level warn or above from the BioFormats library -->
        <level value="warn"/>
      </logger>
    </log4j:configuration>
  • To make the ExperimentBasedArchivingTask work with the new PostgresPlusFileSystemFreeSpaceProvider one must install the 'pgstattuple' extension in the YeastX database.

Version S117 (13 October 2011)

  • Important: the old method of upgrading the server with two zip files is strongly discouraged; please use the new procedure.
  • The locations of the following files have changed:
    • jetty/webapps/openbis/WEB-INF/classes/ moved to jetty/etc/
    • jetty/webapps/openbis/WEB-INF/classes/etc/log.xml moved to jetty/etc/log.xml
    • jetty/bin/ moved to jetty/etc/
    • jetty/bin/openbis.conf moved to jetty/etc/openbis.conf
      Some of these files (among them log.xml) could already be accessed in jetty/etc through symbolic links. After the upgrade the only symbolic link will be in jetty/webapps/openbis/WEB-INF/classes/, pointing to jetty/etc/.
  • Important: From now on "Installation and Upgrade Wizard" should be used for upgrade. The upgrade procedure has been simplified and unified. For more details see the updated openBIS setup guide.
  • Add to the DSS configuration

    protein-data-set-parent-linking.class = ch.systemsx.cisd.openbis.etlserver.phosphonetx.ProteinResultDataSetParentLinkingTask
    protein-data-set-parent-linking.execute-only-once = true

    and add protein-data-set-parent-linking to the list of maintenance tasks (property maintenance-plugins).

Version S116 (29 September 2011)

  • Dropbox API behaviour change for image transformations: from this version on, using the 'smart image intensity transformation' in your dropbox means that this transformation becomes the default one and the 'Optimal (image)' transformation will not be available to users. One can change that either by making the 'smart image intensity transformation' non-default or by adding the 'Optimal (image)' transformation explicitly (with a chosen threshold). Consult the documentation for details.

Version S113 (18 August 2011)

Application Server
  • Important: Add the following lines to etc/jetty.xml before doing the update:

        <Call name="addLifeCycleListener">
          <Arg><New class="ch.systemsx.cisd.openbis.generic.server.util.LifeCycleListener"/></Arg>
        </Call>

This adds a 'STARTING SERVER' and a 'SERVER STARTED' line to jetty.log, which can be used for monitoring the startup procedure:

STARTING SERVER: org.eclipse.jetty.server.Server@58f39b3a
2011-10-20 11:33:06,674 INFO  [main] STATUS.CISDContextLoaderListener - Application: openbis
2011-10-20 11:33:06,676 INFO  [main] STATUS.CISDContextLoaderListener - Version: S117.2 (r23329)
2011-10-20 11:33:06,676 INFO  [main] STATUS.CISDContextLoaderListener - Java VM: Java HotSpot(TM) 64-Bit Server VM (v19.1-b02)
2011-10-20 11:33:06,676 INFO  [main] STATUS.CISDContextLoaderListener - CPU Architecture: amd64
2011-10-20 11:33:06,677 INFO  [main] STATUS.CISDContextLoaderListener - OS: Linux (v2.6.32-131.6.1.el6.x86_64)
SERVER STARTED: org.eclipse.jetty.server.Server@58f39b3a
Data Store Server
  • If you are using the self-signed SSL certificate provided by default in the openBIS installation package, then you have to modify the datastore_server/etc/datastore_server.conf file and add the required options to the JAVA_OPTS variable. It should look e.g. like this:

    JAVA_OPTS=${JAVA_OPTS:=-server -d64}

    If you have installed openBIS and do not know about SSL, then you most probably do have to make these changes.

  • In your Python scripts (e.g. for dropboxes or data validation) it is no longer possible to import classes using the '*' pattern. All used classes have to be imported explicitly.
  • Your Python dropboxes should mention all the imported API classes explicitly (see the remark for the Data Store Server).
    From the beginning of each dropbox for importing images, remove this line:

    import ch.systemsx.cisd.openbis.dss.etl.dto.api.v1.*

    and add this block instead:

    from ch.systemsx.cisd.openbis.dss.etl.dto.api.v1 import SimpleImageDataConfig
    from ch.systemsx.cisd.openbis.dss.etl.dto.api.v1 import ImageMetadata
    from ch.systemsx.cisd.openbis.dss.etl.dto.api.v1 import Location
    from ch.systemsx.cisd.openbis.dss.etl.dto.api.v1 import ChannelColor
    from ch.systemsx.cisd.openbis.dss.etl.dto.api.v1 import Channel

    Usually not all of the imports are necessary, but this block should solve the problem in most cases.

Version S110 (7 July 2011)

  • Important: The upgrade will fail if the imaging-db.scriptFolder variable in the DSS configuration is not changed from:

    imaging-db.scriptFolder = sql

    to:

    imaging-db.scriptFolder = sql/imaging

    Usually you can find this file in servers/datastore_server/etc/.

Version S109 (23 June 2011)

  • For the HCSPub installation of screening, add the following to the configuration:

    # Material properties of the configured type will be rendered as links
    # to the material detail view. Replaces the formerly existing "Show/Show Details" links
    screening.material-details-property-type = GENE_SYMBOLS

Version S107 (24 May 2011)

Application Server
  • Activate adding of 'Ad Hoc Vocabulary Terms' by power users in openBIS-server/jetty/etc/:

    allow-adding-unofficial-terms = true
Data Store Server
  • Added a new way of configuring the response to errors that occur during the evaluation of a Jython dropbox. This is documented thoroughly on the JythonTopLevelDataSetHandler page.

Version S106 (11 May 2011)

Application Server
  • IMPORTANT! The search index needs to be rebuilt from scratch after the upgrade; otherwise search will stop working! If the search index is not removed during the server upgrade process, it needs to be removed manually. See the relevant property in openBIS-server/jetty/etc/ to figure out which directory should be removed.
  • The possibility to add so-called unofficial vocabulary terms has been added. They can be added when a user is filling in a form and has specified a non-existing code. To enable this feature, the property allow-adding-unofficial-terms should be set to true.

Version S105 (27 April 2011)

Data Store Server

Version S104 (13 April 2011)

Data Store Server
  • Added an optional property, "validation-script-path", to data set handler thread configurations. This refers to the path of a script that is invoked to validate the data set before processing. See the documentation for more information about these scripts.
  • ch.systemsx.cisd.etlserver.plugins.SegmentedStoreShufflingTask needs a Share Finder. Example:

    store-shuffler.class = ch.systemsx.cisd.etlserver.plugins.SegmentedStoreShufflingTask
    store-shuffler.shuffling.class = ch.systemsx.cisd.etlserver.plugins.SimpleShuffling
    store-shuffler.shuffling.minimum-free-space-in-MB = 1024
    store-shuffler.shuffling.share-finder.class = ch.systemsx.cisd.etlserver.plugins.SimpleShufflingShareFinder
    store-shuffler.shuffling.share-finder.minimum-free-space-in-MB = 30

Version S103 (30 March 2011)

Data StoreServer
  • A new optional property has been added: webstart-jar-path. In production systems it should not be necessary to change this, but it can be helpful in development, where the jars for WebStart clients are not available in a standard location. In these cases the jars for the WebStart client can be put into a folder, and this folder can be specified as the webstart-jar-path.
  • The default value for the property older-than in the configuration of AutoArchiverTask has been changed to 30 days (used to be 0). Systems that used AutoArchiverTask to do eager archiving should switch to post-registration maintenance tasks.

Version S102 (16 March 2011)

Application Server
  • The Lucene index should be rebuilt. If the index is not removed automatically in the installation process, remove the marker file .MARKER_full_index from the Lucene index directory before restarting the server. The index directory is specified by a property in the configuration.
Data StoreServer
  • ch.systemsx.cisd.etlserver.plugins.DataSetDeletionMaintenanceTask has been renamed to ch.systemsx.cisd.etlserver.plugins.DeleteFromExternalDBMaintenanceTask. The configuration must be changed accordingly for projects using the task.

Version S101 (2. March 2011)

Application Server
  • Usability:
    • Replaced 'Perform' menu in Data Set tables with its Archiving submenu. Moved reporting services to a combo box in section's header (switching between them replaces the table view). Moved processing services to separate menu in section's header.
    • Display the last active tab when the user logs in (only in 'Normal View Mode' when the user entered openBIS without a history token in the URL)
  • Force web browsers to use Standards Mode when displaying openBIS pages (fixes simple view mode history in IE).
  • Migration to GXT 2.2.1
Data StoreServer
  • The symbolic link names created by HierarchicalStorageUpdater are now configurable by a template. The link folder structure was simplified to contain only 4 nested levels (previously there were 10).
  • Jython dropbox improvements
    • added support for setting/getting data set properties
    • registration of multiple data sets by storage processors working with external databases is now possible
  • well properties are displayed in the Well Search table
  • a dropbox to import image analysis results can be configured in Jython in a very flexible way
  • a dropbox to import image data can be configured to use the ImageMagick 'convert' tool to generate thumbnails. The tool has to be installed separately, but it generates thumbnails much faster than the default one.
  • improved speed of loading data for showing the plate layout (all features of a well are loaded when the well's tooltip is shown)



  • New standardized data set types have been introduced, replacing the old types:

    dataset type code          | description
    HCS_IMAGE_RAW              | Raw High Content Screening Images
    HCS_IMAGE_OVERVIEW         | Overview High Content Screening Images. Generated from raw images.
    HCS_ANALYSIS_WELL_FEATURES | HCS image analysis well feature vectors

    This means that the current 'HCS_IMAGE' and 'HCS_IMAGE_ANALYSIS_DATA' data set types are deprecated (although they still work).
    We suggest changing their codes manually in the database. 'HCS_IMAGE' should be changed to 'HCS_IMAGE_RAW' or 'HCS_IMAGE_OVERVIEW', depending on whether your data sets store raw data or image overview data (e.g. in JPEG format).
    One can use e.g. the following SQL commands:

    psql -U postgres -d openbis_screening -c "update data_set_types set code = 'HCS_ANALYSIS_WELL_FEATURES' where code = 'HCS_IMAGE_ANALYSIS_DATA'"
    psql -U postgres -d openbis_screening -c "update data_set_types set code = 'HCS_IMAGE_RAW' where code = 'HCS_IMAGE'"

    After the change in the database, the type codes have to be modified accordingly in the following places:

    • DSS dropboxes (servers/datastore_server/etc/)
    • AS web client configuration (servers/openBIS/jetty/etc/)

Data Store Server



  • Change the Online documentation link

    onlinehelp.generic.root-url =



  • PhosphonetX: Processing/reporting plugin: ch.systemsx.cisd.openbis.dss.phosphonetx.server.plugins.APMSReport


AS (optional)

  • To change the limit on the number of columns visible in tables, add a line to the configuration (the default is 50):

    max-visible-columns = 20


  • The data set processing plugins ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.DataSetCopier, ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.DataSetCopierForUsers, and ch.systemsx.cisd.openbis.dss.phosphonetx.server.plugins.DataSetCopier have the optional property send-detailed-email. If it is true, a detailed e-mail will be sent for each data set. It contains the associated experiment and sample as well as time stamps of when processing started and finished.


DSS - Deep Sequencing

Add this to the DSS configuration to be able to use the SOFT Export function:

processing-plugins = to-SOFT
to-SOFT.label = Flow Lane to SOFT Exporter
to-SOFT.dataset-types = SRF_PER_LANE
to-SOFT.class = ch.ethz.bsse.cisd.dsu.dss.plugins.DataSetToSOFT =



To activate browsing of microscopy data, you have to:

  • AS
    • add a line in

      data-set-types-with-image-overview = MICROSCOPY_IMAGE
    • create MICROSCOPY_IMAGE data set type using the Web Browser
  • DSS
    • add following lines to

      overview-plugins = microscopy-image-overview
      microscopy-image-overview.class = ch.systemsx.cisd.openbis.dss.generic.server.MergingImagesDownloadServlet
      microscopy-image-overview.dataset-types = MICROSCOPY_IMAGE, .*IMG.*
  • Configuring dropboxes
    • To import images with tiles/timepoints/depth scans/channels, one has to configure a new dropbox using the new MicroscopyStorageProcessor storage processor, e.g.:

       = ch.systemsx.cisd.openbis.dss.etl.MicroscopyStorageProcessor
       = ch.systemsx.cisd.openbis.dss.etl.MicroscopyImageFileExtractor
       = imaging-db
       = RED, GREEN, BLUE
       = RED, GREEN, BLUE
       = 2x3
       = 1,2,3;4,5,6

      Note that the type-extractor and data-set-info-extractor configuration is the same as for any other dropbox.
      The default MicroscopyImageFileExtractor supports the following image naming convention:


      So valid file names are e.g.: s1_z3_t5_cDAPI.png, s4_cGFP.gif, s3_z19.23_cGFP.gif
      To support different naming conventions a new plugin has to be written.

    • to import images with any naming convention use the following dropbox configuration:

      microscopy-series-dropbox.incoming-dir = <PATH-TO-THE-INCOMING-DIRECTORY>
      microscopy-series-dropbox.incoming-data-completeness-condition = auto-detection
      # The extractor class to use for code extraction
      = ch.systemsx.cisd.etlserver.DefaultDataSetInfoExtractor
      = .
      = 0
      = <SPACE-CODE, e.g. DEMO>
      microscopy-series-dropbox.type-extractor = ch.systemsx.cisd.etlserver.SimpleTypeExtractor
      microscopy-series-dropbox.type-extractor.file-format-type = <FILE-FORMAT-TYPE-CODE, e.g. TIFF>
      microscopy-series-dropbox.type-extractor.locator-type = RELATIVE_LOCATION
      = MICROSCOPY_IMAGE
      = true
      = ch.systemsx.cisd.openbis.dss.etl.MicroscopyBlackboxSeriesStorageProcessor
      = imaging-db
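From the example file names given above (s1_z3_t5_cDAPI.png, s4_cGFP.gif, s3_z19.23_cGFP.gif), the convention appears to be s<tile>[_z<depth>][_t<time>][_c<channel>].<extension>. The following Python sketch parses names under that assumed convention; it is an illustration only, not the actual MicroscopyImageFileExtractor logic:

```python
import re

# Assumed convention, inferred from the examples in the text:
#   s<tile>[_z<depth>][_t<time>][_c<channel>].<ext>
NAME_PATTERN = re.compile(
    r"s(?P<tile>\d+)"
    r"(?:_z(?P<z>\d+(?:\.\d+)?))?"
    r"(?:_t(?P<t>\d+(?:\.\d+)?))?"
    r"(?:_c(?P<channel>[A-Za-z0-9]+))?"
    r"\.(?:png|gif|tiff?|jpe?g)$"
)

def parse_image_name(name):
    """Return tile/z/t/channel parsed from a file name,
    or None if it does not follow the assumed convention."""
    m = NAME_PATTERN.match(name)
    if m is None:
        return None
    return {
        "tile": int(m.group("tile")),
        "z": float(m.group("z")) if m.group("z") else None,
        "t": float(m.group("t")) if m.group("t") else None,
        "channel": m.group("channel"),
    }
```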


DSS - PhosphonetX

change the lines

inputs = raw-data, prot-ident, prot_quant
... = ch.systemsx.cisd.openbis.etlserver.phosphonetx.DataSetInfoExtractorForSearchExperiment

to

inputs = raw-data, prot-ident, prot-ident-quantification, prot_quant
... = ch.systemsx.cisd.openbis.etlserver.phosphonetx.DataSetInfoExtractorForProteinResults

and add

prot-ident-quantification. = prot-ident.
prot-ident-quantification.incoming-dir = /dssfs/openbis/drop-box_prot_ident-quantification
= MS_QUANTIFICATION

Note that the folder /dssfs/openbis/drop-box_prot_ident-quantification has to be created before DSS is started.

DSS - Screening

FeatureVectorStorageProcessor has a new optional property: columns-to-be-ignored. It is a comma-separated list of columns in the CSV file that are ignored and not handled as features. The default value is barcode. Columns not listed in this property that do not specify the well are handled as (non-numerical) features.
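As a sketch of the described behavior (the well column name WellName is a hypothetical example; the actual well-identifying columns are determined by openBIS):

```python
def feature_columns(header, columns_to_be_ignored=("barcode",),
                    well_columns=("WellName",)):
    """Sketch: decide which CSV columns become features.

    Columns listed in columns-to-be-ignored are dropped, well-identifying
    columns are dropped, everything else is treated as a feature.
    'WellName' is a hypothetical well column name, not an openBIS constant.
    """
    ignored = {c.lower() for c in columns_to_be_ignored}
    wells = {c.lower() for c in well_columns}
    return [c for c in header
            if c.lower() not in ignored and c.lower() not in wells]
```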


DSS - Screening (optional)

To store the images of a whole plate compressed in one file, update your image dropbox configuration:

# How should the original data be stored? Possible values:
#   unchanged       - nothing is changed, the default
#   hdf5            - all the data will be packaged into one hdf5 file
#   hdf5_compressed - like hdf5, but each file is stored in a compressed form
<your-images-dropbox-name>.storage-processor.original-data-storage-format = hdf5_compressed

This considerably reduces both the number of files stored in the file system and the needed storage space. Files are stored in HDF5 format.


AS - all instances

Add the following to the

# Maximum number of search results
= 100000

DSS - Screening (optional)

The ch.systemsx.cisd.openbis.dss.etl.PlateStorageProcessor can compress thumbnails. To turn this on:

compress-thumbnails = true


AS - all instances

  • Add

    # Set to true to also query for email aliases
    ldap.queryEmailForAliases = true


AS - PhosphoNetX

  • Add

        <bean id="stacked-authentication-service" class = "ch.systemsx.cisd.authentication.stacked.StackedAuthenticationService">
            <constructor-arg>
                <list>
                    <ref bean="ldap-authentication-service" />
                    <ref bean="crowd-authentication-service" />
                </list>
            </constructor-arg>
        </bean>

    to openBIS-server/jetty/webapps/openbis/WEB-INF/classes/genericCommonContext.xml and change the authentication-service line to

    authentication-service = stacked-authentication-service


DSS - all instances (optional)

The new reporting plugin that mimics the behavior of the smart view is ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.GenericDssLinkReportingPlugin. It takes one required parameter, download-url, and two optional parameters, data-set-regex and data-set-path.

Here is an example configuration:

hcs-viewer.label = HCS View
hcs-viewer.dataset-types = HCS_IMAGE
hcs-viewer.class = ch.systemsx.cisd.openbis.dss.generic.server.plugins.standard.GenericDssLinkReportingPlugin
hcs-viewer.download-url = ${download-url}
hcs-viewer.data-set-regex = .*/PNG/.*\.jpg
hcs-viewer.data-set-path = original
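To see which files a data-set-regex like the one in this example selects, here is a small Python sketch; whether openBIS applies the pattern to the full relative path, and with full or partial matching, is an assumption made for illustration:

```python
import re

# The data-set-regex value from the example configuration above.
DATA_SET_REGEX = re.compile(r".*/PNG/.*\.jpg")

def matching_paths(paths):
    """Return the paths the regex selects (assumes full-path matching)."""
    return [p for p in paths if DATA_SET_REGEX.fullmatch(p)]
```

For example, "plate1/PNG/well_A1.jpg" matches, while "plate1/TIFF/well_A1.tiff" does not, and "PNG/summary.jpg" does not because nothing precedes "/PNG/".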


AS - all instances (optional)

Customize the text that is shown in the initial tab/page in both simple and application view mode by overriding welcomePage.html and welcomePageSimple.html (just like we do with loginHeader.html).

AS - all instances with query databases

Check if the source definitions of the query databases have <database>.creator-minimal-role set. If not, the upgrade will use the default, which changes with S90 from SPACE_POWER_USER to INSTANCE_OBSERVER. Check with the owner of the instance whether this is intended. To keep the old behavior, you have to add:

<database>.creator-minimal-role = SPACE_POWER_USER

Screening AS

To hide unnecessary sections and make the UI simpler add a line:

web-client-configuration-file = etc/


AS - all instances

  • remove data-store-server-base-url property completely (together with its comment) from the file


After installation and startup of the new openbis server:

  • check in the openBIS log whether migration 055-056 was successful
  • request creation of a new search index, otherwise searching for materials (e.g. genes) will not work: in openbis set index-mode = INDEX_FROM_SCRATCH
  • restart the openbis server
  • set the index-mode in openbis back to the previous value


For updating do the following steps:

  1. Before the servers are updated the following deletion has to be done in openBIS GUI:
    • All data sets of type LCA_MIC_TIME_SERIES and TEST_DATA_TYPE.
    • Data set types LCA_MIC_TIME_SERIES and TEST_DATA_TYPE.
  2. DSS: Replace the following lines

    maintenance-plugins = data-set-clean-up
    post-registration-upload.interval = 1440000

    with

    maintenance-plugins = data-set-clean-up, post-registration-upload
    post-registration-upload.execute-only-once = true
  3. DSS datastore_server.conf: Increase the maximum Java heap size to 1 GB.
  4. Update DSS and AS as normal but don't start them!
  5. Drop the database basysbio_productive.
  6. Start AS and DSS.
  7. Watch the DSS log for the upload of existing data sets into the new basysbio_productive.


Screening DSS

If your server runs version S83 or older, you have to migrate to version S87 before you upgrade to S88, because BDS image dataset migration is no longer supported from version S88 on.


internal CIFEX:

On all servers that run CIFEX internally, the JETTY_STOP_PORT of CIFEX has to be different from the one used by the jetty running openbis. Otherwise, when the shutdown script is run during installation, it will also shut down CIFEX (which we usually don't want).

Change the port in ~openbis/cifex/jetty/bin/






Plasmids DSS:


Change

# The extractor class to use for type extraction
main-thread.type-extractor = ch.ethz.bsse.cisd.plasmid.dss.PlasmidTypeExtractor

to

# The extractor class to use for type extraction
main-thread.type-extractor = ch.ethz.bsse.cisd.plasmid.dss.PlasmidTypeExtractor
main-thread.type-extractor.dataset-types = SEQ_FILE: gb fasta xdna, RAW_DATA: ab1
main-thread.type-extractor.file-types = GB: gb, FASTA: fasta, XDNA: xdna, AB1: ab1
main-thread.type-extractor.default-file-type = PROPRIETARY
= DIRECTORY
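The TYPE: extension1 extension2 mapping syntax used by dataset-types and file-types above can be parsed as in the following sketch; this illustrates the format only and is not the actual openBIS parser:

```python
def parse_type_mapping(spec):
    """Parse a 'TYPE: ext1 ext2, TYPE2: ext3' mapping string
    (as used by dataset-types and file-types) into {extension: type}.
    Illustrative sketch, not the actual openBIS parser."""
    mapping = {}
    for entry in spec.split(","):
        type_code, _, extensions = entry.partition(":")
        for ext in extensions.split():
            mapping[ext.lower()] = type_code.strip()
    return mapping
```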

YeastX DSS:


Change

# comma separated list of pairs: file-extension file-type
# It is assumed that for each file extension a dataset type with the same name is defined in openBIS.
# The corresponding file types have to be defined in openBIS as well.
# Files with unspecified extensions will have the file type and dataset type UNKNOWN in openBIS.
main-thread.type-extractor.file-types = pdf pdf, mat matlab, zip archive, eicml xml, fiaml xml, mzxml xml

to

# comma separated list of mappings from type to extensions, e.g.:
# file-type1: file-extension1 file-extension2, file-type2: file-extension3
# It is assumed that for each file extension a dataset type with the same name is defined in openBIS.
# The corresponding file types have to be defined in openBIS as well.
# Files with unspecified extensions will have the file type and dataset type UNKNOWN in openBIS.
main-thread.type-extractor.file-types = PDF: pdf, MATLAB: mat, ARCHIVE: zip, XML: eicml fiaml mzxml
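The change from the old pair format to the new mapping format can be mechanized; the following sketch converts an old-style value (extension type pairs) to the new style (TYPE: extensions), assuming type codes are simply upper-cased as in the example:

```python
def convert_file_types(old_spec):
    """Convert the old 'extension type' pair format
    (e.g. 'pdf pdf, mat matlab') into the new 'TYPE: ext1 ext2' format.
    Assumption: type codes are upper-cased, as in the example above."""
    by_type = {}
    for pair in old_spec.split(","):
        extension, file_type = pair.split()
        by_type.setdefault(file_type.upper(), []).append(extension)
    return ", ".join(f"{t}: {' '.join(exts)}" for t, exts in by_type.items())
```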



  • in config/ for both servers updated (FJE).

Screening AS:

  • Adding:

    data-source-provider = dss-based-data-source-provider
    = <code of DSS>
    dss-based-data-source-provider.<name of DSS>.database-driver = org.postgresql.Driver
    dss-based-data-source-provider.<name of DSS>.database-url = jdbc:postgresql://localhost/imaging_<database kind>

    where <code of DSS> is the code of the DSS as specified by the property data-store-server-code of the DSS and <database kind> is the part of the name of the imaging database as specified by the property imaging-db.databaseKind of the DSS.
    In this case the 'imaging' properties will be ignored.

Screening DSS:

  • remove 'plate-image-params-reporter' plugin
  • change imaging db configuration by moving these lines from openBIS config to DSS config:

    imaging-db.version-holder-class = ch.systemsx.cisd.openbis.dss.etl.ImagingDatabaseVersionHolder
    imaging-db.scriptFolder = sql



  • in config/ for both servers updated (FJE).


  • of Data Store: configure additional maintenance plugin data-set-clean-up
maintenance-plugins = hierarchical-storage-updater, data-set-clean-up, migrator

# Removes data sets deleted from openBIS also from imaging database
data-set-clean-up.class = ch.systemsx.cisd.etlserver.plugins.DataSetDeletionMaintenanceTask
data-set-clean-up.interval = 300
= imaging-db

migrator.class = ch.systemsx.cisd.etlserver.plugins.ChainedDataSetMigrationTask
migrator.execute-only-once = true
migrator.storeRoot = ${storeroot-dir}
migrator.migrators = bds-image-db, bds-original-relocator, bds-remover
migrator.bds-image-db.class = ch.systemsx.cisd.openbis.dss.etl.bdsmigration.BDSImagingDatabaseMigrator
= imaging-db
= dapi, gfp
#migrator.bds-image-db.extract-single-image-channels = BLUE, GREEN
migrator.bds-original-relocator.class = ch.systemsx.cisd.openbis.dss.etl.bdsmigration.BDSOriginalDataRelocatorMigrator
migrator.bds-remover.class = ch.systemsx.cisd.openbis.dss.etl.bdsmigration.BDSDataRemoverMigrator


  • needs to be updated. The plugin now uses different handlers and extractors.



  • config/ on both servers have been updated. (FJE)

DSS YeastX

Service properties in the config directory of the yeastx server have already been updated.
The high water mark value for the archiver has been set, and a commented configuration for the automatic archiver has been added.
The maintenance plugins section has been moved to the end of the file.

# ---------------------------------------------------------------------------
# maintenance plugins configuration
# ---------------------------------------------------------------------------

# size of the disc free space in KB which must be available to unarchive one dataset
dataset-unarchiving-highwater-mark = 2000

# Comma separated names of maintenance plugins.
# Each plugin should have configuration properties prefixed with its name.
# Mandatory properties for each <plugin> include:
#   <plugin>.class - Fully qualified plugin class name
#   <plugin>.interval - The time between plugin executions (in seconds)
# Optional properties for each <plugin> include:
#   <plugin>.start - Time of the first execution (HH:mm)
maintenance-plugins=dataset-deletion-synchronizer, auto-archiver

# Maintenance task deleting from metabol database data sets which have been deleted from openbis
dataset-deletion-synchronizer.class = ch.systemsx.cisd.yeastx.etl.MetabolDatabaseUpdater
# how often the synchronization should happen in seconds: every day
dataset-deletion-synchronizer.interval = 86400
dataset-deletion-synchronizer.start = 22:30
= metabol-db
dataset-deletion-synchronizer.scriptFolder = sql

# Performs automatic archiving of 'ACTIVE' data sets based on their properties
#auto-archiver.class = ch.systemsx.cisd.etlserver.plugins.AutoArchiverTask
# The time between subsequent archiving runs (in seconds)
#  auto-archiver.interval = 86400
# size of the disc free space in KB which must be available to unarchive one dataset
#  auto-archiver.dataset-unarchiving-highwater-mark = ${dataset-unarchiving-highwater-mark}
# Time of the first execution (HH:mm)
#  auto-archiver.start =
# following properties are optional
# only data sets of specified type will be archived
# =
# only data sets that are older than specified number of days will be archived (default = 0)
#  auto-archiver.older-than =
# fully qualified class name of a policy that additionally filters the data sets to be archived
#  auto-archiver.policy.class =

# --- ARCHIVER ------------------------------------------------------------------------

archiver.class = ch.systemsx.cisd.yeastx.etl.MLArchiverTask
archiver.unique-sample-name-property-code = ${sample-name-property-code}
archiver.unique-experiment-name-property-code = ${experiment-name-property-code}
= metabol-db
# size of the disc free space in KB which must be available to unarchive one dataset
archiver.dataset-unarchiving-highwater-mark = ${dataset-unarchiving-highwater-mark}
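The property scheme described in the comments above (a comma-separated maintenance-plugins list, with each plugin's settings prefixed by its name) can be sketched in Python; this illustrates the naming convention only, not the openBIS implementation:

```python
def plugin_configs(props):
    """Sketch of the scheme described above: maintenance-plugins lists
    plugin names; each plugin's settings are the properties prefixed
    with '<name>.'. Illustrative only, not the openBIS implementation."""
    names = [n.strip()
             for n in props.get("maintenance-plugins", "").split(",")
             if n.strip()]
    return {
        name: {key[len(name) + 1:]: value
               for key, value in props.items()
               if key.startswith(name + ".")}
        for name in names
    }
```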



Change the configuration of the Query modules in BaSysBio (both test and productive) and PhosphoNetX.

Old configuration:


query-database.label = <label>
query-database.databaseEngineCode = postgresql
query-database.basicDatabaseName = <basicDatabaseName>
query-database.databaseKind = <databaseKind>
query-database.owner = <user>
query-database.password = <password>


New configuration; choose any text as the key <db_key>, e.g. main or 1:

# Database Configurations for Query module
query-databases = <db_key>

<db_key>.label = <label>
<db_key>.database-driver = org.postgresql.Driver
<db_key>.database-url = jdbc:postgresql://localhost/<basicDatabaseName>_<databaseKind>
<db_key>.database-username = <user>
<db_key>.database-password = <password>
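A hypothetical instantiation with <db_key> = main; all values below are placeholders, not settings from a real installation:

```
# Hypothetical example; label, database name, user and password
# are placeholders.
query-databases = main

main.label = Main query database
main.database-driver = org.postgresql.Driver
main.database-url = jdbc:postgresql://localhost/query_productive
main.database-username = openbis
main.database-password = <password>
```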


The server-url for connecting to openbis no longer requires "openbis/openbis". Just specify the protocol, hostname and port; the rest will be discovered automatically.

Was:

server-url = https://localhost:8443/openbis/openbis

Should be:

server-url = https://localhost:8443



  • config/ has been updated. (FJE)


  • config/ has been updated. (FJE)



Add (after SMTP properties):

data-sources = metabol-db
metabol-db.version-holder-class = ch.systemsx.cisd.yeastx.db.MetabolDatabaseVersionHolder
metabol-db.databaseEngineCode = postgresql
metabol-db.basicDatabaseName = metabol
metabol-db.databaseKind = productive
metabol-db.readOnlyGroup = metabol_readonly
metabol-db.readWriteGroup = metabol_readwrite
metabol-db.scriptFolder = sql


Remove:

yeastx-databaseEngineCode = postgresql
yeastx-basicDatabaseName = metabol
yeastx-databaseKind = productive
yeastx-readOnlyGroup = metabol_readonly
yeastx-readWriteGroup = metabol_readwrite
yeastx-scriptFolder = sql

Change all plugins accessing metabol database (look for ${yeastx-) to use 'data source', for example:

Was:

= ${yeastx-databaseEngineCode}
= ${yeastx-basicDatabaseName}
= ${yeastx-databaseKind}

Should be:

= metabol-db

Add archiver:

archiver.class = ch.systemsx.cisd.yeastx.etl.MLArchiverTask
archiver.unique-sample-name-property-code = ${sample-name-property-code}
archiver.unique-experiment-name-property-code = ${experiment-name-property-code}
= metabol-db


BaSysBio (Productive and Test)

  • config/ has been updated. (FJE)


  • config/ and config/ have been updated. (FJE)


BaSysBio (Productive)

  • Enable Custom SQL queries (they are already configured on the test server; take the configuration from there)



  • SMTP must be configured (mail) (IA)


  • The Data Set Info Extractor of the ETL thread prot-result is simplified to the following line:

    = ch.systemsx.cisd.openbis.etlserver.phosphonetx.DataSetInfoExtractorForSearchExperiment

    See also DSS Admin Guide for PhosphoNetX.


