Usage

Major capabilities of the application are showcased and explained in this part of the guide. Please see the Getting Started section on how to install this software if you would like to try out any of the features described here.

We use the Microsoft Windows convention when describing key bindings. If you are a Mac user, please replace the Ctrl key with the Command key.

Project

A project is the main organizational and collaborative unit of this application. One project can have one or more users, and every user must belong to at least one project. Collaboration among users is determined by project membership.

By default, every user belongs to three projects: Sandbox and Templates (shared among all users), and the personal project with the user’s name. For any issue regarding projects and users, please contact the HPD Server administrator (ajelenak@hdfgroup.org). If requesting more users to join a project, provide their email addresses as recorded in the NASA Earthdata Login.

To open any of the user’s projects, use the menu Project > Open. You can open multiple projects if you want to copy and move designs across projects.

To close a project, use the menu Project > Close.

Only the HPD Server administrator can delete a project. Selecting the Project > Delete menu will only display a dialog box with the HPD Server administrator’s email address. Use it to request removal of that project.

Special Project: Templates

The purpose of this project is to house in one place designs that represent best practices or consist of comprehensive and high-quality content. All HDF Product Designer users are members of this project and have read-only access. This allows them to copy any of the project’s designs into their projects and start their work from there.

The HDF Product Designer Development Team is responsible for curating the designs in this project. Users are welcome to suggest additional designs for inclusion.

Currently available designs:

NODC_grid
Data represented or projected on a regular or irregular grid.
NODC_point
A single data point (having no implied coordinate relationship to other points).
NODC_profileIncomplete
An ordered set of data points along a vertical line at a fixed horizontal position and fixed time. Incomplete multidimensional array representation of profiles.
NODC_profileOrthogonal
An ordered set of data points along a vertical line at a fixed horizontal position and fixed time. Orthogonal multidimensional array representation of profiles.
NODC_timeSeriesIncomplete
A series of data points at the same spatial location with monotonically increasing times. Incomplete multidimensional array representation of time series.
NODC_timeSeriesOrthogonal
A series of data points at the same spatial location with monotonically increasing times. Orthogonal multidimensional array representation of time series.
NODC_timeSeriesProfileIncomVIncomT
A series of profile features at the same horizontal position with monotonically increasing times. Incomplete time and vertical coordinates.
NODC_timeSeriesProfileIncomVOrthoT
A series of profile features at the same horizontal position with monotonically increasing times. Incomplete time coordinate with orthogonal vertical coordinate.
NODC_timeSeriesProfileOrthoVIncomT
A series of profile features at the same horizontal position with monotonically increasing times. Orthogonal time coordinate with incomplete vertical coordinate.
NODC_timeSeriesProfileOrthoVOrthoT
A series of profile features at the same horizontal position with monotonically increasing times. Orthogonal time and vertical coordinates.
NODC_trajectoryIncomplete
A series of data points along a path through space with monotonically increasing times. Incomplete multidimensional array representation.
NODC_trajectoryProfileIncom
A series of profile features located at points ordered along a trajectory. Incomplete multidimensional array representation.
NODC_trajectoryProfileOrtho
A series of profile features located at points ordered along a trajectory. Orthogonal multidimensional array representation.

All the designs with the NODC prefix in their names are from the NOAA National Centers for Environmental Information (NCEI) template collection. They have been imported from the CDL examples (see NetCDF CDL) available from the NCEI site. The Incom and Ortho in the design names reflect different representations:

  • Ortho indicates a design whose HDF5 datasets contain identical coordinate values along an axis.
  • Incom indicates a design whose HDF5 datasets contain different coordinate values along an axis.
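The distinction can be sketched with plain Python lists (purely illustrative data; in a design these would be HDF5 datasets):

```python
# Orthogonal representation: every profile shares one vertical coordinate
# vector, so a single 1-D z array describes all profiles.
z_ortho = [0.0, 10.0, 20.0]
temp_ortho = [
    [20.1, 18.5, 15.2],   # profile 1, sampled at z_ortho
    [21.0, 19.2, 16.0],   # profile 2, sampled at z_ortho
]

# Incomplete representation: each profile carries its own vertical levels,
# so the z coordinate is a 2-D array with the same shape as the data.
z_incom = [
    [0.0, 10.0, 20.0],    # profile 1 levels
    [0.0,  5.0, 20.0],    # profile 2 levels differ
]
temp_incom = [
    [20.1, 18.5, 15.2],
    [21.0, 20.3, 16.0],
]

# One coordinate vector suffices only in the orthogonal case.
distinct_level_sets = {tuple(z) for z in z_incom}
print(len(distinct_level_sets))  # -> 2
```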

Design

Designs represent content stored in HDF5 files. A single project can have many designs and all users of the same project can access its designs.

A design can have multiple versions; the version that can be edited is called the current working version. The design versioning system is a simple timeline of saved versions (checkpoints). At any point while working on a design, the user can save its current state as a checkpoint. Any of these checkpoints can later be promoted to the current working version. Checkpoints must have unique labels; the current working version’s label is HEAD and cannot be changed.

Currently supported design actions:

  • Create a new design: Design > New.
  • Open an existing design: Design > Open.
  • Import design: Design > Import.
  • Find an HDF5 object by name in design(s): Design > Find.
  • Replace characters in HDF5 object names in design(s): Design > Replace.
  • Checkpoint design’s current version: Design > Save.
  • Close design in order to work on another one: Design > Close.
  • Delete a design: Design > Delete. This action cannot be undone so please be cautious.
  • Copy and paste design from one project to another: Ctrl+C and Ctrl+V.
  • Move design from one project to another: Ctrl+X and Ctrl+V or drag & drop with mouse.

Find and Replace start searching from the selected tree node. Selecting the Up option will not search parent nodes unless a search has already been done using Find and the user presses the Find Next button.

Import

Importing designs allows reuse of existing HDF5/HDF4/netCDF products as a starting point for creating new products. Currently HPD Desktop supports import from the following formats:

  • HDF5
  • HDF5/JSON
  • HDF4 XML (file content map)
  • NcML
  • NetCDF CDL
  • OPeNDAP DMR

All import formats except HDF5 are text-based, so they can be viewed or edited easily with any text editor before import.

HDF5

To import a design from an HDF5 file, please follow the steps below:

  1. Select a project from the tree view (Fig. 6).
  2. Choose Design > Import from > HDF5 from the menu (Fig. 7).
  3. A file selection dialog will appear. Select an HDF5 file and press Open (Fig. 8).
  4. HPD Desktop will start loading the design from the file. It will update the tree view dynamically during import (Fig. 9), scrolling down automatically as the tree grows. The import is complete when the scrolling stops. Inspecting the imported design is recommended at this point.
  5. If a parsing error occurs during the import, the design will be saved up to the point where the parsing error occurred. The error message(s) will be shown in a separate wxPython stderr/stdout window.

Note

HDF5 file content is converted to HDF5/JSON in memory prior to importing. Any HDF5 feature not supported by the hdf5-json tools will cause the import operation to fail.

HDF5/JSON

HDF5/JSON is a JSON representation of HDF5 file content. JSON (JavaScript Object Notation) is a lightweight, open format for data exchange, very popular in web applications. The h5tojson.py command line program can be used to generate the JSON from any HDF5 file.
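To give a feel for the format, here is a hand-written, heavily abbreviated sketch of the kind of document h5tojson.py emits. The object IDs and the exact key set shown are illustrative only; consult the hdf5-json documentation for the actual schema.

```python
import json

# Abbreviated, hand-written HDF5/JSON-style document (illustrative):
# a root group identified by an ID, carrying one string attribute.
h5json_text = """
{
  "root": "g-0001",
  "groups": {
    "g-0001": {
      "attributes": [
        {"name": "title", "type": {"class": "H5T_STRING"}, "value": "example"}
      ]
    }
  }
}
"""

doc = json.loads(h5json_text)
root = doc["groups"][doc["root"]]       # follow the root ID to its group
print(root["attributes"][0]["name"])    # -> title
```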

To illustrate this functionality we have added to our HDF-EOS THREDDS server a capability to generate HDF5/JSON for the sample NASA HDF-EOS5/HDF5 products hosted there, as shown in Fig. 3 and Fig. 4.

_images/thredds_nasa_hdf5_catalog.png

Fig. 3 THREDDS web service for generating HDF5/JSON

If you click one of the sample NASA products, you’ll get an H5JSON link under Access (Fig. 4).

_images/thredds_nasa_hdf5_h5json.png

Fig. 4 THREDDS catalog page displaying various access methods for one HDF5 file

Clicking the link will generate an HDF5/JSON file without actual data (i.e., metadata only), as shown in Fig. 5.

_images/click_h5json_link.png

Fig. 5 HDF5/JSON output from THREDDS H5JSON access

Once the HDF5/JSON representation of an existing HDF5 file, obtained using the above method, is saved to a file, it can be imported by following the steps below.

  1. Select a project node from the tree view (Fig. 6).
_images/tree_select_project.png

Fig. 6 HPD Desktop tree control view with a selected project

  2. Press Design > Import from > HDF5 JSON from the menu (Fig. 7).

_images/menu_design_import_hdf5.png

Fig. 7 HPD Desktop menu selection for importing design from various sources

  3. A file selection dialog will appear. Select an HDF5/JSON file (Fig. 8).
_images/dialog_design_import_hdf5.png

Fig. 8 HPD Desktop dialog box for opening an HDF5/JSON file

  4. Press the Open button. HPD Desktop will start loading the design from the file. It will update the tree view dynamically during import (Fig. 9), and the tree view control will scroll down automatically as the tree grows. When the tree view stops changing, the import is complete. You may want to scroll up to check the imported design.
_images/tree_import_hdf5.png

Fig. 9 Updated tree view in HPD Desktop during HDF5/JSON file import

  5. If a parsing error occurs during the import, the design will be saved up to the point where the error occurred. You can examine the error message in the wxPython stderr/stdout message window.

Note

The HDF5/JSON importer in HPD Desktop has some limitations. For example, it cannot handle references and custom datatypes. They will be ignored without any warning messages because they can make products inoperable with netCDF.

Datasets imported without layout and filter creation properties will be made chunked, with the chunk size equal to the dataset size (the entire dataset is one chunk), and will have the gzip (deflate) compression filter enabled with level 6.

HDF4 XML

Importing designs from HDF4 files, for example existing NASA HDF-EOS2/HDF4 products, is possible using the h4mapwriter tool, which generates an HDF4 file content map. HDF4 file content maps are XML documents that describe file structure, metadata, and the offsets and lengths of raw data. h4mapwriter is available from The HDF Group project website.

To demonstrate the tool’s capability we exposed it as a THREDDS server’s data access service for sample NASA HDF4 products, as shown in Fig. 10.

_images/thredds_nasa_hdf4_catalog.png

Fig. 10 THREDDS web service for generating HDF4 file content map

If you click one of the sample NASA products, you’ll get an H4MAP link under Access (Fig. 11).

_images/thredds_nasa_hdf4_h4map.png

Fig. 11 THREDDS catalog page displaying various access methods for HDF4 file

Clicking the link will generate an HDF4 map file in XML (Fig. 12).

_images/click_h4map_link.png

Fig. 12 HDF4 XML output from THREDDS H4MAP access

Once the HDF4 XML representation of an existing HDF4 file, obtained using the above method, is saved to a file, it can be imported by following the steps below.

  1. Select a project node from the tree view (Fig. 6).
  2. Press Design > Import from > HDF4 Map from the menu (Fig. 7).
  3. A file selection dialog will appear. Select an XML file (Fig. 8).
  4. Press the Open button. HPD Desktop will start loading the design from the file. It will update the tree view dynamically during import (Fig. 9), and the tree view will scroll down automatically as the tree grows. When the tree view stops changing, the import is complete. You may want to scroll up to check the imported design.
  5. If a parsing error occurs during the import, the design will be saved up to the point where the error occurred. You can examine the error message in the wxPython stderr/stdout message window.

A useful feature of the HDF4 map writer is that it can merge a set of split file attributes (due to the attribute size limitation in HDF4) into a single big string value in XML. To support such merged ODL string attributes, HPD Desktop’s HDF4 map importer can parse the entire merged ODL string and re-represent it as a hierarchy of groups and attributes. The names of these reconstructed groups start with Metadata_ in the imported design. For example, Fig. 13 illustrates that the content of coremetadata is re-represented under the Metadata_core group.
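The re-representation described above can be sketched as follows. parse_odl is a hypothetical, much-simplified parser (real ODL has quoting rules, arrays, and continuation lines that this sketch ignores); it turns GROUP/OBJECT blocks into nested dicts and key = value lines into attributes.

```python
# Simplified sketch: convert a merged ODL string into a group/attribute tree.
def parse_odl(text):
    root, stack = {}, []
    current = root
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line == "END":
            continue
        key, _, value = line.partition("=")
        key, value = key.strip(), value.strip().strip('"')
        if key in ("GROUP", "OBJECT"):
            child = {}               # open a new nested "group"
            current[value] = child
            stack.append(current)
            current = child
        elif key in ("END_GROUP", "END_OBJECT"):
            current = stack.pop()    # close it
        else:
            current[key] = value     # plain key = value becomes an attribute
    return root

odl = '''GROUP = INVENTORYMETADATA
  OBJECT = LOCALGRANULEID
    VALUE = "example.hdf"
  END_OBJECT = LOCALGRANULEID
END_GROUP = INVENTORYMETADATA
END'''

tree = parse_odl(odl)
print(tree["INVENTORYMETADATA"]["LOCALGRANULEID"]["VALUE"])  # -> example.hdf
```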

_images/tree_import_hdf4_meta.png

Fig. 13 An imported HDF-EOS2 file design with merged coremetadata

The current HDF4 map importer has several limitations. It cannot import Table (a.k.a. Vdata in HDF4) data. You will see the following message in the console window:

INFO: Skipping unsupported Table Cloud Data

If importing Vdata is important, please try the NcML method.

Please note that HDF4 allows duplicate names under the same group but HDF5 does not (Fig. 14).

_images/hdf4_can_have_groups_with_same_name.png

Fig. 14 HDFView displaying an HDF4 file that has two groups with the same name

In such a case, you’ll see an error message in the console window:

hpdws.create_group():put:409
...
{"message": "Strip: Object with this name already exists.", ...}

HPD Desktop doesn’t give any pop-up warning dialogs for the above cases because they may happen quite often. However, the HDF4 map importer will give a pop-up notification dialog when it replaces the / character with _ in group/dataset/attribute names.

NcML

NcML is an XML representation of the metadata in a netCDF file (information similar to the output of ncdump -h). The netCDF-Java library from Unidata can read most NASA HDF4 and HDF5 files. Thus toolsUI, which is based on netCDF-Java, can easily produce NcML from local HDF files. By clicking the NcML tab, an HDF file can be opened and then saved as NcML (Fig. 15).

_images/toolsUI_NcML.png

Fig. 15 ToolsUI displaying an HDF4 file content in NcML
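For reference, here is a minimal hand-written NcML document and a standard-library sketch of reading variable metadata from it. The document content is illustrative; only the NcML 2.2 namespace URI is the real one.

```python
import xml.etree.ElementTree as ET

# NcML 2.2 namespace, used by documents produced by netCDF-Java tools.
NS = {"nc": "http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2"}

# Minimal hand-written NcML: one dimension, one variable, one attribute.
ncml = """<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
  <dimension name="time" length="2"/>
  <variable name="temperature" shape="time" type="float">
    <attribute name="units" value="K"/>
  </variable>
</netcdf>"""

root = ET.fromstring(ncml)
variables = {v.get("name"): {a.get("name"): a.get("value")
                             for a in v.findall("nc:attribute", NS)}
             for v in root.findall("nc:variable", NS)}
print(variables)  # -> {'temperature': {'units': 'K'}}
```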

The THREDDS data server, which is also based on netCDF-Java, can generate NcML easily via a web browser. We provide a demo THREDDS server for many NASA products from different NASA Earth data centers (Fig. 16), from which the NcML representation can be obtained. Please note that this generic THREDDS server is different from the custom THREDDS server that provides the HDF5/JSON and HDF4 map access described in the previous sections (e.g., its port number is different).

_images/thredds_NcML.png

Fig. 16 THREDDS web service for generating NcML from HDF files

Clicking folders will lead you to a specific HDF product catalog that provides various access methods (Fig. 17).

_images/thredds_NcML_catalog.png

Fig. 17 THREDDS catalog page displaying various access methods including NcML

Simply click the link next to NCML to get the NcML representation of a NASA HDF product (Fig. 18).

_images/thredds_NcML_npp.png

Fig. 18 NcML output from the THREDDS NCML service

Once the obtained NcML representation of an existing HDF or netCDF file is saved to a file, it can be imported by following the steps below:

  1. Select a project node from the tree view (Fig. 6).
  2. Press Design > Import from > NcML from menu (Fig. 7).
  3. A file selection dialog will appear. Select an NcML file (Fig. 8).
  4. Press the Open button. HPD Desktop will start loading the design from the file and save the design on the HPD Server. It will update the tree view dynamically (Fig. 9), scrolling down as the tree grows. When the tree view stops changing, the import is complete. You may want to scroll up to check the imported design.
  5. If a parsing error occurs during the import, the design will be saved up to the point where the error occurred. The error message(s) will be displayed in the wxPython stderr/stdout message window.

NetCDF CDL

The network Common Data form Language (CDL) is a text representation of a netCDF file that is used as input and output by the netCDF utilities.

To import a CDL file, please follow the steps below:

  1. Select a project from the tree view (Fig. 6).
  2. Choose Design > Import from > CDL from the menu (Fig. 7).
  3. From a file selection dialog pick a CDL file (Fig. 8).
  4. HPD Desktop will start loading the design from the file. It will update the tree view dynamically (Fig. 9) by scrolling down as the tree grows. Importing is finished when the scrolling stops.
  5. If a parsing error occurs during the import, the design will be saved up to the point where the parsing error occurred. The error message(s) will be displayed in the wxPython stderr/stdout message window.

OPeNDAP DMR

The OPeNDAP Dataset Metadata Response (DMR) is an XML representation of the metadata from a remote data source hosted by an OPeNDAP server. DMR is based on the DAP 4.0 protocol. DMR is replacing the DAP 2.0-based DDX and keeps evolving. DMR support may break at any time if its specification changes in between this application’s releases.

To import an OPeNDAP DMR, please follow the steps below:

  1. Select a project from the tree view (Fig. 6).
  2. Choose Design > Import from > OPeNDAP DMR from the menu (Fig. 7).
  3. A text dialog box will appear. Enter an OPeNDAP server DMR URL (Fig. 19).
_images/dialog_design_import_opendap.png
  4. HPD Desktop will start loading the design from the server. It will update the tree view dynamically (Fig. 9) by scrolling down as the tree grows. Importing is finished when the scrolling stops.
  5. If a network or parsing error occurs during the import, the design will be saved up to the point where the error occurred. The error message(s) will be displayed in the wxPython stderr/stdout message window.

Export

A design can be exported in several formats: HDF5 template file, HDF5/JSON, and as Python, MATLAB, IDL, or FORTRAN source code. The generated source code can produce the template file and, if modified to write actual data, final HDF5 products.

Note

Due to varying levels of support for HDF5 features in different programming languages, not all source code exports may be available for some designs.

HDF5 Template File

Template files represent a design as an HDF5 file. They do not contain any real data but their structure, the names of groups/datasets/attributes, and metadata (attribute values) accurately represent the design. Therefore these files are ideal for quickly verifying the design by testing its template file with various display or visualization software, for example HDFView.

To generate an HDF5 template file from a design, please follow the steps below:

  1. Select the design item in the tree first (Fig. 20).
_images/tree_select_design.png

Fig. 20 HPD Desktop tree control view with a selected design

  2. Press Design > Export as > HDF5 from the menu (Fig. 21).
_images/menu_design_export_hdf5.png

Fig. 21 HPD Desktop menu selection for exporting design as HDF5 template file

  3. You will be asked to provide a name for the HDF5 file. By default, it uses the design name with the extension .h5.
_images/dialog_design_export_hdf5.png

Fig. 22 HPD Desktop dialog box for saving design as HDF5 template file

  4. Press the Save button. HPD Desktop will send a request to the HPD Server to create a new HDF5 file from the design and will start downloading the file as soon as it is ready.
  5. If exporting fails, a pop-up window will display the error message (Fig. 23).
_images/dialog_download_failed.png

Fig. 23 HPD Desktop dialog message box for download failure

  6. A pop-up window will confirm successful export and show the full file path to the template file.

HDF5/JSON

HPD Desktop can export a design as HDF5/JSON.

  1. Select a design item in the tree (Fig. 20).
  2. Press Design > Export as > JSON from menu (Fig. 21).
  3. If exporting fails, the pop-up window will display the error message (Fig. 23).
  4. A pop-up window will confirm successful export and show the full file path to the exported file.

HDF5/JSON files can be edited with any text editor. This makes it possible to perform certain editing operations that are not yet supported in HPD Desktop, such as bulk search and replace. The modified file can then be imported back or turned into an HDF5 file with jsontoh5.py.
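A bulk edit of the kind mentioned above can also be scripted with the standard library. The snippet below operates on a hypothetical, heavily abbreviated HDF5/JSON fragment; the key layout is illustrative, not the exact hdf5-json schema.

```python
import json

# Abbreviated, hypothetical HDF5/JSON fragment with one attribute to fix.
doc_text = '{"groups": {"g-1": {"attributes": [{"name": "units", "value": "degK"}]}}}'
doc = json.loads(doc_text)

# Bulk edit: normalize every "degK" units value to "K".
for group in doc["groups"].values():
    for attr in group.get("attributes", []):
        if attr["name"] == "units" and attr["value"] == "degK":
            attr["value"] = "K"

edited = json.dumps(doc)                 # ready to save and re-import
print('"value": "K"' in edited)          # -> True
```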

Python Source Code

HPD Desktop can export a design as Python source code. The code requires the h5py package.

  1. Select a design item in the tree (Fig. 20).
  2. Press Design > Export as > Python from menu (Fig. 21).
  3. If exporting fails, the pop-up window will display the error message (Fig. 23).
  4. A pop-up window will confirm successful export and show the full file path to the exported file.

The Python code was tested with Python 2.7 and h5py versions 2.4 and 2.5.

IDL Source Code

HPD Server can generate IDL code using IDL’s HDF5 routines.

  1. Select a design item in the tree (Fig. 20).
  2. Press Design > Export as > IDL from menu (Fig. 21).
  3. If exporting fails, the pop-up window will display the error message (Fig. 23).
  4. A pop-up window will confirm successful export and show the full file path to the exported file.

Note

IDL source code export will fail if:

  • 8-bit signed integer (int8) or variable-length string (vlstring) datatypes are used.

MATLAB Source Code

HPD Server can generate MATLAB code using MATLAB’s HDF5 routines.

  1. Select a design item in the tree first (Fig. 20).
  2. Press Design > Export as > MATLAB from menu (Fig. 21).
  3. If exporting fails, the pop-up window will display the error message (Fig. 23).
  4. A pop-up window will confirm successful export and show the full file path to the exported file.

Note

MATLAB source code export will fail if:

  • Any dataset in a design has a total number of elements greater than 2^48 - 1.

FORTRAN Source Code

Warning

The functionality and HDF5 feature support of this source code generator currently significantly lags behind the others, so it is not recommended for any serious use.

HPD Server can generate FORTRAN 90 code using HDF5 FORTRAN APIs.

  1. Select a design item in the tree first (Fig. 20).
  2. Press Design > Export as > FORTRAN from menu (Fig. 21).
  3. If exporting fails, the pop-up window will display the error message (Fig. 23).
  4. A pop-up window will confirm successful export and show the full file path to the exported file.

To compile the exported FORTRAN code, the HDF5 library is required and must be built with the --enable-fortran and --enable-fortran2003 configure flags.

Group

You can add a new group using the menu items under Group, and edit or delete an existing group the same way. You can also edit a group directly by clicking it in the tree view. As soon as you change a group, the change is saved on the HPD Server.

Dimension

A dimension is attached by dragging & dropping it onto a dataset. The DIMENSION_LIST and REFERENCE_LIST attributes of the dataset and its dimension scale will be updated accordingly.

To detach a dimension, drag & drop the dataset onto that dimension. Deleting a dimension will automatically detach it from the dataset.

Dataset

You can add a new dataset using the menu items under Dataset, and edit or delete an existing dataset using the same menu items. As soon as you change a dataset, it is saved on the HPD Server.

The shape of a dataset must be a set of integers separated by commas. For example, to create a dataset with a 2x4 shape like dset[2][4], type 2,4 in the shape text box of the New Dataset dialog box. If you leave the shape empty, a scalar dataset will be created. To specify an unlimited dimension size, use *. For example, to create a dataset with an unlimited x 4 shape, type *,4.
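The shape syntax above can be modeled as follows. parse_shape is a hypothetical helper, not part of HPD Desktop; None marks an unlimited dimension and an empty tuple stands for a scalar dataset.

```python
# Hypothetical sketch of how the shape text maps to a dataset shape.
def parse_shape(text):
    text = text.strip()
    if not text:
        return ()                      # empty shape -> scalar dataset
    return tuple(None if part.strip() == "*" else int(part)
                 for part in text.split(","))

print(parse_shape("2,4"))   # -> (2, 4)
print(parse_shape("*,4"))   # -> (None, 4)
print(parse_shape(""))      # -> ()
```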

To create a compound dataset, first create a dataset with a compound type. Then select that dataset and create another dataset under it. The child dataset becomes a field of the compound dataset.

When you create a dataset, you can specify compression filters such as gzip, the storage layout (chunk shape), and a fill value. By default, HPD uses gzip compression with level 6 and chunked storage. If you leave the chunk shape blank, it will be the same as the dataset shape.

In the tree view, the storage layout is indicated with a ‘/’ after the dataset’s shape. If no ‘/’ appears, contiguous storage is used. If an array-style shape (e.g., [2][4]) follows the ‘/’, chunked storage is used. A single ‘/’ with nothing after it means compact storage is used. The following table summarizes the storage layout notation in the tree view.

Example                  Meaning
int8 d[10][20]           Contiguous storage layout
int8 d[10][20]/[2][4]    Chunked storage layout with chunk shape 2x4
int8 d[10][20]/          Compact storage layout

Filters and the fill value are indicated within parentheses, separated by commas. For example, int8 dset[10](fv=-1,gz=6) means: gzip filter with compression level 6, with -1 as the fill value. The following table explains the two-letter notation for filters and the fill value in the dataset creation properties.

Notation   Meaning      Value Required?
fv         fill value   Yes
gz         gzip         Yes - [1,9]
sh         shuffle      No
fl         fletcher32   No
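
The two-letter notation can be decoded mechanically. The sketch below mirrors the table; parse_props is a hypothetical helper, not part of HPD Desktop.

```python
# Mapping from the table: two-letter code -> property name.
MEANING = {"fv": "fill value", "gz": "gzip",
           "sh": "shuffle", "fl": "fletcher32"}

def parse_props(text):
    """Decode a creation-property string such as '(fv=-1,gz=6)'."""
    props = {}
    for item in text.strip("()").split(","):
        code, _, value = item.partition("=")
        # Codes without a value (sh, fl) are simple on/off flags.
        props[MEANING[code.strip()]] = value.strip() or True
    return props

print(parse_props("(fv=-1,gz=6)"))  # -> {'fill value': '-1', 'gzip': '6'}
```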

Since interoperability is most important, the fill value in the dataset creation properties will be overwritten by the _FillValue attribute’s value if the CF or NUG conventions are active and a _FillValue attribute is defined for a dataset. The same applies when importing designs from NcML and HDF4 map files if either the CF or NUG convention is in effect.

Attribute

All operations on HDF5 attributes are available from the Attribute menu. Please note that creating, deleting, or modifying any attribute is immediately recorded.

If you want to specify multiple values for a single attribute, surround the values with square brackets and separate them with commas (Fig. 24).

_images/attr_edit_multipe_values.png

Fig. 24 How to add multiple values for a single attribute

Convention Support

  • When you create a new design, you can pick a set of conventions that you want to use. However, you cannot choose both CF and NUG at the same time.
  • If you select a convention, a toolbar will appear automatically.
  • Using conventions will activate the CLIPS expert system engine, except for the HDF-EOS convention.
  • To apply a convention, select the root group or an individual dataset and press a button in the toolbar.
  • For the HDF-EOS convention, you need to select the root group before pressing any of the Point/ZA/Profile/Swath/Grid buttons.
  • Choosing the custom convention allows you to load your own CLIPS code. You may want to download and edit a sample CLIPS file that modifies the default values of the ACDD convention to test the custom convention.

CLIPS

Unless you’re comfortable with CLIPS programming, we don’t recommend modifying the existing CLIPS rules. If you do customize CLIPS rules, please try them first on the generic CLIPS engine, not the PyCLIPS engine used in HPD Desktop, because a subtle mistake in CLIPS programming is hard to debug with HPD Desktop. HPD Desktop doesn’t provide enough debugging messages on the system console. Although PyCLIPS generates my_trace.log, it may not be useful for debugging either. The way PyCLIPS works in a packaged environment is still a mystery, and we are still investigating its limitations by trial and error.

The current user interface only supports adding attributes to groups and datasets as suggested by CLIPS rules. However, the current rule set is more sophisticated than what the HPD Desktop GUI utilizes. For example, CLIPS rules can handle dimensions, but HPD Desktop doesn’t utilize them yet. We will address this in future versions.

Therefore, the only customization HPD Desktop has been tested with is changing global name values using defglobal when the ACDD/NUG/CF conventions are used. We don’t know what will happen to HPD Desktop if you write a custom defrule, so it’s entirely up to your hacking ability. If you have trouble using PyCLIPS in HPD, please figure it out from the source code, or support us by providing more funding if a serious effort needs to be made for custom conventions.

Tools

HDF Product Designer provides tools for validating designs and generating documentation.

Validation

HDF Product Designer provides several tools for testing a design’s interoperability. Currently these tools rely on the web services provided by the THREDDS and Hyrax data servers. We collectively refer to these interoperability tools and services as HPD Online to distinguish them from those of the HPD Server.

HPD Online services can be invoked from the Tools menu at any time while working on a design (Fig. 25).

_images/tree_select_tools.png

Fig. 25 HPD Desktop Tools menu for HPD Online tools

The selected design will be exported as an HDF5 template file and this file will be made available to the HPD Online tools. An error message will be displayed if this process fails for any reason (Fig. 26).

_images/tools_publish_error.png

Fig. 26 Error message for publishing HDF5 file on HPD Online

Finally, the chosen HPD Online tool will run against the design’s HDF5 file and report its result back via a web browser window opened by the HPD Desktop.

This workflow can be repeated as many times as needed to either correct any reported error or maximize the design’s interoperability as defined by the tool.

To request the addition of new tools to HPD Online, please contact eoshelp@hdfgroup.org.

CDL

The network Common Data form Language (CDL) is a text representation of a netCDF file that is used as input and output by the netCDF utilities, which are based on the netCDF C library. These netCDF utilities are very useful for interoperability testing, so HPD Online supports obtaining the CDL output of a design.

The CDL output from HPD Online is directly generated by the ncdump tool. Correct display of the design’s content assures it will be interoperable with any netCDF-C based tools.

  • To validate a design with the ncdump -h command, use the Tools > Validate > CDL menu.
  • If there are no errors, the CDL output will be displayed in a web browser window (Fig. 27).
_images/tools_validate_cdl.png

Fig. 27 CDL output from HPD Online

Get as netCDF-3

netCDF-3 is a widely supported file format, especially by legacy tools. NASA data centers allow users to download HDF files in the netCDF-3 format on demand.

HPD Online connects to the Hyrax service, which enables downloading a design as a netCDF-3 file. This helps users ensure that files generated from the design will be interoperable with software that relies on the netCDF-3 format.

  • To validate a design with netCDF-3 tools, use the Tools > Validate > Get as netCDF-3 menu.
  • If there are no errors, HPD Desktop will prompt for a file name and save the netCDF-3 content (Fig. 28).
_images/tools_validate_nc3.png

Fig. 28 HPD Desktop dialog box for saving design as netCDF-3 file

CF

The Climate and Forecast (CF) convention is one of the NASA ESDIS Standards Office (ESO) approved metadata standards. The convention is quite important because many data access tools use it to geolocate, plot, or subset data correctly. A subtle mistake in following the CF convention can make CF tools fail to load the data.

HPD Online can check the CF convention compliance by using the ncdismember service powered by the CFChecker via the THREDDS server. The ncdismember tool will split the contents of a file along its HDF5 groups into multiple files. The CFChecker will then check each dismembered file and report on its CF compliance.

It can also check the CF convention compliance by using the NASA JPL Metadata Compliance Checker. JPL’s checker service uses CF version 1.6 while HPD’s ncdismember service uses CF version 1.5.

  • To validate a design with CFChecker, use the Tools > Validate > CF menu.
  • If there are no errors, CF compliance reports will be displayed in a web browser as two tabs (Fig. 29).
_images/tools_validate_cf.png

Fig. 29 CF compliance report from HPD Online. Note how the test.h5 file is split into the abc.nc and root.nc files in the report.

ACDD

The Attribute Convention for Data Discovery (ACDD) is a set of global (root group) attributes that provide metadata for improving the data’s discoverability. It works very well with other Earth Science conventions because there is no overlap in metadata content between them. Metadata cataloging tools use ACDD attributes to extract metadata from files to support search or translation into other metadata formats such as DIF, FGDC, or ISO 19115. We highly recommend using the ACDD attributes in your designs.

HPD Online utilizes the THREDDS UDDC service, which evaluates how well the metadata contained in a design conforms to the ACDD.

HPD Online can also check ACDD compliance using the NASA JPL Metadata Compliance Checker. JPL's checker service uses ACDD version 1.3.

  • To validate a design with the ACDD checker use the Tools > Validate > ACDD menu.
  • If no errors, compliance reports will be displayed in a web browser as two tabs (Fig. 30).
_images/tools_validate_acdd.png

Fig. 30 ACDD report from HPD Online

ISO

The International Organization for Standardization (ISO) has published the Geographic Information Metadata standard, ISO 19115. NASA conventions and best practices for ISO 19115 are currently under development. HPD Online simply utilizes the THREDDS ISO service to provide an ISO 19115 metadata description of a design.

  • To obtain the ISO metadata record of a design use the Tools > Validate > ISO menu.
  • If no errors, the design’s ISO metadata will appear in a web browser window (Fig. 31).
_images/tools_validate_iso.png

Fig. 31 ISO metadata output from HPD Online

OPeNDAP

OPeNDAP stands for “Open-source Project for a Network Data Access Protocol”. OPeNDAP is both the name of a non-profit organization and the commonly used name of the protocol that the organization has developed. Hyrax is the official OPeNDAP server, developed by the OPeNDAP organization. Hyrax can serve both HDF4 and HDF5 data using the HDF4 OPeNDAP handler and the HDF5 OPeNDAP handler.

The HDF5 OPeNDAP handler is software, designed and implemented by The HDF Group, that provides access to HDF5 data via OPeNDAP’s Data Access Protocol. The handler supports OPeNDAP visualization client tools that access NASA HDF-EOS5 (OMI, HIRDLS, MLS, TES, and MOPITT) and some HDF5 (GPM, OBPG, and some MEaSUREs) products.

The HDF5 handler provides two options: CF and generic. The CF option tries to make HDF5 products follow the CF convention, while the generic option serves HDF5 data as-is, which can break many CF tools. HPD Online uses the HDF5 handler with the CF option enabled to test the interoperability of a design.
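Because CF has no notion of groups, a CF-enabled handler must flatten group hierarchy into variable names. The underscore scheme below mirrors the “abc” example shown in the HPD report (Fig. 32), but the exact substitution rules are an assumption about the handler, not its documented behavior:

```python
# Sketch of flattening an HDF5 group path into a flat, CF-style
# variable name. The simple slash-to-underscore rule is assumed from
# the "/abc" -> "_abc_..." example in the HPD Online report.

def flatten(path):
    """Map an HDF5 path like /abc/temperature to a flat variable name."""
    return path.replace("/", "_")

print(flatten("/abc/temperature"))  # _abc_temperature
```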

  • To validate a design with OPeNDAP server (Hyrax) use the Tools > Validate > OPeNDAP menu.
  • If no errors, an OPeNDAP HTML form will be displayed in a web browser window (Fig. 32).
_images/tools_validate_opendap.png

Fig. 32 OPeNDAP server dataset access form from HPD Online. Note that the group hierarchy abc is flattened into the variable name prefix _abc_ by the CF-enabled HDF5 handler because the CF convention does not have the notion of group hierarchy.

THREDDS

The Unidata THREDDS Data Server (TDS) is a web server that provides metadata and data access for scientific files using OPeNDAP, the OGC Web Mapping Service (WMS) and Web Coverage Service (WCS), HTTP, and other remote data access protocols. The TDS is implemented as a Java servlet on top of the netCDF-Java library, so it is a good testbed for all netCDF-Java applications such as Panoply and IDV. If THREDDS can access and interpret a design’s HDF5 file correctly, other netCDF-Java tools can, too.

  • To validate a design with THREDDS use the Tools > Validate > THREDDS menu.
  • If no errors, a THREDDS dataset access page will be displayed in a web browser window (Fig. 33).
_images/tools_validate_thredds.png

Fig. 33 THREDDS dataset access from HPD Online

Documentation

Preparing product documentation is often a repetitive task, especially for designs in which a large number of HDF5 groups, datasets, and attributes have to be described. To reduce the mundane workload and speed up the process, design documentation can be downloaded as a Microsoft Word (docx) file to serve as a starting point for further editing. The formatting of such documents is deliberately minimal to allow easier adjustment to the final style. The documentation content consists of:

  • The design’s name and version as the document title.
  • A sentence below the title with the timestamp of the document’s creation.
  • Group names as section titles.
  • Each group’s attribute information (name, datatype, shape, value) in a table with single-line borders.
  • Dataset names as titles of subsections under their parent groups.
  • Dataset information (parent group, datatype, shape, and max. shape) in a table with single-line borders.
  • Each dataset’s attribute information (name, datatype, shape, value) in a table with single-line borders.
  • HDF5 dimension scales are distinguished from ordinary HDF5 datasets by having Dimension Scale in their subsection title.
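The outline rules above (groups become sections, datasets become subsections under their parent group) can be sketched with a small traversal. The nested dict and plain-text output below are stand-ins for the real HPD design model and the generated Word file:

```python
# Sketch of the documentation outline: groups map to section titles and
# datasets to subsections under their parent group. A dict value models
# a subgroup; a string models a dataset's datatype/shape summary.

def outline(group, name="/", depth=0):
    lines = [("  " * depth) + f"Section: group {name}"]
    for key, value in group.items():
        if isinstance(value, dict):                  # a subgroup
            lines += outline(value, key, depth + 1)
        else:                                        # a dataset
            lines.append(("  " * (depth + 1)) + f"Subsection: dataset {key}")
    return lines

design = {"abc": {"temperature": "f4 (180, 360)"}, "time": "f8 (24,)"}
for line in outline(design):
    print(line)
```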

Note

This is an experimental feature; the formatting and content of the generated Microsoft Word files may change. We welcome user comments and suggestions on how to improve the utility of the generated Word document.

To obtain design documentation:

  • Select the Tools > Document > MS Word menu option.
  • If no errors, a dialog window to choose a file name for the MS Word document will appear (Fig. 34).
_images/tools_document_msword.png

Fig. 34 Saving a design’s product documentation as a Microsoft Word file

Shortcut Keys

The following shortcut keys are supported.

Key                            Function
Ctrl+Shift+O                   Open Project
Ctrl+Shift+W                   Close Project
Ctrl+Q                         Quit
Ctrl+O                         Open Design
Ctrl+W                         Close Design
Ctrl+C                         Copy
Ctrl+X                         Cut
Ctrl+V                         Paste
DEL                            Delete
Ctrl+F                         Find
Ctrl+H (Ctrl+Shift+H on Mac)   Replace