OASIS3-MCT developments Nov 2017

Each response lists: #, name, laboratory, country, developments you would like to see in OASIS3-MCT (priorities 1 to 3), and comments.
1. Marie-Alice Foujols, IPSL, France. Efficient and simple statistics logging of each exchanged field. A simple way to organize 1D fields so they can be used as 2D fields in visualization tools (Ferret, ...). At least examples based on toy models. If appropriate, i.e. after a complete study, organize a convergence with XIOS: input files? restart files? XML files? ... Thank you for the survey. Thank you for OASIS, for the services and for your help.
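The statistics-logging request above might produce something like the sketch below. The helper name `log_field_stats` and the field name `O_SSTSST` are hypothetical, purely to illustrate the kind of per-exchange summary line meant; this is not an OASIS3-MCT API.

```python
def log_field_stats(name, step, field):
    """Illustrative per-exchange statistics line for a coupling field
    (hypothetical helper, not part of OASIS3-MCT)."""
    flat = [v for row in field for v in row]
    mean = sum(flat) / len(flat)
    return "%s step=%d min=%g max=%g mean=%g" % (
        name, step, min(flat), max(flat), mean)

# A toy 2x2 sea-surface-temperature field (values in K)
sst = [[271.0, 285.0], [285.0, 285.0]]
print(log_field_stats("O_SSTSST", 1, sst))
# → O_SSTSST step=1 min=271 max=285 mean=281.5
```

One such line per field and per coupling step, written by the library itself, would give the "efficient and simple" logging asked for without any changes to the component models.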
2. Christian, Danish Meteorological Institute, Denmark. Dear OASIS team, since I'm new to OASIS I may propose something that already exists -- my apologies in that case. For the regridding capability I would like to use a simple command-line tool to define the grid/regrid information. Would it be possible to use the following "climate data operators" (CDO) commands, where `Source.nc` and `Target.nc` are two NetCDF files holding the information of the source and target grids:

```
cdo griddes Target.nc > Target.grid.description.txt
cdo genbil,Target.grid.description.txt Source.nc \
    Weights_Source2Target.nc
```

If this already works, I would like to suggest adding it to the OASIS manual. Otherwise I would like to have a similar tool. Thanks in advance. All the best, Christian
3. Carsten Lemmen, Helmholtz-Zentrum Geesthacht, Germany. Interoperability with ESMF. Steps toward this:
- OASIS accepts an externally provided MPI communicator.
- OASIS accepts namcouple-like information from communicated metadata instead of from a file.
4. Clive P. Jones, Fujitsu Labs of Europe, United Kingdom. This is more of a bug. We have an atmospheric grid that is decomposed so that each process contains the data for the same latitude bands north and south. Since the data on each process are not contiguous, we are using orange partitioning. However, we find that if the atmospheric model uses more processes than the ocean model, OASIS hangs. It is not a problem with orange per se, as we can decompose in the more conventional contiguous fashion, i.e. just allocate rows from north to south across the processes, and that works fine. So it appears to be a problem when the data on a process are not contiguous. We are only using a 1D decomposition in the latitude direction.
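For reference, the non-contiguous decomposition described above maps onto an orange partition as a list of (global offset, extent) segments. The sketch below (plain Python, hypothetical helper name) builds such a description for a process owning mirrored north and south latitude bands, assuming the ORANGE `ig_paral` layout documented for OASIS3-MCT: 3, number of segments, then one global offset and extent per segment.

```python
def orange_partition(nlon, nlat, nprocs, rank):
    """Sketch of an OASIS3-MCT 'orange' partition description for the
    decomposition described above: process `rank` owns one latitude band
    in the north and the mirrored band in the south, i.e. two
    non-contiguous segments of the global (nlon x nlat) field.
    Returns ig_paral = [3, nsegs, off1, len1, off2, len2], with 0-based
    global offsets into the row-major field."""
    rows_per_band = (nlat // 2) // nprocs            # assume it divides evenly
    north_first = rank * rows_per_band               # first northern row owned
    south_first = nlat - (rank + 1) * rows_per_band  # mirrored southern row
    seglen = rows_per_band * nlon
    return [3, 2,
            north_first * nlon, seglen,
            south_first * nlon, seglen]

# 8x8 grid, 2 processes: rank 0 owns rows 0-1 (north) and 6-7 (south)
print(orange_partition(8, 8, 2, 0))
# → [3, 2, 0, 16, 48, 16]
```

Each process thus declares two segments; the reported hang occurs with exactly this kind of multi-segment declaration when the process counts of the two components differ.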
5. Sebastien Masson, LOCEAN, France. We have already discussed this several times... Using OASIS in simulations including zooms requires that the control of the time evolution (the check that we are not going backward in time) be done separately for each variable. In the presence of zooms, the same model may have variables that are ahead in time compared with other variables. Do we really need IS-ENES3 to do this?
6. Eric Maisonnave, CERFACS, France. Introduce light parallelism (OpenMP) in the computation of interpolation weights and addresses. Useful to reduce computing time by a factor of 20-30, which makes ORCA12 interpolation calculations affordable, for example. Ensure backward compatibility of any new community coupling library development with the existing OASIS coupling interfaces. OASIS users must be sure that any new development in their model will remain usable during the next 10 years. In that perspective, users must have the guarantee that they won't have to make a substantial effort to adapt their interfaces, or they will freeze their interface developments. Introduce vertical interpolations ... even though this is probably out of scope in the short term.
7. Klaus Wyser, Rossby Centre / SMHI, Sweden. Sending/receiving a simple scalar, e.g. for a signal that tells whether a field has been read by the target. Of course we could define a 1x1 array for this purpose, but why not make something more user-friendly? A remapping/interpolation method for categorical fields (e.g. vegetation types); there is, to my knowledge, nothing similar to a nearest-neighbour interpolation in OASIS. Coupling 3D fields with horizontal and vertical interpolation; we have workarounds today, but at some point in the future we may want to couple full 3D fields.
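The point about categorical fields is that any weighted averaging blends class indices into meaningless intermediate values; a nearest-neighbour rule avoids this by copying the value of the closest source point. A minimal sketch of the idea (hypothetical helper, brute-force search, not OASIS code):

```python
def nearest_neighbour_remap(src_pts, src_vals, tgt_pts):
    """Illustrative nearest-neighbour remapping for a categorical field:
    each target point takes the value of the closest source point, so
    categories (e.g. vegetation types) are never averaged together."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    out = []
    for t in tgt_pts:
        i = min(range(len(src_pts)), key=lambda k: dist2(src_pts[k], t))
        out.append(src_vals[i])
    return out

# Source: two vegetation classes at two points
src = [(0.0, 0.0), (1.0, 0.0)]
classes = [3, 7]               # e.g. 3 = forest, 7 = grassland
print(nearest_neighbour_remap(src, classes, [(0.1, 0.0), (0.9, 0.0)]))
# → [3, 7]
```

A production version would of course use great-circle distances and a search tree rather than a brute-force loop, but the remapped values stay within the original set of classes either way.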
8. Uwe Fladrich, SMHI, Sweden. Support ESM configuration management w.r.t. coupling:
+ support coupling configurations (other than on a per-field basis)
+ interact with workflow management tools
+ modernise and extend the namcouple file syntax
Modernise the OASIS code:
+ get rid of F77
+ get rid of compiler warnings
Take advantage of being the "ESM hub":
+ online performance analysis (continue the LUCIA development)
+ develop features for online diagnostics
+ support data "readers" and "writers"
Keep up the good work!
9. Richard Hill, Met Office, UK. Formal support for the exchange of one-dimensional (real) array values through OASIS3-MCT, without any form of regridding; e.g. to pass an arbitrarily dimensioned 1D array from one component to another, with no modification of the values in that array, effectively using OASIS3-MCT as an MPI send-receive interface.
Improve the handling of 3D fields in namcouple files so that we no longer need to explicitly list all field names in full on a single line in the form:
outfield001:outfield002:outfield003:....outfieldnnn infield001:infield002:infield003:....infieldnnn
Ideally we would have some form of stub field name with a new additional item to indicate the number of vertical levels (or some other third "pseudo-level" dimension, e.g. ice category). So in the namcouple you might have:
outfield infield 85
to indicate that there are 85 separate "levels" of data, whose names, one might imagine, would be automatically generated by OASIS3-MCT as outfield001, outfield002, ..., outfield085 and infield001, infield002, ..., infield085.
Additional information in namcouple files for each transient, to allow the namcouple (or a namelist or other file derived from it) to act as THE single source of information for all coupling fields. This would avoid the need to hard-wire field names and other critical information (e.g. whether 2nd-order terms are to be used) into component source code, or to supply such information separately via the component's own input (typically namelist) mechanism, or to make assumptions about grid types, dimensions, presence or absence of halos, etc. The things each component would need in addition to the existing details are: the source component; the target component(s); the grid types for each field (e.g. T, U, V); a unique identifying index which components can interrogate in order to link each transient field with its source or target field within the component; and a flag indicating whether first- or second-order terms are to be used.
For some time we have achieved the above by annotating namcouple files with our own specially formatted comments. These are pre-processed at run time to generate a namelist which the UM can read in order to set up all necessary coupling fields, without having to hard-code anything in the source or otherwise provide details separately through the main UM input mechanism. This avoids problems when details don't tally with what's in the namcouple. It has worked well, but is currently restricted to the UM, since NEMO (and other models) are still hard-coded with assumptions about names, grid types, and field IDs, and fields are activated via separate namelist inputs which have no cross-reference to the namcouple content and are therefore prone to mismatch errors (e.g. if a field is switched on in NEMO but not present in the namcouple, or vice versa).
Other requirements relate to being able to adapt weights files "on the fly" for schemes such as wetting/drying and ice sheet model coupling; however, we don't have any viable examples to demonstrate or test this sort of thing at present, so there seems little point in requesting it until we can be more precise about the requirements.
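The stub-name expansion proposed above is mechanically simple; a sketch of the generation rule (hypothetical helper name, assuming the three-digit zero-padded suffix shown in the request):

```python
def expand_stub(stub, nlevels):
    """Sketch of the proposed stub-name expansion: from a stub field
    name and a level count in the namcouple, generate the per-level
    field names that currently have to be listed in full on one line."""
    return ["%s%03d" % (stub, k) for k in range(1, nlevels + 1)]

names = expand_stub("outfield", 85)
print(names[0], names[-1], len(names))
# → outfield001 outfield085 85
```

The namcouple line `outfield infield 85` would then stand in for 85 explicit name pairs, generated internally by this rule.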
10. Redler, MPI for Meteorology, Germany. Improved scaling of the data exchange. MPI-M has been using OASIS3-MCT in the MPIOM-ECHAM6 coupled setup for some years now, with no further active development of the model setup.
11. Florence Sevault, Météo-France, France. In debug mode, complete and explicit checking of the namcouple (for example when a configuration line is incomplete, an argument is missing, or there are more arguments than needed). A new pre-processing step allowing OASIS to be launched without restart files for the fields, with the possibility to choose the constant value given to the whole field before interpolation.
12. Mien-Tze Kueh, Academia Sinica, Taiwan. More options for the pre/post-processing transformations. For example, more options extending the functionality of BLASOLD or BLASNEW so as to allow filtering or scale decomposition before/after the remapping. This may help to isolate (or remove) some forcing modes (or noise), thereby providing hints for further adjustment or even modification of the model physics routines in a coupling system. I understand this could be done by modifying the relevant model component, yet it would be nice if we could do it via OASIS. Or, at least, extend BLASOLD/BLASNEW to be able to read a prescribed input file that allows the exchanged field to be scaled. Multiple sources for a single target field; considerations: ensemble runs for a model (e.g., the atmospheric component), downscaling/nesting, combination of model results and prescribed datasets. More options for the time transformations (LOCTRANS)?
13. Olivier Marti, France. First priority is to keep OASIS running on all our computing platforms with good performance (so far so good, thanks!).
* A field sent by a model could be sent several times to another model, after different processing:
- Wind stress sent by the atmosphere is interpolated onto both the U and V grids of the ocean.
- Calving: at IPSL, three different processings of the same field, to handle some specific interpolation around Antarctica.
(Up to now, the namcouple is simple and efficient. However, this request could imply more complexity, and a switch to an XML configuration file could be relevant.)
* We need to progress on diagnostics:
- Handle fractional grid boxes (with only a fraction of ocean, for instance).
- Output time means of fields rather than values at each coupling time step.
14. Mario Acosta, BSC, Spain. Enhance the functionality of LUCIA, the internal OASIS load-balancing tool, introducing metrics for per-coupling-time-step analysis. These metrics could be used to achieve load balance in complex coupled models and increase computational performance. New functionalities for OASIS: 1) If it is not already available in OASIS3-MCT, I would suggest enabling prescribed modifications to all fluxes when they are being exchanged between NEMO and IFS, in both directions. For example, when IFS calculates the zonal surface wind stress tau_x(x,y,t), OASIS3-MCT should have the option to pass to NEMO: tau_x'(x,y,t) = alpha(t) * tau_x(x,y,t) + beta(x,y,t), where alpha(t) and beta(x,y,t) are externally prescribed. When alpha = 0 this is a simple data override. 2) I would also suggest considering the introduction of stochastic elements during coupling (e.g. additive and multiplicative noise terms with various spectral characteristics). Save the OASIS restarts in the middle of a leg/chunk; this should be settable via a user parameter.
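The proposed flux modification is a pointwise affine transform applied during the exchange. A minimal sketch (hypothetical helper name, scalar alpha and 2D beta as in the request):

```python
def modify_flux(tau_x, alpha, beta):
    """Sketch of the proposed exchange-time flux modification:
    tau_x'(x,y,t) = alpha(t) * tau_x(x,y,t) + beta(x,y,t).
    With alpha = 0 the result is beta alone (simple data override)."""
    return [[alpha * v + b for v, b in zip(row, brow)]
            for row, brow in zip(tau_x, beta)]

tau = [[1.0, 2.0], [3.0, 4.0]]
beta = [[0.5, 0.5], [0.5, 0.5]]
print(modify_flux(tau, 1.0, beta))   # scaled field plus additive perturbation
print(modify_flux(tau, 0.0, beta))   # alpha = 0: pure data override by beta
```

In spirit this generalizes the existing BLASOLD/BLASNEW scalar operations to spatially varying, externally prescribed coefficients; the stochastic variant in point 2) would draw beta (and/or alpha) from a noise model instead of a file.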
15. Laure Coquart, CERFACS-CNRS, France. OpenMP in the nearest-neighbour routine.
© Copyright ENES Portal 2011