
add blending and associated pre-processing tools. #38

Merged (14 commits) on Oct 26, 2023

Conversation

@delippi (Contributor) commented Jun 28, 2023

DESCRIPTION OF CHANGES:

The Raymond filter source code was added (raymond.f), based on code used in https://journals.ametsoc.org/view/journals/wefo/37/1/WAF-D-21-0069.1.xml (see section 3d, Blended ICs).

Before we are able to do any blending, the coldstart variables from chgres_cube need to be preprocessed, which includes rotating the winds and vertically remapping all the blended variables. Therefore, three additional pre-blending tools are also included: chgres_winds, remap_dwinds, and remap_scalar. These stand-alone programs are heavily based on https://github.com/NOAA-GFDL/GFDL_atmos_cubed_sphere/blob/bdeee64e860c5091da2d169b1f4307ad466eca2c/tools/external_ic.F90#L433. These three tools rotate the chgres_cube/coldstart winds to the model D-grid, vertically remap the winds, and vertically remap the non-wind variables, respectively. Each of these tools is compiled using f2py (the Fortran-to-Python interface generator) so that the *.so files can be imported as Python modules.
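As a rough illustration of the build step described above, the f2py compilation of one of these tools can be expressed as a CMake custom target. This is a sketch only, not the PR's actual build files: the target name, the module filename suffix, and the use of `CMAKE_SHARED_MODULE_SUFFIX` are all illustrative (f2py's real output name is versioned, e.g. `chgres_winds.cpython-38-x86_64-linux-gnu.so`).

```cmake
# Sketch: drive f2py from CMake via a custom command (names illustrative).
find_program(F2PY_EXECUTABLE NAMES f2py3 f2py)

set(MODULE_SO chgres_winds${CMAKE_SHARED_MODULE_SUFFIX})  # suffix is illustrative
add_custom_command(
  OUTPUT ${CMAKE_CURRENT_BINARY_DIR}/${MODULE_SO}
  COMMAND ${F2PY_EXECUTABLE} -c --f90flags=-fopenmp
          -m chgres_winds ${CMAKE_CURRENT_SOURCE_DIR}/chgres_winds.f90
  DEPENDS ${CMAKE_CURRENT_SOURCE_DIR}/chgres_winds.f90
  COMMENT "Building chgres_winds Python extension with f2py")

# An ALL custom target makes the module part of the default build.
add_custom_target(chgres_winds_module ALL
  DEPENDS ${CMAKE_CURRENT_BINARY_DIR}/${MODULE_SO})
```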

TESTS CONDUCTED:

There are fv3jedi tools (ColdStartWinds and VertRemap) which perform these exact functions. Instead of downloading and installing GDASApp in RRFS, we created the stand-alone programs noted above. The fv3jedi tools, however, were used to verify that the pre-blending tools were working as expected. The wind rotation in the stand-alone tool exactly matches (within rounding error) that of the fv3jedi tool ColdStartWinds. At the time of the creation of this PR, the stand-alone vertical remapping does not exactly reproduce the fv3jedi tools, but it seems close enough (I would like to revisit this in the future).

These tools have been tested only on WCOSS2. They have been tested in the context of being called within the regional_workflow. Full scientific evaluation of the blending has not been done.

DEPENDENCIES:

Adding large-scale blending step to regional_workflow:
NOAA-GSL/regional_workflow#540

DOCUMENTATION:

BlendingGlobal_to_Regional_Proposal ppt: https://docs.google.com/presentation/d/1AGXqGar1rNtKOXXhoTTD_tZ6nFg8LeZQ/edit#slide=id.p1

Large Scale Blending Tests ppt:
https://docs.google.com/presentation/d/1dVN3ZphUlVv8EIpf9Tev3ZciTMHi3oUHbNUNBJjayKE/edit?pli=1#slide=id.g21875f2a523_0_15

@guoqing-noaa (Collaborator) left a comment:

CMake is preferred over Makefiles.

@christopherwharrop-noaa (Collaborator) left a comment:

This cannot be accepted as is. It must be incorporated into the CMake build structure before this is approved.

@christopherwharrop-noaa added the "Do not merge" label (Can not be merged until this label is removed) on Jun 30, 2023
@hu5970 (Contributor) commented Jul 1, 2023

We will try to convert the makefile to CMake.

It is not difficult to compile those four subdirectories as regular Fortran libraries. I wrote CMake files to compile them as Fortran libraries on Hera:
/scratch1/BMC/wrfruc/mhu/rrfs/v0.5.3/ufs-srweather-app/src/rrfs_utl/blending

But the makefile also has a step to build Fortran extension modules for Python:

f2py3.8 -DF2PY_REPORT_ATEXIT --build-dir . -c --f90flags='-fopenmp' -L/pe/intel/compilers_and_libraries_2020.4.304/linux/compiler/lib/intel64_lin/libiomp5.so -liomp5 -m chgres_winds chgres_winds.f90

I cannot figure out how to get CMake to build Fortran extension modules for Python.

Any idea how to make it?

Thanks,
Ming

@guoqing-noaa (Collaborator) commented:

@hu5970 You may try the POST_BUILD custom command. A simple example:

add_custom_command(TARGET abc
        POST_BUILD
        COMMAND /bin/sh /path/to/my_script
        )

However, I would think wrapping all these blending codes into a Python package and publishing it on conda-forge or similar would be a better solution.

@christopherwharrop-noaa (Collaborator) commented:

I did some casual research on using CMake with f2py and there are some promising examples to follow. Once I have some time to look at them more carefully I can link them here. I can help with CMake issues.

However, I'd like to know more about how these tools will be used and why RRFS can't use the ones that already exist.

@christopherwharrop-noaa (Collaborator) commented:

There is the additional problem of testing. This repository is now an authoritative community SRW repository. This means that testing needs to occur within the context of the SRW develop branch, as that is the authoritative repository to which all RRFS functionality is being migrated as fast as possible. At a minimum, this PR needs to be tested with SRW develop to verify the SRW build will not break. If any changes on the SRW end are necessary, an issue should be created in SRW with an accompanying PR to ensure that the community SRW App will be compatible with this update.

@delippi (Contributor, Author) commented Jul 5, 2023

However, I'd like to know more about how these tools will be used and why RRFS can't use the ones that already exist.

@christopherwharrop-noaa, in order to use the tools that already exist, we would have to install the entire GDASApp as part of the RRFS installation to run these couple of steps. These steps are also done within the model, but we don't want to have to run the entire model to perform the simple task of rotating the winds and vertically remapping the cold start variables.

@christopherwharrop-noaa (Collaborator) commented:

@delippi - Ok. But how are you going to make sure all of these duplicate implementations of the same functionality stay in sync as they evolve? Your explanation makes sense in that you are avoiding installation of another large dependency, but it's also trading one problem for another. It's generally unwise to stash and use multiple copies of the same code in various places. Perhaps a better solution would be to peel off those parts of the GDAS in a way that makes those pieces reusable across multiple UFS Apps? One of the primary missions of the UFS is "unification" and reuse of common utilities across different modeling systems to minimize proliferation of duplicate implementations of the same functionality.

@delippi (Contributor, Author) commented Jul 5, 2023

@christopherwharrop-noaa, thanks for raising that concern, and I understand your point. I think the long-term goal is to eventually use the FV3jedi tools, but the pragmatic solution to have this functionality ready for RRFSv1 (in pre-jedi RRFS) is to build this stand-alone program. Once we move into the jedi era RRFS, we should be able to make use of these FV3jedi tools.

I also want to make clear that the only things being duplicated here are the wind rotation and vertical remapping codes (chgres_winds.f90, remap_dwinds.f90, and remap_scalar.f90), which are necessary to convert the cold start variables (after running chgres_cube on the host model) to match what is found in the RRFS restart files (warm starts). This way the blending code (raymond.f90), which is not duplicated functionality, is blending variables that match 1-to-1 between the host model (e.g., GDAS) and the RRFS.

@guoqing-noaa (Collaborator) commented:

@delippi I tried a compilation on Jet (Hera will be the same) and the f2py step failed:

error: ‘for’ loop initial declarations are only allowed in C99 mode
     for (int i = 0; i < rank; ++i) {
     ^
: note: use option -std=c99 or -std=gnu99 to compile your code

The reason is that the default gcc version is too low.

There are two solutions to this:

  1. Specify a higher gcc version explicitly.
  2. export CFLAGS=-std=c99 before running f2py.

I used method 2 and it succeeded.
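If the CFLAGS workaround is to be made permanent in a build system rather than exported by hand, one option (a sketch, not the PR's actual files) is to set the variable in the command's environment with `cmake -E env` when invoking f2py:

```cmake
# Sketch: run f2py with CFLAGS=-std=c99 in its environment so the
# generated C wrapper compiles under older gcc (names illustrative).
find_program(F2PY_EXECUTABLE NAMES f2py3 f2py)

add_custom_target(chgres_winds_module ALL
  COMMAND ${CMAKE_COMMAND} -E env CFLAGS=-std=c99
          ${F2PY_EXECUTABLE} -c -m chgres_winds
          ${CMAKE_CURRENT_SOURCE_DIR}/chgres_winds.f90)
```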

@guoqing-noaa (Collaborator) commented:

@delippi I think a major concern right now is that we need a CMake building system instead of a direct "make" system for your PR. Do you plan to make this change in the next 1~2 weeks?

@delippi (Contributor, Author) commented Oct 19, 2023

I just made the commit for the CMake build system. There are still a few lingering issues, such as "f2py3.8" and the path to libiomp5.so being hard-coded in the CMakeLists.txt files.

We may be able to use just "f2py" without a version number attached, but I've been using the following on each machine:

On WCOSS2:
f2py3.8
/pe/intel/compilers_and_libraries_2020.4.304/linux/compiler/lib/intel64_lin/libiomp5.so

On Orion:
f2py3.7
/apps/intel-2018/intel-2018.u4/clck/2018.3/provider/share/common/lib/intel64/libiomp5.so

On Hera:
f2py (and ends up building as 3.9)
/apps/intel/compilers_and_libraries_2020.2.254/linux/compiler/lib/intel64_lin/libiomp5.so
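One way to avoid hard-coding these per-machine names and paths (a sketch only, not what the PR's CMakeLists.txt files actually do) is to let CMake discover them; `find_program` tries the NAMES in order, so the machine-specific variants above are all covered:

```cmake
# Sketch: discover f2py and the Intel OpenMP runtime instead of
# hard-coding per-machine paths (variable names are illustrative).
find_program(F2PY_EXECUTABLE NAMES f2py3.8 f2py3.7 f2py3 f2py)
find_library(IOMP5_LIBRARY NAMES iomp5 HINTS ENV LD_LIBRARY_PATH)

if(NOT F2PY_EXECUTABLE)
  message(FATAL_ERROR "f2py not found")
endif()
message(STATUS "f2py: ${F2PY_EXECUTABLE}")
message(STATUS "libiomp5: ${IOMP5_LIBRARY}")
```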

@delippi (Contributor, Author) commented Oct 20, 2023

@guoqing-noaa I think I got it to build on WCOSS2, Hera, and Orion now with CMake, using a generic f2py and OpenMP library.

Make sure to use Python 3.x. I had the most issues on Hera. These were the modules I had loaded:

Currently Loaded Modules:
  1) cmake/3.26.4  2) intel/2022.1.2  3) intelpython/3.6.8

@guoqing-noaa (Collaborator) commented Oct 24, 2023

@delippi Thanks for making this PR CMake ready! I tested it and it worked! Great job!

Could you include "blending" in CMakeLists.txt under the top directory of "rrfs_utl"?

Also, we need to add the "install" part in the CMakeLists.txt under each of your 4 subdirectories (i.e., we need to copy those Python libraries into the centralized "install" location in ufs-srweather-app). We need to check out ufs-srweather-app (feature/RRFS_dev1 branch) and test whether we can build everything (including your Python libraries) by running the single command "devbuild.sh".

Thanks!
Guoqing
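The requested "install" step can be sketched as follows (an illustration only; `BLEND_MODULE_SO` is a hypothetical variable holding the platform-specific .so filename, and the destination is illustrative, not the app's actual install layout):

```cmake
# Sketch: copy the built f2py module into the app-wide install prefix.
# BLEND_MODULE_SO is hypothetical, e.g. the .so name produced by f2py.
install(PROGRAMS ${CMAKE_CURRENT_BINARY_DIR}/${BLEND_MODULE_SO}
        DESTINATION lib)
```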

@delippi (Contributor, Author) commented Oct 25, 2023

@guoqing-noaa, I added the "blending" entry to the CMakeLists.txt under the top directory of "rrfs_utl" and the "install" part to the CMakeLists.txt under each of the 4 subdirectories. I've also checked out ufs-srweather-app (feature/RRFS_dev1 branch) and tested whether we can build everything (including the Python libraries) by running the single command "devbuild.sh". I have tested on WCOSS2 so far. I had to add "module load python/3.8.6" to "env/build_wcoss2_intel.env" to get it to work. I will try testing on Hera later today as well.

@guoqing-noaa (Collaborator) commented:

@delippi I think I got a working CMake system for your blending code. I created a PR to your fork:

delippi#1

The build process works on Hera/Jet. Please test it on WCOSS2.
If it also works on WCOSS2, please merge it and the changes will be automatically reflected here.

Thanks,
Guoqing

@delippi (Contributor, Author) commented Oct 26, 2023

@guoqing-noaa, thanks for helping with the CMake. I just merged your changes after confirming that they worked on WCOSS2.

@guoqing-noaa (Collaborator) left a comment:

LGTM

@guoqing-noaa (Collaborator) commented:

@christopherwharrop-noaa Could you take a look at this PR at your earliest convenience? We would like this PR to be merged as fast as possible. The real-time and retro RRFS runs are all waiting for this PR at the moment. Thanks a lot!

@christopherwharrop-noaa (Collaborator) left a comment:

Thank you very much for all the CMake work. I know that was not easy. This is in pretty good shape now. My only complaint is that this still introduces some non-portable constructs via the hardcoding of Intel Fortran compiler settings. However, I don't see a good way around that right now due to the way f2py needs to run. And given that there are many other non-portable usages of compiler settings throughout other parts of recent rrfs_utl development, I don't think it is worth holding this up any longer.

project(chgres_winds)

enable_language(Fortran)
set(CMAKE_Fortran_COMPILER ifort)
Review comment (Collaborator):

You should never set the compiler explicitly. This should be discovered by Cmake when it looks for compilers in the current environment.
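The preferred pattern the reviewer is describing can be sketched as follows (illustrative, not the PR's actual file): declare the language and let CMake discover the compiler from the environment, selecting ifort by exporting `FC` before the first configure rather than setting `CMAKE_Fortran_COMPILER` in CMakeLists.txt.

```cmake
# Sketch: let CMake discover the Fortran compiler from the environment
# (e.g. "export FC=ifort" before the first cmake run) instead of
# hard-coding it in CMakeLists.txt.
cmake_minimum_required(VERSION 3.12)
project(chgres_winds LANGUAGES Fortran)
```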

Reply (Collaborator):

We removed lines 4 and 5 in the latest version. So that would not be an issue now. Thanks for the review and quick action!

@guoqing-noaa merged commit 77517c1 into NOAA-GSL:develop on Oct 26, 2023
1 check passed
@guoqing-noaa removed the "Do not merge" label on Oct 26, 2023