Training Resources
Videos
GitHub Tutorial: Part 1. Contributing to UFS/EPIC Repositories
This video requires some basic Linux command-line knowledge. It walks you through learning the basics of Git and GitHub integrations.
Description
This tutorial is the first of two tutorials that cover the basics of Git and GitHub, including common terms, concepts, and commands. It demonstrates how to:
- Create repositories, forks, and branches (local and remote)
- Use basic Git commands
- Commit (save) changes locally and push them to a remote repository
- Explore remote repositories
This Git/GitHub tutorial is intended to facilitate collaborative code development and to promote contributions from UFS community developers to the UFS and NOAA-EPIC GitHub repositories.
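For orientation, the sketch below mirrors the fork/branch/commit/push cycle demonstrated in the video; the repository URL, branch name, and file name are generic placeholders, not specific to the tutorial:
git clone https://github.com/<your-username>/<your-fork>.git   # clone your fork locally
cd <your-fork>
git checkout -b feature/my-change     # create and switch to a new branch
git status                            # review your edits
git add <changed-file>                # stage a change
git commit -m "Describe the change"   # commit (save) it locally
git push origin feature/my-change     # push the branch to your fork on GitHub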
GitHub Tutorial: Part 2. Contributing to UFS/EPIC Repositories
This video requires some basic Linux command-line knowledge. It walks you through learning the basics of Git and GitHub integrations.
Description
Users are highly encouraged to watch Part 1 of the tutorial before proceeding with this video.
This tutorial is Part 2 of the series on contributing to UFS/EPIC repositories. It presents instructions on navigating local and remote repositories using Git and GitHub, building on the material presented in the Part 1 tutorial, which covered basic Git and GitHub terms and operations. The following information is covered in Part 2 (a command sketch follows the list):
- UFS/EPIC public repositories
- Forks and clones of the repositories
- Branches and tags
- Steps to contribute to open-source repositories
- Three phases of local Git space
- Fetching and merging remote branches
- Resolving merge conflicts
- Making pull requests (PRs)
- Testing another developer’s PR and providing feedback
- GitHub Issues and Discussions pages for support
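As a rough sketch of the fetch-and-merge steps listed above (the repository, branch, and file names are generic placeholders):
git remote add upstream https://github.com/ufs-community/<repo>.git   # track the authoritative repository
git fetch upstream                    # download new upstream commits and branches
git checkout develop
git merge upstream/develop            # merge upstream changes into your local branch
# If Git reports conflicts, edit the conflicted files, then:
git add <resolved-file>
git commit                            # conclude the merge
git push origin develop               # update your fork before opening a PR on GitHub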
Instructions
The tutorial primarily follows presentation slides and includes several live demonstrations. A hard copy of the presentation includes some snapshots from the live demos. All materials and tutorial presentations (Part 1 and Part 2) can be found at https://github.com/NOAA-EPIC/training-github
The instructional materials and tutorial presentations (Part 1 and Part 2) can also be downloaded from the command-line as follows:
git clone https://github.com/NOAA-EPIC/training-github.git
cd ./training-github
Land DA Workflow Demo
Description
In the Land Data Assimilation (DA) System, the Noah-MP land surface model (LSM) from the UFS Weather Model (WM) and the Joint Effort for Data assimilation Integration (JEDI) system are used to assimilate snow depth data via the Local Ensemble Transform Kalman Filter-Optimal Interpolation (LETKF-OI) algorithm.
This video walks through how to run the Land DA System on the Mississippi State University (MSU) Orion supercomputer. The steps are repeatable on other supported Level 1 systems (i.e., MSU Hercules and NOAA's Hera machine). Users on non-Level 1 systems may still benefit from watching this video but should refer to the User's Guide section on running Land DA in a container. This video covers:
- Useful resources for Land DA
- Workflow steps/tasks
- Land DA directory structure
Users will also see many basic Linux and Git-related commands in use.
Instructions
- Users can view the slides for links to resources.
- This cheatsheet lists the commands used in the Land DA demo.
- The Land DA repository has evolved since the demo was recorded. This document lists updates and notes pertinent to the demo.
- Users are encouraged to view the Land DA User’s Guide and ask questions in our GitHub Q&A forum.
Creating a Base Image on Amazon Web Services (AWS)
This video requires that you have an AWS account and some basic knowledge of EC2 and S3. It walks through creating an AWS AMI that can then be used to generate an HPC resource.
Description
This instructional video will walk users through how to set up an Amazon Web Services (AWS) Amazon Machine Image (AMI) that will allow them to run the Short-Range Weather (SRW) Application. It installs and builds everything needed to run the SRW Application. Users can also leverage the multi-cloud Packer framework to build out other application images by running a handful of commands.
Instructions
View the AWS Packer SRW Commands (TXT) document that accompanies the “Creating a Base Image on Amazon Web Services (AWS)” tutorial.
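For reference, a typical Packer run looks like the sketch below; the template file name and variable are hypothetical, so follow the accompanying document for the exact commands:
packer validate srw-ami.pkr.hcl                        # check the template for errors
packer build -var 'region=us-east-1' srw-ami.pkr.hcl   # build the AMI in the target region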
Launching a PCluster Image to run SRW on Amazon Web Services (AWS)
This video requires that you have an AWS account and some basic knowledge of EC2 and S3. It creates the AWS HPC resources needed to build the SRW Application.
Description
This instructional video will walk users through launching an SRW-configured image from an Amazon Web Services (AWS) Amazon Machine Image (AMI), allowing them to run the Short-Range Weather (SRW) Application. It uses everything that was installed and built in the prior video. The last tutorial in this series will walk users through running the SRW Application end-to-end.
Instructions
- Create a micro EC2 instance with the Amazon Linux 2 OS, then install the prerequisites and clone the deployment scripts:
sudo yum install git
sudo yum install -y yum-utils
git clone https://github.com/NOAA-EPIC/packer-srwcluster.git
cd packer-srwcluster/scripts/deployment
- Download, unpack, and install the AWS ParallelCluster installer bundle:
wget https://us-east-1-aws-parallelcluster.s3.us-east-1.amazonaws.com/parallelcluster/3.0.2/installer/pcluster-installer-bundle-3.6.1.209-node-v16.19.0-Linux_x86_64-signed.zip
unzip pcluster-installer-bundle-3.6.1.209-node-v16.19.0-Linux_x86_64-signed.zip -d pcluster-installer-bundle
cd pcluster-installer-bundle
chmod +x install_pcluster.sh
bash install_pcluster.sh
- Reload your shell profile and verify the installation:
source /home/ec2-user/.bash_profile
pcluster version
- Walk through and modify the files in that directory, then generate the clusters:
sh generateClusters.sh
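For context, generateClusters.sh presumably wraps AWS ParallelCluster CLI calls along the lines of the sketch below; the cluster name, configuration file, and key file are hypothetical placeholders:
pcluster create-cluster --cluster-name srw-cluster --cluster-configuration cluster-config.yaml
pcluster list-clusters                                           # check creation status
pcluster ssh --cluster-name srw-cluster -i ~/.ssh/your-key.pem   # log in once the cluster is running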
Building and Running Containerized Land Data Assimilation (DA) Workflow on Amazon Web Services (AWS)
This instructional video will walk users through how to build and run the containerized Land DA System using the AWS HPC resources built in Tutorials 1 and 2.
Description
This is the third tutorial in the cloud series for running UFS applications on AWS. It requires that you have an AWS account and some basic knowledge of EC2 and S3. The video shows how to build and run the containerized Land DA workflow on the AWS HPC resources built in Tutorials 1 and 2.
Instructions
# Load the Intel compiler and MPI modules
module use /opt/spack-stack/spack/share/spack/modules/linux-*
module avail intel
module load intel-oneapi-compilers/2022.1.0
module load intel-oneapi-mpi/2021.6.0
# Stage a scratch directory
sudo mkdir /lustre
sudo chmod -R 777 /lustre
cd /lustre
mkdir LandDA
sudo chmod -R 777 /lustre/LandDA
cd LandDA
export LANDDAROOT=/lustre/LandDA
# Stage the input data
wget https://noaa-ufs-land-da-pds.s3.amazonaws.com/current_land_da_release_data/v1.2.0/Landdav1.2.0_input_data.tar.gz
tar xvfz Landdav1.2.0_input_data.tar.gz
export LANDDA_INPUTS=/lustre/LandDA/inputs
# Build the container
singularity build --force ubuntu20.04-intel-landda.img docker://noaaepic/ubuntu20.04-intel-landda:release-public-v1.2.0
export img=/lustre/LandDA/ubuntu20.04-intel-landda.img
# Copy the workflow out of the container
singularity exec -B /lustre:/lustre $img cp -r /opt/land-DA_workflow .
cd $LANDDAROOT/land-DA_workflow
export BASELINE=singularity.internal
./do_submit_cycle.sh settings_DA_cycle_era5
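Once the cycle script has been submitted, and assuming the cluster's scheduler is Slurm (the default for AWS ParallelCluster), job progress can be checked with standard Slurm commands:
squeue -u $USER      # list your pending and running jobs
sacct -X -u $USER    # summarize recently completed jobs and their exit states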
An Introduction to Unified Workflow Tools and the File Copy/Link Tool
The uwtools Python package contains generic tools and Unified Forecast System (UFS) software component drivers that facilitate configuring and running numerical weather prediction applications. This introduction covers the basics of installing uwtools and of configuring and running a Unified Workflow (UW) tool, both on the command line and via the API.
This video corresponds to uwtools v2.3.x.
Instructions
Visit the GitHub Repository: https://github.com/ufs-community/uwtools
Visit our Documentation: https://uwtools.readthedocs.io/en/main/
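As a quick start, the sketch below installs uwtools into a Python environment and explores the command-line interface. The package is also distributed on the ufs-community conda channel; subcommand names have shifted between releases, so rely on the help output of your installed version:
pip install uwtools     # install the package (a conda install is also supported)
uw --help               # list the available tools, including the file copy/link tool
uw config --help        # each tool provides its own detailed help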
HPC-Stack Setup on a Mac
This video shows how to build HPC-Stack on a MacBook Pro. HPC-Stack will soon be replaced by Spack-Stack, but for the time being it remains the required stack for running locally on a Mac.
Description
This video will guide users through the process of installing and building HPC-Stack on macOS systems with M1/arm64 or x86_64 architecture.
HPC-Stack provides a unified, shell script-based build system to build the software stack required for numerical weather prediction (NWP) tools such as the Unified Forecast System (UFS) and the Joint Effort for Data assimilation Integration (JEDI) framework.
This tutorial covers the following steps:
- Preparing your macOS system by installing prerequisites (e.g., Homebrew, Xcode Command Line Tools (CLT), compilers, CMake, Make, OpenSSL, Lmod, Wget, Python, and Git)
- Cloning the High-Performance Computing Stack (HPC-Stack) repository
- Configuring the system (e.g., specifying compilers, Python libraries, Message Passing Interface (MPI) libraries, and appropriate flags)
- Setting up the Lmod environment and customizing the HPC-Stack installation by specifying which MPI libraries to build
- Building the HPC-Stack, setting up modules and environment, and finally, compiling the software stack
Join us as we walk you through each step to successfully install and build HPC-Stack on your macOS system.
Instructions
The instructions for this tutorial are provided on our ReadTheDocs page for the HPC-Stack:
https://hpc-stack-epic.readthedocs.io/en/latest/mac-install.html
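In outline, the build follows steps like the sketch below. The Homebrew package names and script arguments are assumptions based on the step list above, so defer to the ReadTheDocs instructions for exact file names and flags:
xcode-select --install                                  # Xcode Command Line Tools
brew install cmake make openssl lmod wget python git    # prerequisites via Homebrew
git clone https://github.com/NOAA-EMC/hpc-stack.git
cd hpc-stack
# Edit the configuration for your compilers and MPI choice, then set up modules and build:
./setup_modules.sh -p $HOME/hpc-stack -c config/config_mac.sh
./build_stack.sh -p $HOME/hpc-stack -c config/config_mac.sh -y stack/stack_mac.yaml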