WRFv3.4 ARW PS:4.1.1.1.2.1.1 (with DA)

Codes Employed

The components of the end-to-end forecast system used for this test included:

    • WRF Preprocessing System (WPS) (v3.4)

    • WRF Data Assimilation (WRFDA) (v3.4)

    • WRF-ARW model (v3.4)

    • Unified Post Processor (UPP) (v1.1)

    • Model Evaluation Tools (MET) (v4.0), including the PB2NC bug fix

    • NCAR Command Language (NCL) for graphics generation

    • Statistical programming language R, used to compute confidence intervals

Domain Configuration

    • Contiguous U.S. (CONUS) domain with 15-km grid spacing

[Figure: map of the 15-km CONUS computational domain]

    • 403 x 302 gridpoints, for a total of 121,706 horizontal gridpoints

    • 56 vertical levels (57 sigma entries); model top at 10 hPa

    • Lambert-Conformal map projection

Sample Namelists

    • namelist.wps

    • namelist.input
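
The linked namelist files are not reproduced here. As a rough guide, the domain description above would correspond to WPS &geogrid entries along the following lines; this is a hedged sketch, and the projection center, standard parallels, and data path are placeholders rather than values taken from the actual configuration:

    &geogrid
     e_we           = 403,         ! west-east grid points
     e_sn           = 302,         ! south-north grid points
     dx             = 15000,       ! 15-km grid spacing (meters)
     dy             = 15000,
     map_proj       = 'lambert',   ! Lambert conformal projection
     ref_lat        = 40.0,        ! placeholder domain-center latitude
     ref_lon        = -97.0,       ! placeholder domain-center longitude
     truelat1       = 30.0,        ! placeholder standard parallels
     truelat2       = 60.0,
     stand_lon      = -97.0,
     geog_data_path = '/path/to/geog'   ! placeholder path to the USGS static data
    /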

AFWA Reference Configuration

Microphysics: WRF Single-Moment 5-class (WSM5) Scheme
Radiation (LW/SW): RRTM/Dudhia
Surface Layer: Monin-Obukhov Similarity Theory
Land Surface: Noah
PBL: Yonsei University Scheme
Convection: Kain-Fritsch Scheme

    • Sample namelist.input
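
For orientation, the scheme choices above correspond to the following namelist.input &physics option numbers under the standard WRFv3.4 numbering; this is a sketch for readability, not an excerpt from the actual AFWA namelist:

    &physics
     mp_physics         = 4,   ! WRF Single-Moment 5-class microphysics
     ra_lw_physics      = 1,   ! RRTM longwave radiation
     ra_sw_physics      = 1,   ! Dudhia shortwave radiation
     sf_sfclay_physics  = 1,   ! Monin-Obukhov (MM5) surface layer
     sf_surface_physics = 2,   ! Noah land surface model
     bl_pbl_physics     = 1,   ! Yonsei University PBL
     cu_physics         = 1,   ! Kain-Fritsch cumulus
    /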

Other run-time settings

    • Long time step = 90 s; acoustic steps per long time step = 4
    • Calls to the boundary layer and microphysics parameterizations were made every time step
    • Calls to the cumulus parameterization were made every 5 minutes
    • Calls to radiation were made every 30 minutes
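
Under the same assumptions, the run-time settings above, together with the vertical grid from the domain configuration, would translate into namelist.input entries along these lines (only the relevant variables are shown):

    &domains
     time_step       = 90,     ! long (advective) time step in seconds
     e_vert          = 57,     ! 57 sigma entries (56 vertical layers)
     p_top_requested = 1000,   ! model top at 10 hPa (1000 Pa)
    /

    &dynamics
     time_step_sound = 4,      ! acoustic steps per long time step
    /

    &physics
     radt = 30,                ! radiation called every 30 minutes
     cudt = 5,                 ! cumulus called every 5 minutes
     bldt = 0,                 ! PBL called every time step
    /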

Updates since WRFv3.3.1 for AFWA RC physics options

    • WSM5 Microphysics: fix for the nccn variable on the lateral boundary
    • YSU: improved Prandtl number computation and a change to the lower limits on diffusion
    • A full list of enhancements and bug fixes can be found at: http://www.mmm.ucar.edu/wrf/users/wrfv3.4/updates-3.4.html

Initial and Boundary Conditions

    • Initial Conditions (ICs) and Lateral Boundary Conditions (LBCs): 0.5° x 0.5° Global Forecast System (GFS) model

    • Lower Boundary Conditions (LoBCs): AFWA Land Information System (LIS) utilizing the Noah land surface model v2.7.1

    • SST Initialization: Fleet Numerical Meteorology and Oceanography Center (FNMOC) daily, real-time sea surface temperature (SST) product

Preprocessing Component

The time-invariant components of the lower boundary conditions (topography, soil and vegetation type, etc.) were generated with the geogrid program of WPS using United States Geological Survey (USGS) input data. The avg_tsfc program of WPS was also used to compute the mean surface air temperature, providing improved water-temperature initialization for lakes and smaller bodies of water in the domain that are far from an ocean.
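
For reference, the mean surface air temperature written by avg_tsfc (an intermediate file conventionally named TAVGSFC) is normally handed to the metgrid program through the constants_name entry of namelist.wps. A minimal sketch of that wiring, assuming default file names, is:

    &metgrid
     fg_name         = 'FILE',      ! ungribbed GFS fields
     constants_name  = 'TAVGSFC',   ! mean surface air temperature from avg_tsfc
     io_form_metgrid = 2,           ! netCDF output
    /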

Data Assimilation Component

A 6-hour "warm start" spin-up procedure preceded each forecast (see figure below). Data assimilation was conducted with WRFDA at both the beginning and the end of the 6-hour warm-start window, using observation data files provided by AFWA. At the beginning of the window, the GFS-derived initial conditions were used as the model background; at the end of the window, the 6-hour WRF forecast initialized from the WRFDA analysis was used. After each WRFDA run, the GFS-derived LBCs were updated and used in the subsequent forecast.
[Figure: schematic of the 6-hour warm-start data assimilation cycle]
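
The LBC update after each WRFDA analysis is typically performed with the WRFDA da_update_bc.exe utility, which is driven by a small namelist (parame.in). The sketch below illustrates that step under this assumption; the file paths are placeholders, not the paths used in the actual runs:

    &control_param
     da_file            = './wrfvar_output',   ! WRFDA analysis (placeholder path)
     wrf_bdy_file       = './wrfbdy_d01',      ! GFS-derived lateral boundary file to update
     domain_id          = 1,
     update_lateral_bdy = .true.,              ! refresh LBCs for consistency with the analysis
     update_low_bdy     = .false.,
     debug              = .true.,
    /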

Cases Run

Forecast Dates: 1 July 2011 - 29 June 2012

Initializations: Every 36 hours, including both 00 and 12 UTC

Forecast Length: 48 hours; output files generated every 3 hours

00 UTC Initializations
July 2011: 1, 4, 7, 10, 13, 16, 19, 22, 25, 28, 31
August 2011: 3, 6, 9, 12, 15, 18, 21, 24, 27, 30
September 2011: 2, 5, 8, 11, 14, 17, 20, 23, 26, 29
October 2011: 2, 5, 8, 11, 14, 17, 20, 23, 26, 29
November 2011: 1, 4, 7, 10, 13, 16, 19, 22, 25, 28
December 2011: 1, 4, 7, 10, 13, 16, 19, 22, 25, 28, 31
January 2012: 3, 6, 9, 12, 15, 18, 21, 24, 27, 30
February 2012: 2, 5, 8, 11, 14, 17, 20, 23, 26, 29
March 2012: 3, 6, 9, 12, 15, 18, 21, 24, 27, 30
April 2012: 2, 5, 8, 11, 14, 17, 20, 23, 26, 29
May 2012: 2, 5, 8, 11, 14, 17, 20, 23, 26, 29
June 2012: 1, 4, 7, 10, 13, 16, 19, 22, 25, 28

12 UTC Initializations
July 2011: 2, 5, 8, 11, 14, 17, 20, 23, 26, 29
August 2011: 1, 4, 7, 10, 13, 16, 19, 22, 25, 28, 31
September 2011: 3, 6, 9, 12, 15, 18, 21, 24, 27, 30
October 2011: 3, 6, 9, 12, 15, 18, 21, 24, 27, 30
November 2011: 2, 5, 8, 11, 14, 17, 20, 23, 26, 29
December 2011: 2, 5, 8, 11, 14, 17, 20, 23, 26, 29
January 2012: 1, 4, 7, 10, 13, 16, 19, 22, 25, 28, 31
February 2012: 3, 6, 9, 12, 15, 18, 21, 24, 27
March 2012: 1, 4, 7, 10, 13, 16, 19, 22, 25, 28, 31
April 2012: 3, 6, 9, 12, 15, 18, 21, 24, 27, 30
May 2012: 3, 6, 9, 12, 15, 18, 21, 24, 27, 30
June 2012: 2, 5, 8, 11, 14, 17, 20, 23, 26, 29

The tables below list the forecast initializations that failed to complete the end-to-end process, along with the reason for each failure. All incomplete forecasts were due to missing or bad input data sets, not model crashes.

Missing Forecasts:
Affected Case    Missing Data    Reason
2011080112       WRF Output      Missing SST input data
2011082400       WRF Output      Missing SST input data
2012050312       WRF Output      Missing SST input data
2012050612       WRF Output      Bad obs_gts input data
2012060400       WRF Output      Bad SST input data

Missing Verification:
Affected Case    Missing Data                                           Reason
2011072500       3-h QPF verification at the 18- and 21-h lead times;   Missing ST2 analysis
                 24-h QPF verification at the 36-h lead time

Verification

The Model Evaluation Tools (MET) package, which comprises:

    • grid-to-point comparisons - used for surface and upper-air model data
    • grid-to-grid comparisons - used for QPF

was used to generate objective verification statistics, including:

    • Bias-corrected Root Mean Square Error (BCRMSE), Root Mean Square Error (RMSE) and Mean Error (Bias) for:
        • Surface: Temperature (2 m), Dew Point Temperature (2 m) and Winds (10 m)
        • Upper Air: Temperature, Dew Point Temperature and Winds

    • Gilbert Skill Score (GSS) and Frequency Bias (FBias) for:
        • 3-hr and 24-hr Precipitation Accumulation intervals

Each verification metric is accompanied by confidence intervals (CIs) at the 99% level, computed using a parametric method for the surface and upper-air variables and a bootstrapping method for precipitation.
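
As a simple illustration of the parametric form (a sketch only; the exact estimators applied by MET are not reproduced here), a 99% CI on a mean statistic such as the bias of n matched pairs with sample standard deviation s can be written as

    Bias ± z(0.995) · s / sqrt(n),   with z(0.995) ≈ 2.576

Bootstrap CIs, by contrast, are generally obtained by resampling the matched pairs many times and taking percentiles of the recomputed statistic.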

Area-averaged verification results were computed for the full and CONUS domains, East and West domains, and 14 sub-domains.