Global climate models (GCMs) are an essential tool for simulating past and present climates and projecting future climate change. However, current GCMs, such as those submitted to the Coupled Model Inter-comparison Project Phases 3, 5, and 6 (CMIP3, CMIP5, and CMIP6, respectively), represent atmospheric and land processes at spatial resolutions of 70–500 km (Eyring et al., 2016; Flato et al., 2013; Randall et al., 2007). Their coarse resolution lacks the fine-scale detail required for local and regional impact assessments and adaptation planning, limiting their direct use at regional scales (Fowler et al., 2007; Hattermann et al., 2011; Maraun et al., 2010).
Dynamical downscaling of coarse-resolution outputs from GCMs or reanalyses via regional climate models (RCMs) has typically been used to bridge the resolution gap (Ekström et al., 2015) and reproduce regional climatology at high spatial resolution (Ekström et al., 2015; Giorgi, 2006; Laprise, 2008; Wang et al., 2004). A number of studies have reported that downscaling can not only improve grid resolution but also improve the accuracy of the GCM simulation, which is known as “added value” (Di Luca et al., 2012, 2013; Di Luca et al., 2016a; Di Virgilio et al., 2020; Dosio et al., 2019; Solman & Blázquez, 2019; Torma et al., 2015). A number of previous projects have produced regional climate projections using RCM ensembles, including the Regional Climate Model Inter-comparison Project for Asia (Fu et al., 2005), Prediction of Regional scenarios and Uncertainties for Defining European Climate change risks and Effects (Christensen et al., 2007), ENSEMBLES (van der Linden & Mitchell, 2009), the North American Regional Climate Change Assessment Program (Mearns et al., 2012), the Europe–South America network for climate change assessment and impact studies in La Plata Basin (Solman et al., 2013), and the global Coordinated Regional Climate Downscaling Experiment (CORDEX; Giorgi et al., 2009).
In 2014, the New South Wales (NSW) and Australian Capital Territory (ACT) regional climate modeling (NARCliM; Evans et al., 2014) project was developed to produce finer scale regional climate projections for Southeast Australia (Figure 1). The project was designed in conjunction with government, environmental agencies, and research organizations to provide regional climate information to directly support policy and decision-making at local scales.
Figure 1. Weather Research and Forecasting (WRF) model domains with grid spacing of approximately 50 km (outer Coordinated Regional Climate Downscaling Experiment [CORDEX] domain; yellow) and 10 km (inner NARCliM domain; red). The black dots denote the location of the capital cities.
Several studies have evaluated the N1.0 historical simulations (Evans et al., 2014, 2017; Fita et al., 2016; Ji et al., 2016; Olson et al., 2016) and have indicated that, overall, N1.0 accurately simulates several features of the observed climate, but is cold- and wet-biased. Other studies have also shown that N1.0 can provide added value to simulated surface variables such as precipitation and maximum and minimum temperatures when compared to the driving GCMs (Di Luca et al., 2013; Di Luca et al., 2016a; Olson et al., 2016). Since its delivery in 2014, N1.0 has been widely used in climate change impact studies led by the NSW government.
While the N1.0 datasets have been applied successfully in multiple contexts, an independent evaluation and multiple technical user workshops revealed some limitations to the use of N1.0 in impact assessments (Table S1). The major issues raised by end-users of the N1.0 simulations include: (a) the simulations use CMIP3 GCMs; (b) the simulations are non-continuous and are only available for three 20-year epochs; and (c) only one emission scenario is used (A2 from the Special Report on Emissions Scenarios, SRES). This user feedback led to the development of the second iteration of NARCliM, namely NARCliM1.5 (from here N1.5).
The N1.5 simulations do not replace the N1.0 simulations; rather, these updated and enhanced simulations complement N1.0 and create an expanded data set. For this reason, the N1.5 simulations are performed using two of the three RCMs of N1.0 and are run on the same domains as N1.0. By using the same RCMs and domain settings as in N1.0, users can conveniently use either or both the N1.0 and N1.5 datasets without needing to consider the impacts of changes in model domain boundary locations, spatial resolutions, or RCM physics settings. In N1.5, three GCMs have each been downscaled by two RCMs, providing six RCM ensemble members for use in impact assessments.
In this study we address four key questions regarding N1.5: (a) What are the methodological differences between the N1.0 and N1.5 simulations? (b) How well do N1.0 and N1.5 represent historical mean temperature and rainfall? (c) How do the two iterations of NARCliM compare in their future projections of mean temperature and precipitation? (d) What further insights does the ensemble mean (joint statistics) of N1.0 and N1.5 provide?
Data and Methods

NARCliM1.0 (N1.0)

N1.0 is a twelve-member ensemble of regional climate simulations, which were run using four GCMs (the Model for Interdisciplinary Research on Climate v3.2 (MIROC3.2; Watanabe et al., 2011), the fifth-generation atmospheric general circulation model developed at the Max Planck Institute for Meteorology (ECHAM5; Roeckner et al., 2003), the Canadian Coupled General Climate Model v3.1 (CGCM3.1; Yukimoto et al., 2012) and the Commonwealth Scientific and Industrial Research Organization's global climate model v3.0 (MK3.0; Gordon et al., 2002)) from the CMIP3 model suite. We note that “CGCM3.1” and “MK3.0” were previously referred to as “CCCMA3.1” and “CSIRO MK3.0,” respectively (e.g., Di Luca et al., 2018; Olson et al., 2016). The four GCMs were selected based on their performance over Australia, the independence of their errors, and their ability to span the full range of potential future climates as represented in a climate futures framework (Whetton et al., 2012) over south-eastern Australia (Figure 2). Full methodology details can be found in Evans et al. (2014). The four selected GCMs approximately span the entire range of projected temperature and precipitation change within the CMIP3 ensemble. Each of these four GCMs was dynamically downscaled using three selected RCMs (R1, R2, and R3). The three RCM configurations are distinct combinations of physics schemes (Table S2) of the Weather Research and Forecasting (WRF) v3.3 model (Skamarock & Klemp, 2008), which were selected from 36 physics scheme combinations based on model performance and the independence of their errors (Evans et al., 2012; Evans et al., 2013a; Evans et al., 2013b; Ji et al., 2014). All RCM simulations were performed at 10-km resolution over Southeast Australia, embedded within the 50-km resolution domain of the CORDEX Australasia region (Figure 1). Each simulation was run for three 20-year periods: the recent past (1990–2009), near future (2020–2039) and far future (2060–2079), under the SRES A2 scenario (Solomon et al., 2007). The three 20-year simulations were accompanied by a reanalysis-forced simulation with each of the three RCMs for 1950–2009 using the National Centers for Environmental Prediction reanalysis data set (Kalnay et al., 1996). The N1.0 GCMs and RCMs are summarized in Table 1, with further details provided in Tables S2 and S3.
Figure 2. Scatter plot of future change (differences between 2060–2079 and 1990–2009) in rainfall and temperature over the land part of the NARCliM domain (Figure 1) for 34 Coupled Model Inter-comparison Project Phase 5 (CMIP5) (blue) and 14 CMIP3 (red) Global climate models (GCMs) that passed the performance test. Larger dots represent the three GCMs selected for N1.5 (blue) and the four GCMs selected for N1.0 (red).
Table 1 NARCliM1.0 and NARCliM1.5 Simulations Used in This study
[IMAGE OMITTED. SEE PDF.]
Note. Here, for clarity, blue and orange colors differentiate between N1.0 and N1.5 simulations.
NARCliM1.5 (N1.5)

N1.5 is a six-member ensemble of regional climate simulations, which were driven by three GCMs (the Australian Community Climate and Earth System Simulator Coupled Model v1.3 and v1.0 (ACCESS1-3 and ACCESS1-0; Bi et al., 2013) and the second-generation Canadian Earth System Model (CanESM2; Chylek et al., 2011)) from the CMIP5 ensemble. The CMIP5 GCMs were selected using a performance evaluation similar to that applied to the CMIP3 GCMs in N1.0 (Figure 2): considerations include their performance in simulating temperature and precipitation, their representation of various climate processes and phenomena (e.g., east coast lows, ENSO patterns), and an independence ranking. We also compared the N1.0 CMIP3 GCMs with the CMIP5 GCM ensemble in a climate futures framework, highlighting that the CMIP5 ensemble includes a group of models with warmer and drier future changes not sampled by the existing N1.0 CMIP3 GCMs. For this analysis, we use the highest emission scenario available for each generation of GCMs (i.e., SRES A2 for CMIP3 and RCP8.5 for CMIP5). The two emission scenarios are not identical, but previous studies have shown that trends in key climate metrics (e.g., temperature increases) in global climate projections are similar across CMIP3 and CMIP5 (Flato et al., 2013; Knutti & Sedláček, 2013; Moise et al., 2015), supporting their combination into expanded ensembles. We therefore select CMIP5 GCMs from this unsampled space within NARCliM, all with similar temperature increases but spanning the range of precipitation changes from “no change” to “moderate decrease” to “large decrease” (Figure 2). While capturing the unsampled space, we ended up selecting two GCMs (ACCESS1-3 and ACCESS1-0) from the same institution. However, this selection does not bias the results, as these GCMs produce quite different simulations, likely due to their use of different land surface models. Thus, the GCMs in N1.5 complement those in N1.0 in terms of the range of projected climate change (Figure 2).
Each of these GCMs was then dynamically downscaled using two of the three RCMs used in N1.0 (R1 and R2; Table S2). RCM R3 was not used in N1.5 due to limitations in computational resources and its relatively poor performance over the southeast Australian region when driven by the ERA-Interim reanalysis (Di Virgilio et al., 2019b). Each N1.5 simulation was run continuously from 1951 to 2100 using WRF version 3.6. We use a more recent version of WRF in N1.5 than in N1.0 but retain the same RCM physics configurations; testing showed that this change in WRF version did not impact model simulation performance. The N1.5 data set provides future projections based on two emission scenarios, the CMIP5 Representative Concentration Pathways RCP4.5 and RCP8.5, which represent partial-mitigation and business-as-usual pathways, respectively. In terms of global warming, these two scenarios also straddle the SRES A2 pathway employed in N1.0 (Van Vuuren et al., 2011). The 150-year GCM-driven simulations are also accompanied by ERA-Interim reanalysis (Dee et al., 2011) forced simulations for 1979–2013. The N1.5 experimental setup is summarized in Table 1, with further information provided in Tables S2 and S3.
Both N1.0 and N1.5 include high-emission (business-as-usual) futures (SRES A2 and RCP8.5, respectively) and share common RCMs, and thus the two datasets are compared in this study. However, the N1.5 data set has several advantages over N1.0 when used as a standalone data set for hindcasts and projections. For example, it provides continuous 150-year (1951–2100) climate projections that can be more easily used in conjunction with other climate and non-climate datasets covering various temporal periods. The continuous period in N1.5 also enables statistically robust analyses of natural hazard impacts, and probability and threshold estimation, across the entire 21st century for multiple emission scenarios.
Observations

We use observational data from the Australian Gridded Climate Data set (AGCD; Jones et al., 2009) to compare the N1.0 and N1.5 simulated variables with observations for the historical period. This monthly mean gridded maximum temperature, minimum temperature and precipitation data set has a spatial resolution of 0.05° and is obtained from an interpolation of station observations across the Australian continent. The majority of these stations are located in the more heavily populated coastal areas with sparser representation inland, and there are more precipitation stations than temperature stations. To compare models of slightly different spatial resolutions with gridded observations of higher resolution, two approaches can be adopted. The first is to interpolate the model output to match the higher resolution of the gridded observations, leaving the latter unchanged (Vautard et al., 2013; Zollo et al., 2016). The second is to interpolate the observations to match the coarser resolution of the models. In our case, the resolution of the observations is approximately twice that of the models (5 km by 5 km compared to approximately 10 km by 10 km), so we would not expect large differences between the two approaches. Past evaluations of N1.0 have used the second approach and re-gridded the AGCD data to the N1.0 model grid (Di Luca et al., 2018; Di Virgilio et al., 2019b). We therefore adopt the same approach and re-grid the AGCD data to the NARCliM model grid using the conservative area-weighted re-gridding scheme from the Climate Data Operators (CDO; Schulzweida et al., 2006).
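As a minimal illustration of this pre-processing step, the sketch below applies CDO's first-order conservative remapping via the python-cdo bindings. The file and grid-description names are hypothetical, as the actual paths and grid files are not given in the text.

```python
from cdo import Cdo  # python-cdo bindings around the Climate Data Operators

cdo = Cdo()

# Conservative (area-weighted) remapping of the 0.05 degree AGCD fields onto the
# ~10 km NARCliM grid. "narclim_grid.txt" is a hypothetical CDO grid description
# of the target grid; input and output file names are likewise hypothetical.
cdo.remapcon("narclim_grid.txt",
             input="agcd_precip_monthly_0.05deg.nc",
             output="agcd_precip_on_narclim_grid.nc")
```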
We also use the fifth generation of the European Centre for Medium-Range Weather Forecasts (ECMWF) reanalysis (ERA5; Hersbach et al., 2020) to evaluate domain-mean statistics for mean precipitation and maximum and minimum temperature in the historical period. Since wind data at 850 hPa or similar levels are not available in N1.0, we use 10-m zonal and meridional wind data to evaluate the impact of winds on biases in precipitation. ERA5 uses the IFS Cycle 41r2 4D-Var data assimilation system and replaces the ERA-Interim reanalysis. The monthly mean gridded ERA5 data are available on a regular latitude-longitude grid at 0.25° resolution. We use the same approach as for AGCD, re-gridding the ERA5 data to the NARCliM model grid.
Method

In this study we evaluate the means of three atmospheric variables, precipitation, maximum temperature and minimum temperature, using monthly data. The analysis is performed for the four austral seasons: summer (December-January-February, DJF), autumn (March-April-May, MAM), winter (June-July-August, JJA) and spring (September-October-November, SON). We also analyze 10-m zonal and meridional winds to investigate the links between surface winds and precipitation.
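For concreteness, the following sketch shows how monthly fields can be aggregated into the four austral seasons with xarray; the file and variable names are hypothetical.

```python
import xarray as xr

# Hypothetical monthly-mean precipitation file on the NARCliM grid.
ds = xr.open_dataset("narclim_pr_monthly.nc")
pr = ds["pr"].sel(time=slice("1990-01", "2009-12"))  # 1990-2009 evaluation window

# "time.season" groups months into the DJF, MAM, JJA and SON climatologies.
seasonal_clim = pr.groupby("time.season").mean("time")
print(seasonal_clim.sel(season="DJF"))
```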
The evaluation is performed broadly in two parts. First, we evaluate the historical climate simulated by N1.5 from 1990 to 2009 in comparison with the N1.0 simulations. The N1.5 historical simulations were only available until 2005, so the remaining years are taken from the RCP8.5 projection (2006–2010), noting that there are minimal differences between the RCP scenarios over this period. We then analyze various metrics (spatial patterns, bias, standard deviations, correlations and statistical distributions) to evaluate the performance of both the individual and ensemble-mean N1.0 and N1.5 simulations in representing the three atmospheric variables.
We then compare the future projections of the N1.0 and N1.5 simulations. We again choose an overlapping time period (2060–2079, i.e., the far future period in N1.0) in the two NARCliM iterations. To take advantage of the longer timeseries of the N1.5 simulations, we also evaluate the temperature and precipitation projections for the five Australian state capital cities within the NARCliM domain.
When assessing future changes (relative to the historical period) and biases (relative to AGCD observations) in mean maximum and minimum temperature, we calculate statistical significance for each grid cell using t-tests (α = 0.05) assuming equal variance. The Mann–Whitney U test (α = 0.05) is used for future changes and biases in precipitation given its non-normality. Results on ensemble-mean statistical significance are then separated into three classes following Tebaldi et al. (2011) to identify regions of statistically significant change with model agreement. This method accounts for internal climate variability and assesses the degree of consensus between models on the significance of a change. For each grid cell, when 50% or more of the ensemble members (12 in N1.0, 6 in N1.5 and 18 in the combined ensemble N1.0 + N1.5, hereafter N1.X) show a significant change and at least 80% of those members agree on the direction of change, the difference in that grid cell is considered significant. If at least 50% of the members show a significant change but fewer than 80% of them agree on the direction of change, the multi-model mean is not shown in the subsequent figures; instead the grid cell is shown in white, indicating significant model disagreement on the projected change. Finally, if fewer than 50% of the members show a significant change, we show the multi-model mean without any indication of significance.
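The per-cell testing and the three-class agreement rule can be sketched as follows. This is an illustrative implementation only (not the authors' code), assuming hypothetical ensemble arrays of shape (member, time, y, x), a recent SciPy for the vectorized tests, and thresholds taken directly from the rules quoted above.

```python
import numpy as np
from scipy import stats

def classify_change(hist, fut, use_mannwhitney=False, alpha=0.05):
    """Classify each grid cell using the 50%/80% rules described in the text.
    `hist` and `fut` have shape (member, time, y, x). Returns an integer map:
    2 = significant change with model agreement (stippled),
    1 = significant change but model disagreement (masked white),
    0 = no significant change (mean shown without stippling)."""
    n_mem = hist.shape[0]
    sig = np.zeros((n_mem,) + hist.shape[2:], dtype=bool)
    sign = np.zeros_like(sig, dtype=int)
    for m in range(n_mem):
        if use_mannwhitney:   # precipitation: non-normal, Mann-Whitney U test
            _, p = stats.mannwhitneyu(fut[m], hist[m], axis=0)
        else:                 # temperature: t-test assuming equal variance
            _, p = stats.ttest_ind(fut[m], hist[m], axis=0, equal_var=True)
        sig[m] = p < alpha
        sign[m] = np.sign(fut[m].mean(axis=0) - hist[m].mean(axis=0))

    frac_sig = sig.mean(axis=0)
    # Fraction of the significant members that agree with the dominant sign.
    dom_sign = np.sign(np.where(sig, sign, 0).sum(axis=0))
    agree = (np.where(sig, sign == dom_sign, False).sum(axis=0)
             / np.maximum(sig.sum(axis=0), 1))

    out = np.zeros(frac_sig.shape, dtype=int)
    out[(frac_sig >= 0.5) & (agree >= 0.8)] = 2
    out[(frac_sig >= 0.5) & (agree < 0.8)] = 1
    return out
```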
Results

N1.0 and N1.5 Simulations for the Historical Period (1990–2009)

The main analysis in the following section describes the ensemble-mean bias of N1.0, N1.5 and the combined ensemble N1.X with respect to the three focus variables: precipitation, maximum temperature and minimum temperature. Maps of the absolute values of these variables in the N1.0 and N1.5 ensembles are available in the supplementary material.
Mean Biases in the N1.0 and N1.5 Simulations

We first investigated how the N1.0 and N1.5 ensembles perform in simulating seasonal precipitation patterns. Figure 3 shows the observed precipitation and the biases in simulated precipitation for the ensemble means of N1.0, N1.5 and N1.X for the four seasons DJF, MAM, JJA and SON. The N1.0 and N1.5 simulations capture the spatial pattern of precipitation well for all seasons (Figure S1); however, both sets of simulations are generally wet-biased. With the exception of the CGCM3.1-R1, CGCM3.1-R2 and CGCM3.1-R3 simulations in N1.0 in DJF and SON, all the other simulations in N1.0 and N1.5 show an overall wet bias (Figures S2–S5).
Figure 3. Spatial pattern of precipitation (mm/mon) from Australian Gridded Climate Data set (AGCD) data (first row) and spatial pattern of biases (simulations minus observations) from ensemble mean NARCliM1.0 (second row), NARCliM1.5 (third row) and NARCliM1.X (fourth row) simulations. Here the columns (a,e,i,m) (b,f,j,n), (c,g,k,o) and (d,h,l,p) represent DJF, MAM, JJA and SON seasons, respectively. Stippling indicates statistically significant bias (Section 2.4 for method).
For the wettest season (DJF), both the N1.0 and N1.5 ensemble-mean simulations show the strongest bias (Figures 3e and 3i). However, overall, the N1.5 simulations show smaller biases than N1.0. The northern and eastern parts of the domain show large improvements in N1.5; on average, the DJF biases in N1.5 (Figure 3i) are reduced by 90% over most of the domain. For the other three seasons, the N1.5 simulations also show improvements in capturing the observed precipitation patterns, and the biases are generally reduced, with most improvement along the eastern coastline. The exception is SON, when the N1.5 ensemble (Figure 3l) shows large biases along the eastern coastline.
We also investigated whether the surface winds have an impact on the precipitation biases. For this analysis we used 10-m zonal and meridional wind data from ERA5 and compared them with the NARCliM ensemble, as wind data are not available in AGCD (Figures S6–S7). Winds at 10 m were selected because wind data at 850 hPa are not available for the N1.0 simulations. For DJF and MAM, both N1.0 and N1.5 show strong winds moving inland from the surrounding oceanic regions in the north-eastern part of the domain (Figure S7). These winds are much stronger than those in ERA5. Past studies have shown that moisture advection by winds from oceanic regions contributes toward precipitation (Taschetto & England, 2009). These stronger winds in the RCMs can therefore bring abundant moisture, leading to more precipitation and contributing to the wet bias (Figure S6). In contrast, in JJA and SON, both N1.0 and N1.5 show strong winds blowing from inland toward oceanic regions in the southern and eastern parts of the domain (Figure S7). These winds are again stronger than those in ERA5 and may carry moisture away from the eastern coastline, contributing to the dry biases there (Figure S6).
Similar to precipitation, the N1.0 and N1.5 simulations also capture the spatial patterns of maximum temperature well for all seasons (Figure S8). However, both sets of simulations show cold biases over most of the domain, except for the eastern coastline where a slight warm bias is identified (Figure 4). At seasonal timescales, the cold bias tends to be lower in intensity during summer (DJF) than during winter (JJA). Looking at the individual simulations spatially reveals that the N1.0 simulations show both warm biases (CGCM3.1-R1, CGCM3.1-R2, CGCM3.1-R3, ECHAM5-R1, ECHAM5-R2 and ECHAM5-R3) and cold biases (MK3.0-R1, MK3.0-R2, MK3.0-R3, MIROC-R1, MIROC-R2 and MIROC-R3), whereas the N1.5 simulations on average always show a cold bias (Figures S9–S12). The spread in the sign of the biases in N1.0 produces a comparatively smaller overall ensemble bias in N1.0 than in N1.5. Therefore, in contrast to the precipitation analysis, the simulated maximum temperatures in N1.5 show larger biases for all four seasons over most regions. However, biases along the eastern coastline are significantly reduced (by approximately 0.5°C–1.2°C) in N1.5 relative to N1.0. For winter (JJA), the N1.5 simulations show biases that are approximately double those of N1.0. For the other seasons, the biases in the N1.5 and N1.0 ensembles are similar and within approximately 20%–35% of each other.
Figure 4. Spatial pattern of maximum temperature (°C) from Australian Gridded Climate Data set (AGCD) data (first row) and spatial pattern of biases (simulations minus observations) from ensemble mean NARCliM1.0 (second row), NARCliM1.5 (third row) and NARCliM1.X (fourth row) simulations. Here the columns (a,e,i,m), (b,f,j,n), (c,g,k,o) and (d,h,l,p) represent DJF, MAM, JJA and SON seasons, respectively. Stippling indicates statistically significant bias (Section 2.4 for method).
The N1.0 and N1.5 simulations also capture the spatial patterns of minimum temperature well for the four seasons (Figure S13). However, the spatial variability of the positive and negative biases is large for both the N1.0 and N1.5 ensembles (Figure 5) and the individual simulations (Figures S14–S17), and the magnitudes of the biases in the two ensembles are generally similar. The biases in minimum temperature are also smaller than those for maximum temperature. For summer and spring, the N1.5 simulations show larger biases than the N1.0 simulations. For these two seasons N1.5 shows an overall cold bias, albeit with a warm bias along the eastern coastline and in parts of Victoria. Such warm biases also exist in N1.0; however, the earlier generation also shows a warm bias in the northern parts of the domain. For MAM and JJA, the biases in both sets of simulations show very similar patterns, and in contrast to DJF and SON, N1.5 improves over N1.0 in these two seasons.
Figure 5. Spatial pattern of minimum temperature (°C) from Australian Gridded Climate Data set (AGCD) data (first row) and spatial pattern of biases (simulations minus observations) from ensemble mean NARCliM1.0 (second row), NARCliM1.5 (third row) and NARCliM1.X (fourth row) simulations. Here the columns (a,e,i,m), (b,f,j,n), (c,g,k,o) and (d,h,l,p) represent DJF, MAM, JJA, and SON seasons, respectively. Stippling indicates statistically significant bias (Section 2.4 for method).
The spatial maps of precipitation, maximum temperature and minimum temperature also contain a combined statistic, N1.X (last row in Figures 3–5). Although the two ensembles were driven by two generations of climate change scenarios, the RCM configuration is exactly the same, and thus the combined 18-member ensemble provides a useful expansion of the number of members individually available (12 in N1.0 and 6 in N1.5). This combination is especially relevant to users who want larger ensembles for robustness while continuing to use the N1.0 simulations. The combined N1.X ensemble members can also, in some cases, cancel biases of opposite sign and thus yield smaller biases in the ensemble mean. For instance, simulated precipitation in the N1.5 ensemble during DJF shows smaller biases than N1.0, whereas for SON the N1.0 ensemble shows smaller biases than N1.5 (Figure 3). The combined ensemble mean from N1.X for precipitation can thus help retain the added value from each of N1.0 and N1.5 while increasing the sample size. However, in terms of overall bias, the N1.5 ensemble represents precipitation better than N1.X.
The N1.X ensemble mean for maximum and minimum temperature possesses similar characteristics in representing the historical climate compared to the smaller ensembles. Here again, the N1.0 and N1.5 simulations complement each other through their respective skill in simulating temperature in different seasons and at different spatial locations. The minimum temperature biases vary substantially in space in both N1.0 and N1.5, and this spatial variability is reflected in N1.X. In general, to obtain more robust statistics of temperature changes in a region, using N1.X can be a beneficial strategy, as it retains the added value from each of N1.0 and N1.5 and can also help cancel biases of opposite sign.
Domain Mean Evaluation

To assess the domain-wide results for N1.0 and N1.5, we examine Taylor diagrams. Taylor diagrams (Taylor, 2001) graphically summarize how closely a pattern (or a set of patterns) matches observations. The similarity between the patterns in two data sets is quantified in terms of their correlation, their centered root-mean-square difference, and the amplitude of their variations (represented by their standard deviations). These diagrams are especially useful for evaluating multiple aspects of complex models or gauging the relative skill of many different models. The Taylor diagrams for precipitation (Figures 6a–6d) show that the N1.5 simulations are consistently closer to observations than N1.0 for all seasons. The ECHAM5- and MIROC-driven N1.0 simulations show a large spread and the largest standard deviations among all the N1.0 and N1.5 simulations. The Taylor diagrams for maximum and minimum temperature (Figures 6e–6l) show that both NARCliM iterations represent the historical climate well.
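The quantities plotted in a Taylor diagram can be computed directly from the seasonal fields. The sketch below uses the standard definitions; area weighting is omitted for brevity and the field names are hypothetical.

```python
import numpy as np

def taylor_stats(model, obs):
    """Pattern statistics underlying a Taylor diagram (Taylor, 2001): spatial
    correlation, standard deviations, and centered RMS difference. `model` and
    `obs` are 2-D fields on a common grid; NaN cells are ignored."""
    m, o = np.ravel(model), np.ravel(obs)
    keep = ~np.isnan(m) & ~np.isnan(o)
    m, o = m[keep], o[keep]
    corr = np.corrcoef(m, o)[0, 1]
    std_m, std_o = m.std(), o.std()
    # Centered RMS difference removes the mean bias before comparing patterns.
    crmsd = np.sqrt(np.mean(((m - m.mean()) - (o - o.mean())) ** 2))
    return corr, std_m, std_o, crmsd
```

These statistics satisfy crmsd² = std_m² + std_o² − 2·std_m·std_o·corr, the law-of-cosines relation that allows them to be drawn on a single polar diagram.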
Figure 6. Taylor diagrams showing the seasonal model correlation, bias and standard deviation of errors for both NARCliM1.0 (N1.0) and NARCliM1.5 (N1.5) simulations for precipitation (a,b,c,d), maximum temperature (e,f,g,h) and minimum temperature (i,j,k,l) for the four seasons DJF (a,e,i), MAM (b,f,j), JJA (c,g,k) and SON (d,h,l). Here, red and blue denote N1.0 and N1.5 simulations, respectively, and the black star denotes the observations (AGCD).
Next, we investigated the domain-averaged biases (i.e., differences between the models and the AGCD observations) for the three variables in the individual N1.0 and N1.5 simulations. For precipitation (Figures 7a–7d), the N1.0 simulations show a very large spread in their biases in all seasons. The N1.0 simulations show large positive biases, except for the CGCM3.1-driven RCMs, which show a combination of positive and negative biases (also evident in Figures S2–S5). On average, all N1.5 ensemble members perform better than those of N1.0 during DJF and MAM; however, CGCM3.1-R1 from N1.0 performs better than the individual N1.5 simulations for both DJF and MAM (Figures S2 and S3). During JJA and SON a similarly large spread is observed in the N1.5 simulations, and their ensemble mean is about the same as for N1.0. Given that the biases in the N1.0 and N1.5 simulations are mostly of similar sign, N1.X does not give a quantitative improvement. To check the robustness of the results, we also compared the domain-mean biases and root-mean-square errors (RMSE) of the N1.0 and N1.5 simulations against AGCD and ERA5 data, respectively (Table S4). As with AGCD, the biases and RMSEs against ERA5 show that the N1.5 ensemble performs better than N1.0 for precipitation in all seasons, with the largest improvement in DJF and MAM.
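A minimal sketch of the domain-averaged bias and RMSE calculation is given below, assuming seasonal-mean fields on a regular latitude-longitude grid and using cosine-latitude area weights; whether the original analysis applied such weights is not stated in the text.

```python
import numpy as np

def domain_stats(model, obs, lat):
    """Area-weighted domain-mean bias and RMSE. `model` and `obs` are 2-D
    (lat, lon) seasonal means on the same grid; `lat` is the 1-D latitude
    vector. Cosine-latitude weights approximate grid-cell area on a regular
    latitude-longitude grid; NaN cells (e.g., ocean) are excluded."""
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(model)
    valid = ~np.isnan(model) & ~np.isnan(obs)
    w = np.where(valid, w, 0.0)
    diff = np.where(valid, model - obs, 0.0)
    bias = (w * diff).sum() / w.sum()
    rmse = np.sqrt((w * diff ** 2).sum() / w.sum())
    return bias, rmse
```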
Figure 7. Scatterplots of the area- and time-averaged bias (difference between the simulations and the observations (AGCD)) for individual NARCliM1.0 (N1.0) and NARCliM1.5 (N1.5) simulations for precipitation (a,b,c,d), maximum temperature (e,f,g,h) and minimum temperature (i,j,k,l) for the four seasons DJF (a,e,i), MAM (b,f,j), JJA (c,g,k) and SON (d,h,l). Here, red and blue denote individual N1.0 and N1.5 simulations, respectively, whereas large red, blue and green circles denote the ensemble means of N1.0, N1.5 and N1.X (N1.0 + N1.5), respectively.
For maximum temperature (Figures 7e–7h), the N1.0 simulations show a large spread and their biases span both signs, whereas all the N1.5 simulations show a negative bias (also evident in Figures S9–S12). For these reasons, the N1.0 ensemble mean shows a smaller overall bias than N1.5. Scatterplots of minimum temperature (Figures 7i–7l) again show a larger spread for the N1.0 simulations than for N1.5, but here the biases are both positive and negative. For example, in JJA, the spread in the sign of the biases in the large ensemble produces a very small overall bias in N1.X (Figure 7k). The N1.5 ensemble also shows larger biases than the N1.0 ensemble for maximum and minimum temperature when evaluated against ERA5 data (Tables S5–S6), replicating the findings obtained against AGCD. The consistent comparison of the N1.0 and N1.5 simulations against two independent data sets strengthens the robustness of our results.
We also compared how the individual N1.0 and N1.5 simulations capture the observed spread in precipitation, maximum temperature and minimum temperature using box-and-whisker plots. For the three variables, most of the NARCliM models show a larger spread than the observations (Figure 8). For precipitation, the spread in the individual N1.5 simulations is closer to that of the observations than in the N1.0 simulations; however, the distributions of the CGCM3.1-R1, CGCM3.1-R2 and CGCM3.1-R3 simulations are closest to observations (consistent with the results shown in Figures 7a–7d). For maximum and minimum temperature, both the N1.0 and N1.5 simulations are approximately within the spread of the observations for MAM and SON. For the other two seasons, the N1.0 simulations show a larger spread than the N1.5 simulations, whose distributions sit toward the lower end of the corresponding observed distributions.
Figure 8. Box-and-whisker plots for the observations (AGCD) and individual NARCliM1.0 (N1.0) and NARCliM1.5 (N1.5) simulations for precipitation (a,b,c,d), maximum temperature (e,f,g,h) and minimum temperature (i,j,k,l) for the four seasons DJF (a,e,i), MAM (b,f,j), JJA (c,g,k) and SON (d,h,l). The lower and upper box boundaries denote the 25th and 75th percentiles, respectively, the horizontal line inside the box denotes the median, and the lower and upper whiskers denote the 10th and 90th percentiles, respectively. Here, green, red and blue denote AGCD and individual N1.0 and N1.5 simulations, respectively.
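The box-and-whisker statistics in Figure 8 reduce to a handful of percentiles; a small helper such as the hypothetical one below reproduces them from a sample of monthly, domain-averaged values.

```python
import numpy as np

def box_stats(sample):
    """Percentiles shown in the Figure 8 box-and-whisker plots: 10th and 90th
    (whiskers), 25th and 75th (box edges) and the 50th (median). `sample` is a
    1-D array of monthly, domain-averaged values; NaNs are dropped."""
    x = np.asarray(sample, dtype=float)
    return np.percentile(x[~np.isnan(x)], [10, 25, 50, 75, 90])
```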
We also investigated how the individual models and the ensemble means of N1.0 and N1.5 capture the monthly variability of precipitation, maximum temperature and minimum temperature with respect to observations (Figure S18). For precipitation, between March and August (the months that generally receive the least precipitation), both the N1.0 and N1.5 simulations capture the variability well and are close to observations. In contrast, for the other (wetter) months, there is a large spread among the individual N1.0 and N1.5 simulations: on average the N1.0 simulations suffer from a large wet bias, as discussed above, while the N1.5 simulations are closer to observations. For maximum and minimum temperature, both the N1.0 and N1.5 simulations capture the seasonal variability well and the spread between the individual simulations is small.
In summary, the analyses of the spatiotemporal patterns of biases, spatial statistics, distributions and interannual variability consistently show that, for precipitation, the N1.5 simulations provide a clear and substantial improvement over N1.0. Conversely, a mixed result is found for temperature. Compared to N1.0, the N1.5 simulations show no overall improvement for maximum temperature. For minimum temperature, some seasons (MAM and JJA) are improved in N1.5, whereas the other seasons show larger biases than in N1.0.
Climate Projections for N1.0 and N1.5

Having assessed the performance of the N1.5 simulations in comparison to the N1.0 ensemble and observations for the historical period (1990–2009), we now compare the N1.0 and N1.5 projections for the far future period (2060–2079).
The N1.0 and N1.5 precipitation projections differ significantly for all seasons, with the most distinct and strongest differences in DJF and SON (Figure 9). For DJF and MAM, the N1.0 ensemble projects large increases in precipitation for most of the domain except the southern-most parts of Victoria (Figures 9a and 9b). The individual N1.0 simulations (Figures S19–S22) reveal that the strong increases in the N1.0 ensemble are mostly driven by the strong increases in the ECHAM5- and MIROC-driven simulations. The N1.5 simulations, on the other hand, project future decreases in precipitation in many parts of the domain, with increases only in the eastern parts of the domain for DJF and MAM (Figures 9e and 9f), a feature found in all the individual N1.5 simulations (Figures S19 and S20).
Figure 9. Future changes (difference between the far future (2060–2079) and the historical time period (1990–2009)) in precipitation (mm/mon) for ensemble mean NARCliM1.0 (N1.0) (a,b,c,d), NARCliM1.5 (N1.5) (e,f,g,h) and N1.X (N1.0 + N1.5) (i,j,k,l) simulations for RCP8.5 and for the four seasons DJF (a,e,i), MAM (b,f,j), JJA (c,g,k) and SON (d,h,l). Stippling indicates statistically significant change (Section 2.4 for method).
For JJA, the N1.5 simulations project future decreases in precipitation over most of the domain (except Victoria), whereas the N1.0 simulations project uniform but negligible changes (Figure S21). Similarly, for SON, the N1.5 simulations project very strong decreases in precipitation. The N1.0 simulations for SON project very strong decreases only in the southern parts of Victoria and only very small changes elsewhere, owing to differences in the sign of the projections among the individual N1.0 simulations (Figure S22). The differences between the N1.0 and N1.5 projections can be attributed to the different emission scenarios and/or the different driving GCMs used in the two iterations of NARCliM. The latter is the more apparent cause, as the N1.0 and N1.5 simulations capture dry and wet patterns similar to those of their driving GCMs (Figures S23 and S24).
N1.5 projects larger changes in maximum and minimum temperature than N1.0 (Figures 10 and 11, respectively), which is also evident in the individual N1.0 and N1.5 simulations (Figures S25–S32). On average, the future maximum temperature projections of N1.5 for DJF, MAM, JJA and SON are approximately 2.2°C, 1.5°C, 1.9°C, and 3.1°C higher than N1.0, respectively, while the projected minimum temperature changes in N1.5 for the four seasons are approximately 1.6°C, 0.8°C, 0.4°C, and 1.4°C higher than N1.0. The stronger temperature changes in the N1.5 simulations are expected, as the driving GCMs were chosen to represent futures with greater warming than those sampled by the N1.0 ensemble. Some of the additional warming can also be explained by the higher emission scenario (RCP8.5) used in N1.5, particularly in the second half of the 21st century.
Figure 10. Future changes (difference between the far future (2060–2079) and the historical time period (1990–2009)) in maximum temperature (°C) for ensemble mean NARCliM1.0 (N1.0) (a,b,c,d), NARCliM1.5 (N1.5) (e,f,g,h) and N1.X (N1.0 + N1.5) (i,j,k,l) simulations for RCP8.5 and for the four seasons DJF (a,e,i), MAM (b,f,j), JJA (c,g,k) and SON (d,h,l). Stippling indicates statistically significant change (Section 2.4 for method).
Figure 11. Future changes (difference between the far future (2060–2079) and the historical time period (1990–2009)) in minimum temperature (°C) for ensemble mean NARCliM1.0 (N1.0) (a,b,c,d), NARCliM1.5 (N1.5) (e,f,g,h) and N1.X (N1.0 + N1.5) (i,j,k,l) simulations for RCP8.5 and for the four seasons DJF (a,e,i), MAM (b,f,j), JJA (c,g,k) and SON (d,h,l). Stippling indicates statistically significant change (Section 2.4 for method).
As can be seen in Figure 2, the combination of driving GCMs in N1.0 and N1.5 samples the future change space of the entire CMIP5 ensemble (RCP8.5) quite well. Examining the combined N1.X ensemble thus provides an improved sampling of this future change uncertainty. We acknowledge, however, that much of the N1.X ensemble comes from CMIP3 GCMs, which differ from CMIP5 GCMs in terms of both modeling and emission scenario. While these differences must be considered, past studies have shown that, for key climate variables, both CMIP3 and CMIP5 GCMs project similar future changes under the higher emission scenario (Flato et al., 2013; Knutti & Sedláček, 2013; Moise et al., 2015). As shown in Figure 2, the combined N1.X ensemble provides good coverage of the combined CMIP3 and CMIP5 future change space. Future projections of precipitation from the N1.X ensemble (Figures 9i–9l) reveal wetter conditions in DJF and MAM and drier conditions in JJA and SON. Future projections of maximum (Figures 10i–10l) and minimum temperature (Figures 11i–11l) show that DJF is warming faster than JJA.
As discussed in the methodology, the N1.0 simulations consist of only three 20-year time periods, whereas the N1.5 simulations were run continuously from 1950 to 2100. To take advantage of the continuous N1.5 runs and to compare them with the N1.0 epochs and observationally based estimates, we also examine the 150-year timeseries of anomalies (reference period: 1990–2009) of maximum temperature (Figure 12), minimum temperature (Figure 13) and mean precipitation (Figure 14) for all the capital cities in the 10-km domain, that is, Sydney, Melbourne, Canberra, Brisbane and Adelaide (for locations see Figure 1). The aim is to examine the range of projected wet/dry and warm/cold periods over the historical period and under future global warming for these highly populated capital cities.
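The city anomaly timeseries follow from subtracting the 1990–2009 climatology at the nearest grid cell. The sketch below illustrates this for Sydney maximum temperature, with hypothetical file and variable names and approximate coordinates, and assumes 1-D latitude/longitude coordinates (on the native curvilinear NARCliM grid a nearest-neighbour lookup on the 2-D coordinates would be needed instead).

```python
import xarray as xr

# Hypothetical monthly maximum-temperature file from an N1.5 member.
ds = xr.open_dataset("narclim15_tasmax_monthly.nc")
tmax = ds["tasmax"].sel(lat=-33.87, lon=151.21, method="nearest")  # approx. Sydney

annual = tmax.resample(time="YS").mean()                  # annual means
baseline = annual.sel(time=slice("1990", "2009")).mean()  # 1990-2009 reference
anomaly = annual - baseline                               # as plotted in Figure 12
```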
Figure 12. Timeseries (from 1951 to 2099) of maximum temperature anomaly (calculated with respect to 1990–2009) for the five capital cities (a) Sydney, (b) Melbourne, (c) Canberra, (d) Brisbane and (e) Adelaide within the model domain. Here, red and blue solid lines denote the ensemble means of the NARCliM1.0 (N1.0) and NARCliM1.5 (N1.5) simulations, respectively, and the shading denotes the maximum and minimum across the ensemble members. The black solid line represents the AGCD observations.
Figure 13. Timeseries (from 1951 to 2099) of minimum temperature (°C) for the five capital cities (a) Sydney, (b) Melbourne, (c) Canberra, (d) Brisbane and (e) Adelaide within the model domain. Here, red and blue solid lines denote the ensemble means of the NARCliM1.0 (N1.0) and NARCliM1.5 (N1.5) simulations, respectively, and the shading denotes the maximum and minimum across the ensemble members. The black solid line represents the AGCD observations.
Figure 14. Timeseries (from 1951 to 2099) of precipitation (mm/mon) for the five capital cities (a) Sydney, (b) Melbourne, (c) Canberra, (d) Brisbane and (e) Adelaide within the model domain. Here, red and blue solid lines denote the ensemble means of the NARCliM1.0 (N1.0) and NARCliM1.5 (N1.5) simulations, respectively, and the shading denotes the maximum and minimum across the ensemble members. The black solid line represents the AGCD observations.
The N1.5 maximum temperature timeseries aligns well with the N1.0 epochs and fills the temporal gaps. The continuous, longer N1.5 simulations are also comparable with observations (AGCD) for all the capital cities. However, focusing on the far-future period (2060–2079), the N1.5 simulations project larger warming than the N1.0 simulations for all the capital cities except Melbourne and Adelaide, where the projections from N1.0 and N1.5 are comparable. Similar results are observed for the timeseries of minimum temperature (Figure 13) and precipitation (Figure 14), where the changes for the different cities compare well with both observations and the N1.0 simulations, reinforcing that the two NARCliM iterations can be used together for climate analyses.
Summary

In summary, the evaluation of the historical climate revealed that the N1.0 and N1.5 simulations are generally cold-biased for maximum temperature, slightly warm-biased for minimum temperature, and overestimate precipitation. However, model performance in the N1.0 and N1.5 simulations varies considerably between seasons.
Previous studies have suggested that some of the cold bias in maximum temperature in the N1.0 simulations might be partly caused by the unified Noah land surface physics scheme used in these simulations (Ji et al., 2016; Olson et al., 2016). This scheme has previously been shown to have cold biases over snow-covered regions in winter and to simulate overly high summertime soil moisture and evaporation (García-Díez et al., 2015). Although snow covers only a small proportion of the land surface in south-eastern Australia during the cool months, overestimated soil moisture is a potential explanation for the RCMs' cold biases. This hypothesis was investigated by Di Virgilio et al. (2019b), who evaluated the N1.5 reanalysis-driven simulations over the Australian continent and found a strong negative correlation between mean monthly precipitation biases and mean monthly maximum temperature biases over most of Australia, including the NARCliM inner domain, suggesting that precipitation overestimation is a likely cause of the large cold bias in maximum temperature in these simulations.
In terms of the comparison between N1.0 and N1.5, the simulated maximum temperatures in N1.5 show larger biases for all four seasons over most regions. This is attributed to the spread in the sign of the biases in N1.0, which produces a comparatively smaller overall ensemble bias in N1.0 than in N1.5.
Both the N1.0 (except the R3 RCMs) and N1.5 simulations show close agreement with observed minimum temperatures, with fairly small biases. Previous studies have noted that this may partially stem from their use of the Mellor-Yamada-Janjic local planetary boundary layer (PBL) scheme, which was found to contribute to an accurate simulation of minimum temperature over southern Spain (Argueso et al., 2011). For minimum temperature, the N1.5 simulations show results similar to N1.0, with no substantial improvement in the overall bias.
For precipitation, both the N1.0 and N1.5 simulations show a widespread strong wet bias. Previous studies have noted that some of the precipitation biases in the N1.0 simulations are inherited from the driving GCMs (Di Luca et al., 2016a; Olson et al., 2016). Another contributor to the wet biases in the N1.0 and N1.5 simulations may be the WRF model setup chosen for these simulations: the RCMs selected for N1.0 and N1.5 were chosen based on their skill in simulating selected two-week periods around several storm events, rather than their performance at the climatological scale (Evans et al., 2012; Olson et al., 2016). In addition, previous studies have noted that, under several parameterisations, the model shows wet biases due to the absence of radiative effects of unresolved cumulus (García-Díez et al., 2015).
However, in contrast to N1.0, the precipitation biases are smaller in the N1.5 simulations. This is again attributed to the driving GCMs in N1.5, most of which show smaller biases than the driving GCMs in N1.0. Previous studies have also suggested that the finer scales of the CMIP5 GCMs lead to more accurate estimation of precipitation than in CMIP3 models (Gulizia & Camilloni, 2015). Simulating precipitation accurately has always been a challenging task for models (Li et al., 2016; Potter et al., 2020), and the N1.5 simulations show a substantial improvement in capturing the seasonal patterns and magnitudes of precipitation. These results indicate that the N1.5 simulations improve on N1.0 in simulating the historical climate, at least for precipitation.
Future projections of precipitation in the N1.0 and N1.5 simulations show contrasting results. For DJF and MAM, the N1.0 simulations project very large increases in precipitation over most of the domain, whereas the N1.5 simulations project future decreases over most of the domain. For SON and JJA, the N1.5 simulations project very large decreases in precipitation, whereas the N1.0 ensemble projects uniformly negligible changes (except for parts of Victoria). These major differences are partly explained by the respective driving GCMs of the N1.0 and N1.5 simulations.
The patterns of future changes in both maximum and minimum temperature are comparable between the N1.5 and N1.0 simulations. Both sets of simulations project future warming throughout the domain; however, the magnitude of warming is stronger in the N1.5 simulations. This is understandable, as the N1.5 simulations use a higher emission scenario.
Conclusions

This paper evaluates the ability of the six RCM simulations within the second iteration of NARCliM (N1.5) to simulate mean precipitation, maximum temperature and minimum temperature, placed in the context of the performance of the first generation of NARCliM (N1.0) in simulating these variables. It is important to reiterate that differences between the performance of the two ensembles are due to several factors. One is the number of GCM-RCM combinations used in the N1.0 and N1.5 simulations (12 and 6, respectively). This difference can partially affect the ensemble results, as the 12 ensemble members in N1.0 capture a larger spread than the 6 members in N1.5 and thus give a more inclusive result.
Another reason for the differences between N1.0 and N1.5 in their evaluation against observations could be the improved sophistication of the driving GCMs in the N1.5 simulations. However, while the CMIP5 GCMs have undergone further development compared to the CMIP3 GCMs, their resulting simulation of climate remains similar in most respects (Flato et al., 2013; Knutti & Sedláček, 2013; Moise et al., 2015).
The evaluation of N1.0 and N1.5 for the historical climate revealed that N1.5 precipitation is closer to observations than N1.0. Model agreement with observations of the historical climate provides one means of assigning model confidence: we assert that accurate representation of the past climate is a necessary, though not sufficient, condition for a model to produce reliable future projections. Under this hypothesis, the N1.5 projections are more reliable for the future period; however, this is not guaranteed.
This paper also presented combined results of N1.0 and N1.5, that is, N1.X, for both the historical climate and future projections. This joint N1.X result has its own limitations. For example, the ensemble merges RCMs driven by different generations of CMIP, and for the future projections it merges two emission scenarios (SRES A2 and RCP8.5) that are based on different hypotheses and carbon emissions. The ensemble is also weighted toward the N1.0 simulations, as the number of ensemble members in N1.5 is only 50% of that in N1.0. Despite these limitations, the N1.X ensemble still demonstrates the complementary utility of N1.5 with the original N1.0 and the underlying objective that the N1.5 simulations do not replace the N1.0 simulations but rather complement them. Together, N1.0 and N1.5 provide a more complete sampling of the future change space.
N1.5 uses the same RCMs as the N1.0 simulations (i.e., no change in model parameterization) and has similar biases to N1.0 (wet and cold biases, albeit with significant quantitative improvement). As future work, we suggest a rigorous evaluation of other RCM configurations, inclusive of recent parameterization developments, which may provide substantial improvement in capturing the observed climate variables. We are currently working toward the third iteration of NARCliM simulations, N2.0, to overcome most of the limitations of the N1.5 and N1.0 ensembles.
Acknowledgments

This work is made possible by funding from the NSW Climate Change Fund for the NSW and ACT Regional Climate Modeling (NARCliM) Project. The modeling work was undertaken on the National Computational Infrastructure (NCI) high performance computers in Canberra, Australia, which are supported by the Australian Commonwealth Government. We thank the climate modeling groups for producing and making available their model output, the Earth System Grid Federation (ESGF) for archiving the data and providing access, and the multiple funding agencies that support CMIP3, CMIP5, and the ESGF.
Data Availability Statement

NARCliM1.0 and NARCliM1.5 data that support the findings of this study are publicly available via the NSW Climate Data Portal.
© 2021. This work is published under the Creative Commons Attribution-NonCommercial 4.0 License (http://creativecommons.org/licenses/by-nc/4.0/).
Abstract
The NARCliM project contributes to the CORDEX initiative for Australasia. The first generation of NARCliM (N1.0) used CMIP3 global climate models (GCMs) and provided near- and far-future estimates of climate change across Australasia at 50-km and southeast Australia at 10-km resolution under a business-as-usual emission scenario. However, the multiple 20-year periods in N1.0 did not permit analysis of long-term, interannual to decadal trends across the 21st century. Feedback on user needs for regional climate information revealed the desire for multiple emission scenarios and the use of newer CMIP5 GCMs for dynamical downscaling. These limitations led to the development of the second iteration of NARCliM, namely NARCliM1.5 (N1.5). The N1.5 downscaling exercise uses CMIP5 GCMs and is temporally expanded to cover 150 years (1950–2100) for two future Representative Concentration Pathways (RCP4.5 and RCP8.5). The N1.5 simulations remain at 50-km and 10-km resolution over the same domains as N1.0, thus producing an expanded and complementary data set for regional climate change. The N1.5 simulations substantially improve over N1.0 in capturing the seasonal patterns and magnitudes of precipitation, including improvements in overall bias. Conversely, N1.5 shows results similar to N1.0 for maximum and minimum temperature, with no substantial improvement in overall bias. N1.5 projects a hotter and drier future relative to N1.0. The combined N1.0 and N1.5 ensemble provides a wider spread of future climates, more representative of that found in the full CMIP5 ensemble. Together, the N1.0 and N1.5 ensembles provide an improved, more comprehensive data set for studying climate change.
1 Science, Economics and Insights Division, NSW Department of Planning, Industry and Environment, Sydney, NSW, Australia
2 Climate Change Research Centre, University of New South Wales, Sydney, NSW, Australia; Australian Research Council Centre of Excellence for Climate Extremes, University of New South Wales, Sydney, NSW, Australia
3 Science, Economics and Insights Division, NSW Department of Planning, Industry and Environment, Sydney, NSW, Australia; Climate Change Research Centre, University of New South Wales, Sydney, NSW, Australia; Australian Research Council Centre of Excellence for Climate Extremes, University of New South Wales, Sydney, NSW, Australia
4 Science, Economics and Insights Division, NSW Department of Planning, Industry and Environment, Sydney, NSW, Australia; Australian Research Council Centre of Excellence for Climate Extremes, University of New South Wales, Sydney, NSW, Australia