
## Challenges in LSM representation

Land surface physics encompasses an extensive collection of complex processes. The balance between model complexity and resolution, subject to computational limitations, is a fundamental question in the development of LSMs. As the understanding of physical phenomena improves, an LSM can grow into a more complex model by incorporating new processes observed in the environment. The following subsections present some of the difficulties encountered in most LSMs.

**Surface heterogeneity**

The soil is a complex environment. Its composition includes inorganic and organic particles that determine its inherent characteristics. Its density and its ability to retain water vary drastically with the proportions of its primary components (sand, silt, or clay) and with its structure (ash, fine, or coarse sand). Several soil types, each with its own characteristics and in particular its own water content, may coexist within a small area. Omitting surface spatial heterogeneity in an LSM can cause errors in flux estimation (Courault et al., 2010; Olioso et al., 2005). Accounting for spatial variations in surface properties is therefore essential for an accurate simulation of the land surface fluxes: sub-grid-scale land surface heterogeneity must be parameterized in the surface scheme so that the land characteristics are represented in the model (Manrique et al., 2013).

**Numerical representation**

Turning a physical model into numerical software extends from the physical world to a mathematical model, then to a computational algorithm, and finally to a computer implementation. Each step involves approximations: physical effects may be discarded, continuous functions replaced by discretized ones, and real numbers replaced by finite-precision representations. Approximation is therefore at the core of scientific software; it cannot be neglected and must be managed judiciously.

The accuracy of a computation measures how close the result (affected by random errors) comes to the true mathematical value; it therefore indicates the correctness of the result. In particular, numerical verification, as Rump (1983) noted, is required to give confidence that the computed results are acceptable. The precision of a computation reflects the exactness of the computed result without reference to the meaning of the computation; it is the number of significant digits not affected by round-off error. Arithmetic expressions and variable assignments always produce approximation errors, owing to the nature of floating-point arithmetic. The rounding mode of the floating-point arithmetic determines the precision of the operations performed by the coded model; this precision is independent of the code, the data, or the machine. When building the numerical representation of an LSM, we must be aware of these errors and track their propagation.
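As a minimal illustration (in Python, purely for exposition), even a trivial accumulation shows how round-off propagates:

```python
import math

# Illustrative only: floating-point round-off in a repeated sum.
# 0.1 has no exact binary representation, so repeated addition
# accumulates a small error.
total = 0.0
for _ in range(10):
    total += 0.1

print(total == 1.0)             # False: the accumulated sum is not exactly 1.0
print(abs(total - 1.0) < 1e-9)  # True: but the error is tiny

# math.fsum tracks exact partial sums and recovers the true value here.
print(math.fsum([0.1] * 10) == 1.0)  # True
```

Tracking such errors through thousands of model operations is what makes numerical verification of an LSM non-trivial.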

**Mathematical representation and model calibration**

Representing a physical model requires the definition of its numerical and discretized form. For the sake of simplicity, models should include the smallest number of parameters needed to achieve good performance in their estimates (parsimonious models). Model building is best achieved by starting with the simplest structure and gradually increasing the complexity as needed to improve model performance (Wainwright et al., 2004). However, there is no metric that quantifies the improvement in the estimates gained by increasing complexity.

The parameters required to estimate real-world outputs are best defined by a model structure that faithfully represents the processes measured in the real world. In practice, this can be difficult to achieve. We may, for instance, want to reconstruct past events that require parameters which are impossible to measure; in that case we must make reasonable assumptions based on indirect evidence. Such assumptions are needed to define the model from reality. Several of them may well be wrong; nonetheless, they are necessary for model development, and the output of the model depends entirely on their validity and scope. Which parameters to measure should be chosen according to the impact their variation has on the model output, that is, the sensitivity of the model to each parameter.
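One common scalar measure of this sensitivity (a standard convention, not taken from the thesis itself) is the normalized sensitivity coefficient, which compares the relative change of an output $y$ to the relative change of a parameter $p$:

```latex
S_p = \frac{\partial y}{\partial p} \cdot \frac{p}{y}
```

Parameters with large $|S_p|$ are natural candidates for careful measurement or calibration.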

According to Kirkby et al. (1992) there are two types of parameters in a model: physical parameters, which define the physical structure of the model, and process parameters, or multiplying factors, which weigh the magnitude of variables in the model. Physical parameters are determined from experimental measurements; process parameters are obtained through a calibration and adjustment process. In both cases, defining the initial parameter value can be difficult. Physical parameters are determined at small scales and then extrapolated, given the spatial and temporal variability of the region under study.

To adapt parameter values so that the model reproduces the real world, measurements of the phenomenon are needed for comparison with the model estimates. Calibration is an optimization process driven by a measure of the agreement between model results and a set of observations. It brings the model into agreement with the available measurements; in addition, it may reveal poorly defined processes in the model (Pipunic et al., 2008).
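As a schematic sketch of this idea (a hypothetical one-parameter toy model, not SECHIBA), calibration reduces to minimizing the squared misfit between model output and observations:

```python
# Hypothetical toy calibration (illustrative only): fit a single process
# parameter k of a linear "model" y = k * x to measurements by minimizing
# the sum of squared differences between model output and observations.
def cost(k, xs, obs):
    return sum((k * x - y) ** 2 for x, y in zip(xs, obs))

xs = [1.0, 2.0, 3.0, 4.0]
obs = [2.1, 3.9, 6.2, 7.8]   # synthetic measurements around a "true" k of 2

# For this quadratic cost the least-squares minimizer has a closed form;
# a real calibration system would search for it iteratively.
k_opt = sum(x * y for x, y in zip(xs, obs)) / sum(x * x for x in xs)
print(round(k_opt, 2))       # close to 2
```

In a real LSM the cost surface is high-dimensional and non-quadratic, which is why iterative, gradient-based methods such as variational assimilation are used instead of a closed form.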

With respect to LSMs, many studies have focused on calibrating models against soil moisture measurements, since soil moisture is easy to observe, is measured directly at high frequency, and is the solution of the water budget. Many data sources are available across a wide range of ecosystems.

**Thesis Challenges**

**State of the art in the use of LST to constrain LSM**

Several studies on the calibration of LSMs against LST measurements have demonstrated an improvement in flux estimation when model parameters are constrained.

In Castelli et al. (1999), a variational data assimilation approach (the adjoint technique) is used to include the surface energy balance in the estimation procedure as a physical constraint. The authors work with satellite data, directly assimilating soil skin temperature. They conclude that constraining the model with such observations improves the model flux estimates with respect to in situ measurements.

In Huang et al. (2003), the authors developed a one-dimensional land data assimilation scheme based on the ensemble Kalman filter, used to improve the estimation of the soil temperature profile. They conclude that assimilating LST into land surface models is a practical and effective way to improve the estimation of land surface state variables and fluxes.

Reichle et al. (2010) assimilated satellite-derived skin temperature observations using an ensemble-based, offline land data assimilation system. Their results suggest that the retrieved fluxes provide modest but statistically significant improvements. However, they noted strong biases between LST estimates from in situ observations, land modeling, and satellite retrievals, varying with season and time of day, and highlighted the importance of treating these biases properly, since otherwise large errors in surface flux estimates can result. Ghent et al. (2011) investigated the impact of data assimilation on terrestrial feedbacks to the climate system. The authors note that representing highly complex biophysical processes over highly heterogeneous land surfaces with a limited set of mathematical equations, together with a tendency toward over-parameterization, introduces a degree of uncertainty into LSM predictions. Assimilating LST updates quantities simulated by the model so as to reduce the error in the model formulation, with the correction derived from the respective weights of the uncertainties of the model predictions and of the observations. In their experiments, assimilation of LST helped to constrain simulations of soil moisture and surface heat fluxes, and the results suggest that LST has the potential to act as a surrogate for assimilating other state variables into a land surface scheme.

Ridler et al. (2012) tested the effectiveness of using satellite estimates of radiometric surface temperature and surface soil moisture to calibrate a Soil–Vegetation–Atmosphere Transfer (SVAT) model, based on error minimization of the temperature and soil moisture model outputs. Flux simulations were better when the model was calibrated against in situ surface temperature and surface soil moisture than against satellite estimates of the same variables.

In Bateni et al. (2013), the full heat diffusion equation is employed as a constraint in the variational data assimilation scheme. Deviation terms for the evaporative fraction and a scale coefficient are added as penalties in the cost function. A weak constraint is applied, accounting in this way for model error, and the cost function also contains a term penalizing deviations from prior values. When assimilating LST into the model, the authors showed that the heat diffusion coefficients are strongly sensitive to the deep soil temperature. In conclusion, the assimilation of LST can bring a remarkable improvement in the simulated fluxes.
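In the generic 4D-Var form (standard notation, not the exact formulation of Bateni et al.), such a cost function combines a background term penalizing deviation from the prior $\mathbf{x}_b$ with observation misfit terms:

```latex
J(\mathbf{x}) =
  \frac{1}{2} (\mathbf{x} - \mathbf{x}_b)^{\mathrm{T}} \mathbf{B}^{-1} (\mathbf{x} - \mathbf{x}_b)
  + \frac{1}{2} \sum_{i} \bigl( H_i(\mathbf{x}) - \mathbf{y}_i \bigr)^{\mathrm{T}}
    \mathbf{R}_i^{-1} \bigl( H_i(\mathbf{x}) - \mathbf{y}_i \bigr)
```

where $\mathbf{B}$ and $\mathbf{R}_i$ are the background and observation error covariance matrices, $\mathbf{y}_i$ the observations at time $i$, and $H_i$ the operator mapping the model state to observation space. The relative weights of the two terms encode the respective uncertainties of the prior and of the observations.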

**General objectives**

In this work, the LSM used is ORCHIDEE (Krinner et al., 2005), and more specifically the part of the model computing the energy and hydrology budgets (SECHIBA, Ducoudré et al., 1993). These models are introduced in Chapter 2.

The general objective of this thesis is to constrain the SECHIBA model parameters by assimilating measurement products in a 4D-Var assimilation system. Once constrained, the parameters allow the model to improve its state variable estimates when compared to measurements.

From this general purpose, several specific objectives arise as mandatory steps toward an effective assimilation system, flexible enough to assimilate different observations while constraining different model parameters at the same time. These specific objectives are:

1. Study SECHIBA and implement it in YAO: understanding the model physics through its standard Fortran implementation is a mandatory step in order to extract the model dynamics and principal components. With this knowledge, SECHIBA can be implemented in YAO by defining a modular graph representing the dynamics and physics of the model. Our implementation is called SECHIBA-YAO 1D. Once the model is coded, the direct model is verified by comparing its output with the original model, and the adjoint model is verified by performing a sensitivity analysis, which additionally yields a hierarchy of the parameters most influential in the estimation of land surface temperature. SECHIBA-YAO 1D is designed to run 4D-Var assimilation.

2. Validate the assimilation system by implementing twin experiments. The idea is to test the robustness of the assimilation system by evaluating variable and parameter performance. This phase also highlights the limits of the model when the control parameter set is varied.

3. Improve the model estimates by performing a 4D-Var assimilation of land surface temperature, using in situ measurements from the SMOSREX site near Toulouse, France. Available brightness temperature measurements are compared with an equivalent temperature estimate added to SECHIBA, constraining model parameters to improve the simulation of model variables such as latent heat flux, sensible heat flux, net radiation, brightness temperature, and soil moisture.

**Organization**

The thesis is organized such that the theoretical support is presented first, in Chapters 1 to 4. Experiment results concerning sensitivity analysis and variational data assimilation are presented in Chapters 5 to 7. Conclusions of the thesis are presented in Chapter 8. Finally, complementary information is presented in the Appendix section.

Chapter 1 introduces land surface models and the core challenges of the thesis. Chapter 2 presents the land surface model used in this work (ORCHIDEE), and more specifically SECHIBA with its main components and features; the equations governing the energy and hydrologic budgets computed by SECHIBA are listed. In addition, the data sources used in this work are introduced: FLUXNET network stations and SMOSREX in situ measurements.

Chapter 3 concerns the theoretical aspects of variational data assimilation. Additionally, the modular graph approach to representing models is presented, explaining how an equivalent of the adjoint and tangent linear models is obtained by computing the forward and backward sweeps of the model through a modular graph decomposition. This approach is the basic idea behind the YAO software, which serves as the framework for implementing variational data assimilation with SECHIBA.
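A minimal sketch of this idea (illustrative Python, not YAO's actual interface): each module implements a forward map and a backward map applying the transpose of its local Jacobian, and chaining the backward maps in reverse order yields the adjoint of the composed model:

```python
# Minimal modular-graph sketch (not YAO's actual API): each module exposes
# a forward map and a backward map that applies the transpose of its local
# Jacobian. Chaining backward maps in reverse order gives the adjoint.
class Square:
    def forward(self, x):
        self.x = x            # store input for the backward sweep
        return x * x
    def backward(self, grad_out):   # d(x^2)/dx = 2x
        return 2.0 * self.x * grad_out

class Scale:
    def __init__(self, a):
        self.a = a
    def forward(self, x):
        return self.a * x
    def backward(self, grad_out):   # d(a*x)/dx = a
        return self.a * grad_out

modules = [Square(), Scale(3.0)]    # composed model: f(x) = 3 * x^2

x = 2.0
y = x
for m in modules:                   # forward sweep through the graph
    y = m.forward(y)

grad = 1.0
for m in reversed(modules):         # backward sweep: adjoint of f
    grad = m.backward(grad)

print(y, grad)                      # f(2) = 12 and f'(2) = 6*2 = 12
```

The attraction of this decomposition is that each module's backward map is simple and local, while the composition automatically produces the gradient of the whole model, exactly what a variational assimilation scheme needs.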

Chapter 4 presents the YAO approach. This software served as a semi-automatic adjoint generator. The principal components of a YAO project are introduced, as well as the input/output data management. Finally, a general guide to how the SECHIBA model was implemented in YAO is given; the steps from model conceptualization to the testing phase are explained in detail, serving as a guide for future implementations.

In Chapter 5, once the adjoint of SECHIBA has been obtained through its YAO representation, we perform a variational sensitivity analysis to validate the adjoint of the model, comparing the gradients obtained with SECHIBA-YAO 1D to those computed with finite differences from the direct model outputs. In addition, we build a parameter hierarchy to determine the most influential parameters in the computation of land surface temperature. The sensitivity study was performed using FLUXNET sites (Kruger Park and Harvard Forest).
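The underlying gradient test can be sketched as follows (a toy cost function, for illustration only): the ratio between a finite-difference approximation of the directional derivative and the analytic value should approach 1 as the step size decreases:

```python
# Sketch of a gradient test used to validate an adjoint: the ratio
# (J(p + eps*d) - J(p)) / (eps * <grad J, d>) should tend to 1 as eps
# shrinks. J here is a hypothetical scalar cost with a known gradient.
def J(p):
    return p[0] ** 2 + 3.0 * p[0] * p[1]

def grad_J(p):                      # analytic ("adjoint") gradient
    return [2.0 * p[0] + 3.0 * p[1], 3.0 * p[0]]

p = [1.0, 2.0]
d = [0.5, -0.3]                     # arbitrary perturbation direction
g_dot_d = sum(g * di for g, di in zip(grad_J(p), d))

for eps in (1e-2, 1e-4, 1e-6):
    fd = (J([p[0] + eps * d[0], p[1] + eps * d[1]]) - J(p)) / eps
    print(eps, fd / g_dot_d)        # ratio approaches 1 as eps shrinks
```

If the ratio fails to converge toward 1, the adjoint code disagrees with the direct model, which is precisely what the test is designed to expose.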

Chapter 6 presents twin experiments using the FLUXNET data set. Different scenarios were tested in order to assess the effect of assimilating synthetic observations of land surface temperature. This chapter shows the potential of our assimilation system when land surface temperature is used as the observation.
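A twin experiment can be sketched in miniature as follows (a hypothetical toy model standing in for SECHIBA): synthetic observations are generated from a known "true" parameter, the parameter is then degraded, and the assimilation should recover a value close to the truth:

```python
import random

# Toy twin experiment (illustrative, not SECHIBA): generate synthetic
# "observations" from a known true parameter, start from a perturbed
# first guess, and recover the truth by minimizing the misfit.
def model(k, t):                  # hypothetical stand-in for the LSM
    return k * t + 1.0

random.seed(0)
k_true = 2.5
times = [0.5 * i for i in range(20)]
obs = [model(k_true, t) + random.gauss(0.0, 0.05) for t in times]

def cost(k):
    return sum((model(k, t) - y) ** 2 for t, y in zip(times, obs))

# crude 1-D gradient descent from a degraded first guess
k = 1.0                           # perturbed prior
step = 1e-3
for _ in range(5000):
    g = (cost(k + 1e-6) - cost(k - 1e-6)) / 2e-6   # numerical gradient
    k -= step * g

print(round(k, 2))                # recovers a value close to k_true
```

Because the truth is known by construction, twin experiments measure the assimilation system itself, separately from model or observation error.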

Chapter 7 presents the assimilation of in situ measurements using the SMOSREX site forcing. Through different scenarios, this chapter evaluates the performance of assimilating land surface temperature with different initial parameter values, time frames, and other settings. The aim is to show whether the optimization against land surface temperature allows us to constrain the model parameters and to better simulate surface fluxes.

**Description of the Land Surface Model ORCHIDEE and datasets**

**ORCHIDEE**

ORCHIDEE is a model of the continental biosphere and its different processes, simulating soil and vegetation mechanisms and the different fluxes across the soil–atmosphere interface (Polcher et al., 1998; Krinner et al., 2005; Brender, 2012). ORCHIDEE operates on different time scales: energy and matter exchanges are computed at a 30-minute time step, while species competition processes run at a 1-year time step. The vegetation is grouped into 13 Plant Functional Types (PFTs). The equations governing the processes are general, with specific parameters for each PFT. ORCHIDEE is used here in grid-point mode (one given location), forced with the corresponding local half-hourly gap-filled meteorological measurements.

**Modules**

SECHIBA (Schématisation des Echanges Hydriques à l’Interface Biosphère-Atmosphère) (Ducoudré et al., 1993) is a biophysical model. It calculates the radiation and energy budgets of the surface, and the soil water budget, every half hour. The energy and water fluxes between the atmosphere and the ground integrate all the vegetation layers; the retrieved temperature represents the canopy ensemble and the soil surface. The main quantities modeled are the sensible and latent heat fluxes between the atmosphere and the biosphere, the soil temperature, the evolution of the water reservoirs, the stomatal conductance, and the gross primary productivity of the canopy.
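In its standard textbook form (generic notation, not necessarily SECHIBA's exact formulation), the surface energy budget solved at each time step links net radiation to the turbulent and ground heat fluxes:

```latex
R_n = H + LE + G,
\qquad
R_n = (1 - \alpha)\, SW^{\downarrow} + \varepsilon \left( LW^{\downarrow} - \sigma T_s^{4} \right)
```

where $R_n$ is the net radiation, $H$ the sensible heat flux, $LE$ the latent heat flux, $G$ the ground heat flux, $\alpha$ the surface albedo, $\varepsilon$ the emissivity, $\sigma$ the Stefan-Boltzmann constant, and $T_s$ the surface temperature. The dependence of $R_n$ on $T_s^4$ is what makes surface temperature such a strong constraint on the budget.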

STOMATE (Saclay Toulouse Orsay Model for the Analysis of Terrestrial Ecosystems) is a biogeochemical model. It represents processes related to the carbon cycle, such as carbon dynamics, the allocation of photosynthesis products (Friedlingstein et al., 1999), growth and maintenance respiration, heterotrophic respiration (Ruimy et al., 1993), and phenology (Botta, 1999). STOMATE simulates the dynamics of continental carbon at a daily time step. It links the short-time-scale processes computed by SECHIBA with the slower processes described by the following module.

LPJ (Lund-Potsdam-Jena) (Sitch et al., 2003) is a model of the global dynamics of vegetation. It incorporates interspecific competition for sunlight, fire occurrence, seedling establishment, and plant mortality, and derives from them the long-term dynamics of the vegetation (annual time step).

**Biosphere Characterization**

The surface model SECHIBA aims at representing the water and energy exchanges at the land surface. For a given moisture condition, however, these exchanges are highly dependent on soil type and vegetation cover. ORCHIDEE accounts for the diversity of a given ecosystem by defining 13 Plant Functional Types (PFTs), classifying the vegetation according to its ecophysiological characteristics. Twelve common PFTs exist, plus bare soil; they are presented in Table 2.1. The classification depends on several criteria such as the habit of the plant (tree or herb), the type of leaf (needleleaf or broadleaf), the photosynthetic pathway (C3 or C4), and the phenology type.

**Table of contents:**

**CHAPTER 1 INTRODUCTION**

**LAND SURFACE MODELS. OBJECTIVES AND ORGANIZATION OF THE THESIS**

1.1 INTRODUCTION

1.2 COMPONENTS OF LAND SURFACE MODELS

1.2.1 WATER PROCESSES

1.2.2 SOIL THERMODYNAMICS

1.3 IMPORTANCE OF REPRESENTING THE PHYSICS OF THE SOIL SURFACE CORRECTLY

1.4 CHALLENGES IN LSM REPRESENTATION

1.4.1. SURFACE HETEROGENEITY

1.4.2 NUMERICAL REPRESENTATION

1.4.3. MATHEMATICAL REPRESENTATION AND MODEL CALIBRATION

1.5 THESIS CHALLENGES

1.5.1. STATE OF THE ART IN THE USE OF LST TO CONSTRAIN LSM

1.5.2 GENERAL OBJECTIVES

1.5.3 ORGANIZATION

**CHAPTER 2**

**DESCRIPTION OF THE LAND SURFACE MODEL ORCHIDEE AND DATASETS**

2.1. ORCHIDEE

2.1.1 MODULES

2.1.2 BIOSPHERE CHARACTERIZATION

2.2 SECHIBA

2.2.1 FORCING

2.2.2 ENERGY BUDGET

2.2.3 HYDROLOGICAL BUDGET

2.2.4. SECHIBA PARAMETERS

2.3 DATA

2.3.1. EDDY COVARIANCE MEASUREMENTS

2.3.2 SMOSREX

**CHAPTER 3**

**THEORETICAL PRINCIPLES OF VARIATIONAL DATA ASSIMILATION**

3.1 INTRODUCTION AND NOTATION

3.2 ADJOINT METHOD

3.3. REPRESENTING A MODEL AND ITS ADJOINT THROUGH A MODULAR GRAPH

3.3.1. DEPLOYMENT OF A MODULAR GRAPH

3.4. DIAGNOSTIC TOOLS FOR THE ASSIMILATION SYSTEM

3.4.1 TEST THE CORRECTNESS OF THE ADJOINT MODEL

3.4.2. TEST THE CORRECTNESS OF THE COST FUNCTION GRADIENTS

3.4.3. DERIVATIVE TEST

3.5. SUMMARY

**CHAPTER 4**

**THE YAO APPROACH: THEORETICAL ASPECTS AND IMPLEMENTATION OF SECHIBA-YAO 1D**

4.1 INTRODUCTION

4.2 YAO APPROACH

4.3. CREATING A PROJECT WITH YAO

4.3.1 INPUT / OUTPUT MANAGEMENT

4.3.2 DIAGNOSTIC TOOLS FOR THE GENERATED PROJECT

4.4. DEVELOPMENT OF SECHIBA-YAO 1D

4.4.1 IMPLEMENTATION OUTLINE

4.4.2 DIRECT MODEL VALIDATION

4.4.3 ADJOINT MODEL VALIDATION

**CHAPTER 5**

**SENSITIVITY ANALYSIS OF THE SECHIBA-YAO 1D MODEL USING FLUXNET DATASET**

5.1 INTRODUCTION

5.2 VARIATIONAL SENSITIVITY ANALYSIS

5.2.1. SENSITIVITY ANALYSIS WITH LAND SURFACE TEMPERATURE

**CHAPTER 6**

**TWIN EXPERIMENTS WITH SECHIBA-YAO 1D USING FLUXNET MEASUREMENTS**

6.1 INTRODUCTION

6.2 EXPERIMENT DEFINITION

6.3. RESULTS

6.3.1 EFFECT OF THE OBSERVATION SAMPLING

6.3.2 EFFECT OF RANDOM NOISE IN THE OBSERVATION

6.3.3 EFFECT OF THE CONTROL PARAMETER SET SIZE

6.4. DISCUSSION

**CHAPTER 7**

**REAL MEASUREMENTS STUDY USING SMOSREX DATASET**

7.1 INTRODUCTION

7.2 KEY PARAMETERS TO PERFORM THE OPTIMIZATION

7.3 LST DATA ASSIMILATION WITH PARAMETER STANDARD VALUES

7.3.1. SIMULATED VS. OBSERVED MEASUREMENTS

7.3.2. BRIGHTNESS TEMPERATURE SENSITIVITY ANALYSIS

7.3.3. BRIGHTNESS TEMPERATURE ASSIMILATION DURING A SINGLE DAY

7.3.4. BRIGHTNESS TEMPERATURE ASSIMILATION DURING A WEEK

7.3.5. DISCUSSION

7.4 LST VARIATIONAL DATA ASSIMILATION WITH DIFFERENT PRIOR VALUES

7.4.1. SIMULATED VS. OBSERVED MEASUREMENTS

7.4.2. BRIGHTNESS TEMPERATURE SENSITIVITY ANALYSIS

7.4.3. BRIGHTNESS TEMPERATURE ASSIMILATION DURING A SINGLE DAY

7.4.4. BRIGHTNESS TEMPERATURE ASSIMILATION DURING A WEEK

7.5. ANALYSIS OF THE ASSIMILATION SYSTEM THROUGH TWIN EXPERIMENTS

7.6 CONCLUSION

**CHAPTER 8**

**CONCLUSION AND PERSPECTIVES**