
Ecosystem Acoustic

Over a wide range of wavelengths, the ocean is opaque to electromagnetic radiation but relatively transparent to sound. Underwater, sound suffers far less attenuation than light. In clear ocean water, sunlight may be detectable (with instruments) down to 1000 m, but the range at which humans can see details of objects is rarely more than 50 m, and usually less. Sound waves, by contrast, can be detected over vast distances and are a much better vehicle for undersea information than light (Talley et al., 2011). However, although sound energy can be transmitted over long distances by periodic compression and expansion (waves), water is nevertheless an imperfect acoustic medium. Emitted sound energy is removed (scattered, backscattered and refracted) as it encounters suspended obstacles, be they solids, biota or entrained gas, or is simply converted into heat by physical absorption (Simmonds and MacLennan, 2005; Talley et al., 2011). Ocean stratification is the main physical structure responsible for the generation of convergent and/or divergent beams, through the change in sound speed with depth (Pensieri and Bozzano, 2017; Talley et al., 2011). These properties have motivated a growing community of researchers to explore sound as a tool for ocean measurement.
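The depth dependence of sound speed that drives this refraction can be illustrated with Medwin's (1975) simplified empirical formula, which is not part of the material cited above and is used here only as an illustrative sketch:

```python
def sound_speed(T, S, z):
    """Approximate speed of sound in seawater (m/s) using Medwin's (1975)
    simplified formula. T: temperature (deg C), S: salinity (psu),
    z: depth (m). Valid roughly for 0 <= T <= 35, 0 <= S <= 45, z <= 1000 m."""
    return (1449.2 + 4.6 * T - 0.055 * T ** 2 + 0.00029 * T ** 3
            + (1.34 - 0.010 * T) * (S - 35.0) + 0.016 * z)

# Warm, salty surface water vs. cooler water at depth: speed first drops
# with temperature, then rises again with pressure, creating the vertical
# gradients that refract sound beams.
print(round(sound_speed(25.0, 36.0, 0.0), 2))
print(round(sound_speed(10.0, 35.0, 800.0), 2))
```

The competition between the temperature and depth terms is what produces the sound-speed minimum exploited by long-range acoustic propagation.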
Acoustic Oceanography is defined as the use of sound to study physical parameters and processes, as well as biological species and behaviours, at sea (Medwin and Clay, 1997). References to underwater acoustics can be traced back to mediaeval times. Urick (1983) mentions a notebook, dated 1490, in which Leonardo da Vinci observed that by listening at one end of a long tube, with the other end in the sea, 'you will hear ships at a great distance'. However, practical applications had to await more advanced technology, notably the piezoelectric transducer invented by the French physicist Langevin in 1917 (Sabra, 2015). As a result of research instigated by the First World War, it was discovered that submarines could be detected by listening to the echo of a sound transmission. Since then, the benefits of underwater acoustics have grown in step with technological developments in both hardware and software, especially for oceanographic applications (e.g., Pensieri and Bozzano, 2017).
The many successes of underwater acoustics range from the identification, counting and monitoring of aquatic fauna to the characterisation of structures and physical processes (e.g., internal waves, ocean frontal systems, eddies) in the water column and of the shape of the sea floor (Medwin, 1997; Wood, 1935). In some cases, a natural sound in the sea is analysed to reveal the physical or biological characteristics of the sound source; this is called the passive acoustics approach (Medwin and Clay, 1997). Each source has unique spectral characteristics that can be used to classify its type (physical, biological, anthropogenic, atmospheric), even if background noise levels may differ between basins (Fig. 5).

Functional Data Analysis (FDA)

FDA is a branch of statistics that provides tools for describing and modelling sets of functions (or curves) rather than vectors of discrete values (Ramsay, 2006). The guiding idea of this approach is to describe data as parameterised functions, and to use these parameters for clustering, comparing or interpolating functions. In particular, classical statistical tools can be adapted to functional data, such as functional principal component analysis (fPCA), which summarises and characterises the significant modes of variation, in finite dimension, among a sample of curves (Dabo-Niang and Ferraty, 2008; Ramsay, 2006). Functional analysis of variance (fANOVA) uses all the information of each mean functional curve to test for differences between datasets, based on the shape and vertical (along-depth) variability of the curves (Cuevas et al., 2004; Ramsay, 2006).
Here, each temperature and salinity profile was treated as a separate function (curve), and we used a variety of functional statistical methods to define and characterise the 3D thermohaline structure. A flowchart summarising the FDA methodology is presented in Fig. 3 and described below. All analyses (i.e., fitting, clustering, and kriging) were performed separately for salinity and temperature profiles.

Fitting B-spline function

To apply FDA (Ramsay, 2006), the first step consists in transforming the data into functions (Fig. 3, step 1). To describe a vertical profile as a single entity, the existence of a smooth function $x$ giving rise to the observed data is assumed. It is expressed as:

$y_j = x(t_j) + \varepsilon_j$ (Eq. 3)

where the $y_j$ are the pointwise data $(t_1, y_1), \dots, (t_n, y_n)$, $t$ is depth, $y$ is temperature or salinity, and $\varepsilon_j$ is a remainder expected to be as small as possible. The function $x(t)$ is written as a linear combination of $K$ basis functions $\phi_k$, $k = 1, \dots, K$:

$x(t) = \sum_{k=1}^{K} c_k \phi_k(t)$ (Eq. 4)

where the $c_k$ are coefficients estimated by penalised regression on the pointwise data (Ramsay, 2006).
Spline functions are the most common choice of approximation system for smoothing non-periodic functional data: they combine the fast computation of polynomials with substantially greater flexibility. Therefore, a B-spline basis of degree 3 (de Boor, 2001) was used. Then, to reduce the impact of measurement noise when interpolating, a solution is to introduce a roughness parameter $\lambda$, such that the coefficients of $x$ are estimated by minimising the penalised squared error:

$\mathrm{PENSSE}_{\lambda} = \sum_{j} \left( y_j - x(t_j) \right)^2 + \lambda \int \left[ x''(t) \right]^2 dt$ (Eq. 5).
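As an illustration of Eq. 5, recent SciPy versions (>= 1.10) expose a cubic smoothing spline that minimises exactly this penalised criterion. The profile below is synthetic (a warm mixed layer over a thermocline), not ABRACOS data, and the roughness parameter is chosen arbitrarily:

```python
import numpy as np
from scipy.interpolate import make_smoothing_spline

# Synthetic noisy temperature profile (illustrative only):
# warm mixed layer, sharp thermocline near 80 m, cooler deep water.
rng = np.random.default_rng(0)
depth = np.linspace(0.0, 200.0, 60)                       # t_j (m)
true_T = 28.0 - 8.0 / (1.0 + np.exp(-(depth - 80.0) / 10.0))
y = true_T + rng.normal(0.0, 0.15, depth.size)            # y_j with noise

# Cubic smoothing spline minimising
#   sum_j (y_j - x(t_j))^2 + lam * int [x''(t)]^2 dt      (Eq. 5)
spline = make_smoothing_spline(depth, y, lam=50.0)

x_hat = spline(depth)
print(round(float(np.mean((x_hat - true_T) ** 2)), 4))    # error vs. true curve
```

In practice the roughness parameter would be tuned (e.g., by cross-validation) rather than fixed by hand.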

Comparing ABRACOS and ancillary datasets

Once the hydrographic profiles have been described with a relevant functional basis, the next step was to compare the ABRACOS and ancillary data. Indeed, we aimed at determining if these datasets could be merged to be representative of spring and fall canonical states in the study region. This question has been addressed in two steps (Fig. 3, steps 2 and 3) using the most recent ABRACOS observations as a reference.
First (Fig. 3, step 2), we characterised the ABRACOS profiles by applying fPCA and a functional hierarchical clustering (fCluster), performed on the coefficients of the fPCA decomposition (Febrero-Bande and Fuente, 2012). Reproducing the aim of PCA, fPCA summarises a multivariate dataset with principal components seen as linear combinations of the variables (the same applies to fCluster). The method is adapted to deal with functions rather than vectors, and the principal components correspond to the dominant modes of variation of the functional data (Shang, 2014). fPCA allowed us to identify the main patterns of variation of the vertical profiles, while fCluster was used to statistically define homogeneous groups of profiles. These groups were then plotted spatially to determine whether they correspond to specific areas.
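A minimal sketch of this step: fPCA computed by SVD on discretised curves, then Ward hierarchical clustering on the component scores. Synthetic profiles stand in for the ABRACOS data (the original analysis worked on B-spline coefficients with R tooling):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy stand-in for a set of profiles evaluated on a common depth grid:
# two regimes differing in thermocline depth (60 m vs. 110 m).
rng = np.random.default_rng(1)
depth = np.linspace(0.0, 200.0, 50)
shallow = 28.0 - 6.0 / (1.0 + np.exp(-(depth - 60.0) / 8.0))
deep = 28.0 - 6.0 / (1.0 + np.exp(-(depth - 110.0) / 8.0))
profiles = np.vstack(
    [shallow + rng.normal(0, 0.1, 50) for _ in range(10)]
    + [deep + rng.normal(0, 0.1, 50) for _ in range(10)]
)

# fPCA (sketch): PCA of the centred curve matrix via SVD;
# the right singular vectors are the dominant modes of vertical variation.
centred = profiles - profiles.mean(axis=0)
_, s, vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ vt[:2].T                 # coefficients on the first 2 modes
explained = s[:2] ** 2 / np.sum(s ** 2)     # share of variance per mode

# fCluster (sketch): Ward hierarchical clustering on the fPCA scores.
groups = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
print(explained.round(3), groups)
```

With well-separated regimes, the first mode captures most of the variance and the two clusters recover the two thermocline types.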
Second (Fig. 3, step 3), once homogeneous areas were defined from ABRACOS profiles, we applied a fANOVA to compare the ABRACOS and ancillary profiles present in each area. fANOVA is based on the so-called one-way analysis of variance for univariate functional data, using an L2-norm-based parametric bootstrap test for homoscedastic samples (Cuevas et al., 2004). This procedure considers $k$ groups of independent random functions $X_{ij}(t)$, $j = 1, \dots, n_i$, such that each function of group $i$ is assumed to be a stochastic process with mean function $m_i(t)$, and tests the null hypothesis $H_0: m_1(t) = \dots = m_k(t)$. This analysis was performed with the "fdanova" R package (Febrero-Bande and Fuente, 2012).
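A sketch of the L2-norm bootstrap test in Python (the original analysis used the R package; the statistic and resampling scheme below are a simplified reading of Cuevas et al. (2004), not the package's exact implementation):

```python
import numpy as np

def fanova_l2(groups, n_boot=500, seed=0):
    """One-way fANOVA sketch: L2 statistic
    V = sum_{i<j} n_i * sum_t (mean_i(t) - mean_j(t))^2,
    with a bootstrap of group-centred curves to approximate its null law."""
    rng = np.random.default_rng(seed)
    means = [g.mean(axis=0) for g in groups]
    ns = [g.shape[0] for g in groups]

    def stat(ms):
        return sum(ns[i] * np.sum((ms[i] - ms[j]) ** 2)
                   for i in range(len(ms)) for j in range(i + 1, len(ms)))

    v_obs = stat(means)
    resid = [g - m for g, m in zip(groups, means)]  # curves centred per group
    v_boot = []
    for _ in range(n_boot):
        ms = [r[rng.integers(0, len(r), len(r))].mean(axis=0) for r in resid]
        v_boot.append(stat(ms))
    return v_obs, float(np.mean(np.array(v_boot) >= v_obs))  # statistic, p-value

# Illustrative use with synthetic curves on a common grid:
t = np.linspace(0.0, 1.0, 30)
rng = np.random.default_rng(2)
g1 = np.sin(2 * np.pi * t) + rng.normal(0, 0.1, (15, 30))
g2 = np.sin(2 * np.pi * t) + 1.0 + rng.normal(0, 0.1, (15, 30))
v, p = fanova_l2([g1, g2], n_boot=200)
print(p)  # small p-value: the mean functions differ
```

In the homoscedastic case this mirrors the logic of the L2 test: a large observed V relative to its bootstrap null distribution leads to rejecting equality of the mean curves.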


Data characterisation and 3D spatial interpolation

In case of no rejection of the null hypothesis (Fig. 3, step 4), profiles from the different datasets can be merged to build a complete dataset representative of the spring and fall canonical states. fPCA, fCluster and fANOVA can then be applied to the complete merged dataset to characterise the profiles and their spatial variability (Fig. 3, step 5).
To interpolate temperature and salinity profiles in 3D (Fig. 3, step 6), we applied a functional geostatistical analysis (Giraldo et al., 2007). Geostatistics (or kriging methods) are well-known tools for model-based spatial interpolation that take into account the spatial autocorrelation of the estimated variables (Matheron, 1963). The fundamental premise of kriging is that spatial data constitute a joint realisation of spatially dependent random variables, collectively referred to as a random function. Ordinary kriging refers to spatial prediction under the assumption of stationarity, as specified by Cressie (1993) and Wackernagel (2003): the random function is assumed stationary, meaning that its expectation is independent of position. This framework has been generalised to the FDA context. Yet, because we are dealing with functions, the stumbling block remains the estimation of a spatial covariance between curves.
The functional geostatistical analysis was performed with the ordinary trace-kriging method (Giraldo et al., 2011) in order to describe the spatial autocorrelation structure of the functional data. The method is implemented in the "fdagstat" R package (https://github.com/ogru/fdagstat) and involved the two following steps: (i) the analysis of the spatial structure (i.e., through the calculation and fitting of a variogram); and (ii) the use of this structure to predict functional data at unsampled locations. A spherical model was used to fit the variograms of the temperature and salinity profiles.
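Step (ii) can be sketched as follows, assuming step (i) is already done (spherical trace-variogram parameters supplied by the user) and curves discretised on a common depth grid. This is a simplified stand-in for the "fdagstat" implementation:

```python
import numpy as np

def spherical(h, nugget, sill, rng_):
    """Spherical variogram model with range rng_ (zero at h = 0)."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
    return np.where(h >= rng_, sill, np.where(h == 0.0, 0.0, g))

def trace_krige(coords, curves, x0, nugget=0.0, sill=1.0, rng_=1.0):
    """Ordinary trace-kriging sketch (after Giraldo et al., 2011): solve the
    ordinary kriging system built from the trace-variogram, and return the
    predicted curve sum_i lambda_i * curve_i at location x0."""
    n = len(curves)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical(d, nugget, sill, rng_)
    A[n, n] = 0.0                                   # Lagrange multiplier row
    b = np.ones(n + 1)
    b[:n] = spherical(np.linalg.norm(coords - x0, axis=1), nugget, sill, rng_)
    lam = np.linalg.solve(A, b)[:n]                 # weights (sum to 1)
    return lam @ curves, lam

# Three hypothetical stations, each a profile sampled at two depths:
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
curves = np.array([[26.0, 18.0], [25.0, 17.0], [27.0, 19.0]])
pred, lam = trace_krige(coords, curves, np.array([0.5, 0.5]), sill=1.0, rng_=2.0)
print(lam.round(3), pred.round(2))
```

With a zero nugget the predictor is an exact interpolator: kriging at an observed station returns that station's curve.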
To evaluate the quality of the functional geostatistical models, we performed cross-validation analyses. We randomly removed 2% of the sampled curves (profiles) and applied the functional kriging predictor to the remaining 98% of curves to predict the removed ones. Observed profiles, fitted with K = 41 and K = 44 B-spline basis functions for temperature and salinity, respectively, were compared with the kriging-predicted curve at each site. This procedure was repeated enough times to cover all profiles sampled in situ. We used the mean squared difference (MSD), i.e., the mean of the squared error between predicted and observed profiles (over all depths), to evaluate the quality of the prediction. Finally (Fig. 3, step 7), we applied the criteria defining the MLD, the BLT, the upper thermocline depth, and the lower thermocline/pycnocline depth to the interpolated fields of temperature and salinity, to obtain a comprehensive vision of the thermohaline structure in the SWTA. In each area and for each season, we also calculated the BL occurrence frequency (BLF, in %), defined as the percentage of the area with BLT greater than 5 m (Zeng and Wang, 2017).
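The cross-validation loop can be sketched as below. To keep the sketch self-contained, a simple inverse-distance-weighted predictor stands in for the functional kriging predictor; the point is how the held-out folds cover all profiles and how the MSD is accumulated:

```python
import numpy as np

def msd_cross_validation(coords, curves, frac=0.02, seed=0):
    """Hold out `frac` of the profiles at a time, predict each held-out
    curve from the remaining ones, and return the mean squared difference
    (MSD) over all depths for every profile. An inverse-distance-weighted
    predictor stands in for the kriging step."""
    rng = np.random.default_rng(seed)
    n = len(curves)
    k = max(1, int(round(frac * n)))       # profiles removed per fold
    order = rng.permutation(n)             # folds partition all profiles
    msd = np.empty(n)
    for start in range(0, n, k):
        test = order[start:start + k]
        train = np.setdiff1d(np.arange(n), test)
        for i in test:
            d = np.linalg.norm(coords[train] - coords[i], axis=1)
            w = 1.0 / np.maximum(d, 1e-9) ** 2
            pred = (w[:, None] * curves[train]).sum(axis=0) / w.sum()
            msd[i] = np.mean((pred - curves[i]) ** 2)
    return msd

# Sanity check on a spatially constant field: predictions should match.
rng = np.random.default_rng(3)
coords = rng.uniform(0.0, 10.0, (20, 2))
curves = np.tile(np.linspace(26.0, 18.0, 15), (20, 1))  # identical profiles
msd = msd_cross_validation(coords, curves)
print(float(msd.max()))
```

The same loop applies unchanged with the kriging predictor substituted for the inverse-distance step.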

3D thermohaline patterns

From functional geostatistics, we characterised the 3D temperature and salinity fields for the canonical spring and fall states in the SWTA. The validation procedure (Fig. 7 for spring; see Sup. Fig. S2 for fall) shows that ordinary trace-kriging provides an acceptable estimation of temperature and salinity profiles at any location of the sampled domain, independently of the area (A1, A2 and A3). Examples of original profiles at randomly selected sites and their predicted equivalents are shown; for each profile, the root mean square of the differences between predicted and observed values (along the profile) was calculated to estimate the accuracy. The boxplots of the MSD (Fig. 7) and associated global statistics show that in most cases the fit was good, with similar predicted and observed smoothed curves. In some extreme cases the fit was not optimal (e.g., profiles 2 and 4 in Fig. 7), but the shape was preserved.

Table of contents :

1. Preface
2. The Southwestern Tropical Atlantic
3. Ecosystem Acoustic
4. Motivation and objectives
Objective 1: Characterise the 3D thermohaline structure of the SWTA in austral spring and fall by the functional statistics approach. (Chapter II)
Objective 2: Test if the variation of the thermohaline structure could be extracted from acoustic data in the SWTA. (Chapter III)
Objective 3: Investigate how environmental factors may be driving the vertical distribution of organisms in the southwestern tropical Atlantic. (Chapter IV)
1. Introduction
2. Material and Methods
2.1. Data
2.2. Defining the thermohaline structure
2.3. Functional Data Analysis (FDA)
3. Results
3.1. Merging the datasets
3.2. Typology of thermohaline patterns
3.3. 3D thermohaline patterns
4. Discussion
4.1. Thermohaline structure and associated processes
5. Conclusion
Declaration of Competing Interest
Supplementary material
From Chapter II to Chapter III
1. Introduction
2. Material and Methods
2.1. Data
2.2. Thermocline structure
2.3. Linking acoustic profiles to thermohaline properties
3. Results and Discussion
3.1. Thermohaline limits vs. cumulative sum of the acoustic echoes
3.2. Thermohaline limits vs. acoustic backscattering gradient
3.3. Wavelet approach
4. Conclusion
From Chapter III to Chapter IV
1. Introduction
2. Methods
2.1. Data
2.2. Data analysis
3. Results
3.1. Oceanscape
3.2. Epipelagic echoscape
3.3. Cross-correlations in the epipelagic layer
3.4. Mesopelagic echoscape
3.5. Cross-correlation in the mesopelagic layer
4. Discussion
4.1. Epipelagic layer
4.2. Mesopelagic layer
5. Conclusion
Supplementary Material
Objective 1: Characterise the 3D thermohaline structure of the SWTA in austral spring and fall by the functional statistics approach. (Chapter II)
Objective 2: Test if the variation of the thermohaline structure could be extracted from acoustic data in the SWTA. (Chapter III)
Objective 3: Investigate how environmental factors may be driving the vertical distribution of organisms in the southwestern tropical Atlantic. (Chapter IV)

