Bias Correction



Selected Abstracts


A Three-step Method for Choosing the Number of Bootstrap Repetitions

ECONOMETRICA, Issue 1 2000
Donald W. K. Andrews
This paper considers the problem of choosing the number of bootstrap repetitions B for bootstrap standard errors, confidence intervals, confidence regions, hypothesis tests, p-values, and bias correction. For each of these problems, the paper provides a three-step method for choosing B to achieve a desired level of accuracy. Accuracy is measured by the percentage deviation of the bootstrap standard error estimate, confidence interval length, test's critical value, test's p-value, or bias-corrected estimate based on B bootstrap simulations from the corresponding ideal bootstrap quantities for which B = ∞. The results apply quite generally to parametric, semiparametric, and nonparametric models with independent and dependent data. The results apply to the standard nonparametric iid bootstrap, moving block bootstraps for time series data, parametric and semiparametric bootstraps, and bootstraps for regression models based on bootstrapping residuals. Monte Carlo simulations show that the proposed methods work very well. [source]
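Andrews' three-step procedure itself is more involved, but the accuracy criterion it targets can be sketched numerically: compare a B-repetition bootstrap standard error against a very large B used as a stand-in for the ideal (B = ∞) quantity. A hedged illustration with an invented sample and statistic:

```python
import numpy as np

def bootstrap_se(data, stat, B, rng):
    """Bootstrap standard error of `stat` from B resamples."""
    n = len(data)
    reps = np.array([stat(data[rng.integers(0, n, n)]) for _ in range(B)])
    return reps.std(ddof=1)

rng = np.random.default_rng(0)
data = rng.normal(loc=0.0, scale=1.0, size=200)

# A very large B serves as a stand-in for the ideal bootstrap (B = infinity)
se_ideal = bootstrap_se(data, np.mean, 20000, rng)

# Percentage deviation of a cheap-B estimate from the "ideal" quantity
for B in (50, 200, 1000):
    se_B = bootstrap_se(data, np.mean, B, rng)
    pct_dev = 100.0 * abs(se_B - se_ideal) / se_ideal
```

For the sample mean the theoretical standard error here is about 1/√200 ≈ 0.07, so the large-B estimate should land near that value, with the percentage deviation shrinking as B grows.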


Prediction of sea surface temperature from the global historical climatology network data

ENVIRONMETRICS, Issue 3 2004
Samuel S. P. Shen
Abstract This article describes a spatial prediction method that predicts the monthly sea surface temperature (SST) anomaly field from land-only data. The land data are from the Global Historical Climatology Network (GHCN). The prediction period is 1880–1999, and the prediction ocean domain extends from 60°S to 60°N with a spatial resolution of 5° × 5°. The prediction method is a regression over the basis of empirical orthogonal functions (EOFs). The EOFs are computed from the following data sets: (a) the Climate Prediction Center's optimally interpolated sea surface temperature (OI/SST) data (1982–1999); (b) the National Climatic Data Center's blended product of land-surface air temperature (1992–1999) produced by combining the Special Satellite Microwave Imager and GHCN; and (c) the National Centers for Environmental Prediction/National Center for Atmospheric Research Reanalysis data (1982–1999). The optimal prediction method minimizes the first-M-mode mean square error between the true and predicted anomalies over both land and ocean. In the optimization process, the data errors of the GHCN boxes are used, and their contribution to the prediction error is taken into account. The area-averaged root mean square error of prediction is calculated. Numerical experiments demonstrate that this EOF prediction method can accurately recover the global SST anomalies during some circulation patterns and add value to the SST bias correction in the early history of SST observations and the validation of general circulation models. Our results show that (i) the land-only data can accurately predict the SST anomaly in El Niño months, when the temperature anomaly structure has very large correlation scales, and (ii) the predictions for La Niña, neutral, or transient months require more EOF modes because of the presence of small-scale structures in the anomaly field. Copyright © 2004 John Wiley & Sons, Ltd. [source]
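The regression-over-EOFs idea can be sketched in a few lines: compute EOFs by SVD, fit the mode amplitudes from "land" boxes only, and reconstruct the full field including the unobserved "ocean" boxes. This is a toy stand-in for the paper's optimal M-mode method; the dimensions and the noise-free rank-3 field are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic anomaly field: 120 months x 50 grid boxes built from 3 patterns
t, p, n_modes = 120, 50, 3
field = rng.normal(size=(t, n_modes)) @ rng.normal(size=(n_modes, p))
anom = field - field.mean(axis=0)

# EOFs are the right singular vectors of the (time x space) anomaly matrix
_, _, vt = np.linalg.svd(anom, full_matrices=False)
eofs = vt[:n_modes]

# Fit the mode amplitudes using only the first 20 "land" boxes, then
# reconstruct the whole field, including the unobserved "ocean" boxes
land = slice(0, 20)
month = anom[-1]
coef, *_ = np.linalg.lstsq(eofs[:, land].T, month[land], rcond=None)
predicted = coef @ eofs
```

Because the synthetic field is exactly rank 3, the land-only regression recovers the ocean boxes essentially perfectly; with noise and more modes (the paper's La Niña/transient cases) the fit degrades and more EOFs are needed.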


Comparison of Pb Purification by Anion-Exchange Resin Methods and Assessment of Long-Term Reproducibility of Th/U/Pb Ratio Measurements by Quadrupole ICP-MS

GEOSTANDARDS & GEOANALYTICAL RESEARCH, Issue 2 2009
Balz S. Kamber
A comparison between HBr-HCl and HBr-HNO3 based anion-exchange chemistry is presented to test the efficiency of Pb purification in the preparation of samples for isotope ratio measurement by ICP-MS. It was found that the small advantages in yield and blank offered by the HNO3-based method were more than compensated by the more effective matrix removal of the HCl-based method. Apart from very zinc-rich matrices (e.g., sphalerite), a careful single-pass purification using HBr and HCl removed more than 99.9% of the matrix. In preparation for the isotope ratio analysis, a small (2–5% m/v) liquid sample aliquot was analysed to determine U, Th and Pb concentrations by solution quadrupole ICP-MS. This allowed accurate prediction of the expected ion signal and permitted optimal spiking with Tl, if desired, for mass bias correction. Long-term results for international rock reference materials showed reproducibilities of better than 1% (Th/U) and 1.5% (U/Pb). For most geological applications, such analyses obviate the need for isotope dilution concentration measurements. [source]
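Tl spiking supports the widely used exponential-law mass bias correction, in which the bias coefficient inferred from the measured 205Tl/203Tl of the admixed spike is applied to the Pb ratios. A minimal sketch: the reference ratio and isotope masses are standard values, but the measured numbers are invented:

```python
import math

# Exponential-law mass bias correction using an admixed Tl spike.
# Reference values are standard; the "measured" ratios below are invented.
TL_TRUE = 2.3889                     # accepted 205Tl/203Tl (NIST SRM 997)
M205, M203 = 204.97443, 202.97234    # Tl isotope masses
M208, M206 = 207.97665, 205.97447    # Pb isotope masses

def beta_from_tl(tl_measured):
    """Mass-bias coefficient from the measured 205Tl/203Tl."""
    return math.log(TL_TRUE / tl_measured) / math.log(M205 / M203)

def correct_pb(ratio_measured, beta):
    """Apply the exponential-law correction to a measured 208Pb/206Pb."""
    return ratio_measured * (M208 / M206) ** beta

beta = beta_from_tl(2.4050)          # instrument reads the Tl ratio too heavy
pb_corrected = correct_pb(2.1680, beta)
```

Because ICP-MS mass bias typically favours the heavier isotope, the measured Tl ratio exceeds the accepted value, beta comes out negative, and the corrected Pb ratio is pulled below the measured one.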


Quantitative evaluation of automated skull-stripping methods applied to contemporary and legacy images: Effects of diagnosis, bias correction, and slice location

HUMAN BRAIN MAPPING, Issue 2 2006
Christine Fennema-Notestine
Abstract Performance of automated methods to isolate brain from nonbrain tissues in magnetic resonance (MR) structural images may be influenced by MR signal inhomogeneities, type of MR image set, regional anatomy, and age and diagnosis of subjects studied. The present study compared the performance of four methods: Brain Extraction Tool (BET; Smith [2002]: Hum Brain Mapp 17:143–155); 3dIntracranial (Ward [1999] Milwaukee: Biophysics Research Institute, Medical College of Wisconsin; in AFNI); a Hybrid Watershed algorithm (HWA, Segonne et al. [2004] Neuroimage 22:1060–1075; in FreeSurfer); and Brain Surface Extractor (BSE, Sandor and Leahy [1997] IEEE Trans Med Imag 16:41–54; Shattuck et al. [2001] Neuroimage 13:856–876) to manually stripped images. The methods were applied to uncorrected and bias-corrected datasets; Legacy and Contemporary T1-weighted image sets; and four diagnostic groups (depressed, Alzheimer's, young and elderly control). To provide a criterion for outcome assessment, two experts manually stripped six sagittal sections for each dataset in locations where brain and nonbrain tissue are difficult to distinguish. Methods were compared on Jaccard similarity coefficients, Hausdorff distances, and an Expectation-Maximization algorithm. Methods tended to perform better on contemporary datasets; bias correction did not significantly improve method performance. Mesial sections were most difficult for all methods. Although AD image sets were most difficult to strip, HWA and BSE were more robust across diagnostic groups compared with 3dIntracranial and BET. With respect to specificity, BSE tended to perform best across all groups, whereas HWA was more sensitive than other methods. The results of this study may direct users towards a method appropriate to their T1-weighted datasets and improve the efficiency of processing for large, multisite neuroimaging studies. Hum. Brain Mapping, 2005. © 2005 Wiley-Liss, Inc. [source]
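The Jaccard similarity coefficient used to score the automated masks against the manual standard is straightforward to compute for binary masks; a minimal sketch with tiny invented masks:

```python
import numpy as np

def jaccard(mask_a, mask_b):
    """Jaccard similarity |A intersect B| / |A union B| for binary masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

# Invented 3x3 "manual" and "automated" brain masks
manual = np.array([[0, 1, 1], [0, 1, 1], [0, 0, 1]])
auto   = np.array([[0, 1, 1], [1, 1, 0], [0, 0, 1]])
score = jaccard(manual, auto)
```

Here 4 voxels are shared and 6 are covered by either mask, giving 4/6 ≈ 0.67; identical masks score 1.0.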


Forecasting high-frequency financial data with the ARFIMA–ARCH model

JOURNAL OF FORECASTING, Issue 7 2001
Michael A. Hauser
Abstract Financial data series are often described as exhibiting two non-standard time series features. First, variance often changes over time, with alternating phases of high and low volatility. Such behaviour is well captured by ARCH models. Second, long memory may cause a slower decay of the autocorrelation function than would be implied by ARMA models. Fractionally integrated models have been offered as explanations. Recently, the ARFIMA–ARCH model class has been suggested as a way of coping with both phenomena simultaneously. For estimation we implement the bias correction of Cox and Reid (1987). For daily data on the Swiss 1-month Euromarket interest rate during the period 1986–1989, the ARFIMA–ARCH (5,d,2/4) model with non-integer d is selected by AIC. Model-based out-of-sample forecasts for the mean are better than predictions based on conditionally homoscedastic white noise only for longer horizons (more than 40 steps ahead). Regarding volatility forecasts, however, the selected ARFIMA–ARCH models dominate. Copyright © 2001 John Wiley & Sons, Ltd. [source]
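The long-memory ("FI") component of an ARFIMA model is the fractional differencing operator (1 - L)^d, whose coefficients decay hyperbolically rather than geometrically as in ARMA models. A short sketch of the standard recursion for these weights (the value of d below is invented):

```python
def fracdiff_weights(d, n):
    """First n coefficients of (1 - L)^d expanded as sum_k w_k L^k,
    via the recursion w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

w = fracdiff_weights(0.3, 8)   # d = 0.3: slow, hyperbolic decay of weights
```

For d = 0 the operator is the identity and for d = 1 it is the ordinary first difference; a non-integer d in between produces the slowly decaying weights responsible for long memory.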


Global Daily Reference Evapotranspiration Modeling and Evaluation

JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 4 2008
G.B. Senay
Abstract: Accurate and reliable evapotranspiration (ET) datasets are crucial in regional water and energy balance studies. Due to the complex instrumentation requirements, actual ET values are generally estimated from reference ET values by adjustment factors using coefficients for water stress and vegetation conditions, commonly referred to as crop coefficients. Until recently, the modeling of reference ET has been based solely on weather variables collected from weather stations that are generally located in selected agro-climatic locations. Since 2001, the National Oceanic and Atmospheric Administration's Global Data Assimilation System (GDAS) has been producing six-hourly climate parameter datasets that are used to calculate daily reference ET for the whole globe at 1-degree spatial resolution. The U.S. Geological Survey Center for Earth Resources Observation and Science has been producing daily reference ET (ETo) since 2001, and it has been used in a variety of operational hydrological models for drought and streamflow monitoring all over the world. With the increasing availability of local station-based reference ET estimates, we evaluated the GDAS-based reference ET estimates using data from the California Irrigation Management Information System (CIMIS). Daily CIMIS reference ET estimates from 85 stations were compared with GDAS-based reference ET at different spatial and temporal scales using five years of daily data from 2002 through 2006. Despite the large difference in spatial scale (point vs. ~100 km grid cell) between the two datasets, the correlations between station-based ET and GDAS-ET were very high, exceeding 0.97 on a daily basis and 0.99 on time scales of more than 10 days.
Both the temporal and spatial correspondences in trend/pattern and magnitudes between the two datasets were satisfactory, suggesting the reliability of using GDAS parameter-based reference ET for regional water and energy balance studies in many parts of the world. While the study revealed the potential of GDAS ETo for large-scale hydrological applications, site-specific use of GDAS ETo in complex hydro-climatic regions such as coastal areas and rugged terrain may require the application of bias correction and/or disaggregation of the GDAS ETo using downscaling techniques. [source]
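The effect of temporal aggregation on the station-versus-grid correlation can be reproduced with a toy sketch: two noisy daily series sharing a seasonal cycle correlate modestly day by day and much more strongly after 10-day averaging. All series below are synthetic, not CIMIS or GDAS data:

```python
import numpy as np

rng = np.random.default_rng(2)
days = 365 * 5
t = np.arange(days)
seasonal = 3.0 + 2.0 * np.sin(2.0 * np.pi * t / 365.25)   # shared ET signal
station = seasonal + 0.6 * rng.standard_normal(days)      # point measurement
gdas = seasonal + 0.6 * rng.standard_normal(days)         # ~100 km grid cell

def corr_at_scale(a, b, window):
    """Correlation of two daily series after averaging into window-day bins."""
    n = len(a) // window * window
    am = a[:n].reshape(-1, window).mean(axis=1)
    bm = b[:n].reshape(-1, window).mean(axis=1)
    return float(np.corrcoef(am, bm)[0, 1])

daily_corr = corr_at_scale(station, gdas, 1)
tenday_corr = corr_at_scale(station, gdas, 10)
```

Averaging suppresses the independent noise in each series while the shared seasonal signal survives, so the correlation rises with the aggregation window, mirroring the 0.97-daily versus 0.99-ten-day result reported above.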


Probabilistic temperature forecast by using ground station measurements and ECMWF ensemble prediction system

METEOROLOGICAL APPLICATIONS, Issue 4 2004
P. Boi
The ECMWF Ensemble Prediction System (EPS) 2-metre temperature forecasts are affected by systematic errors due mainly to resolution inadequacies. Moreover, other error sources are present: differences in height above sea level between the station and the corresponding grid point, boundary-layer parameterisation, and the description of the land surface. These errors are more marked in regions of complex orography. A recursive statistical procedure to adapt ECMWF EPS 2-metre temperature fields to 58 meteorological stations on the Mediterranean island of Sardinia is presented. The correction has been made in three steps: (1) bias correction of systematic errors; (2) calibration to adapt the EPS temperature distribution to the station temperature distribution; and (3) doubling the ensemble size with the aim of taking the analysis errors into account. Two years of probabilistic forecasts of freezing are tested by Brier score, reliability diagram, rank histogram and Brier skill score with respect to the climatological forecast. The score analysis shows much better performance in comparison with the climatological forecast and direct model output, for all forecast times, even after the first step (bias correction). Further gains in skill are obtained by calibration and by doubling the ensemble size. Copyright © 2004 Royal Meteorological Society. [source]
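Step (1) of such a recursive procedure can be sketched as an exponentially weighted running mean of forecast-minus-observation departures, updated as each new pair arrives; the offset, noise levels and smoothing constant below are invented:

```python
import random

def update_bias(bias, forecast, observed, alpha=0.05):
    """Recursive bias estimate: exponentially weighted mean of (fcst - obs)."""
    return (1 - alpha) * bias + alpha * (forecast - observed)

# Hypothetical stream of forecast/observation pairs with a +1.5 K offset
random.seed(3)
bias = 0.0
for _ in range(500):
    obs = random.gauss(10.0, 2.0)
    fcst = obs + 1.5 + random.gauss(0.0, 1.0)
    bias = update_bias(bias, fcst, obs)

corrected_forecast = 25.0 - bias   # subtract the learned bias from a new forecast
```

The recursive form needs no stored history: the running estimate converges towards the true systematic error (+1.5 K here) while adapting if the model bias drifts.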


The drift factor in biased futures index pricing models: A new look

THE JOURNAL OF FUTURES MARKETS, Issue 6 2002
W. Brian Barrett
The presence of bias in index futures prices has been investigated in various research studies. Redfield (11) asserted that the U.S. Dollar Index (USDX) futures contract traded on the U.S. Cotton Exchange (now the FINEX division of the New York Board of Trade) could be systematically arbitraged for nontrivial returns because it is expressed in so-called "European terms" (foreign currency units/U.S. dollar). Eytan, Harpaz, and Krull (4) (EHK) developed a theoretical factor using Brownian motion to correct for the European terms and for the bias due to the USDX index being expressed as a geometric average. Harpaz, Krull, and Yagil (5) empirically tested the EHK index, using historical volatility to proxy the EHK volatility specification. Since 1990, it has become more commonplace to use option-implied volatility for forecasting future volatility. We therefore substituted option-implied volatilities into EHK's correction factor and hypothesized that the correction factor is "better" ex ante and should therefore lead to better futures model pricing. We tested this conjecture using twelve contracts from 1995 through 1997 and found that the use of implied volatility did not improve the bias correction over the use of historical volatility. Furthermore, no matter which volatility specification we used, the model futures price appeared to be misspecified. To investigate further, we added a simple naïve drift factor based on a modification of the adaptive expectations model. Repeating the tests with this naïve "drift" factor, we found that it performed substantially better than the other two specifications. Our conclusion is that there may be a need to take a new look at the drift-factor specification currently in use. © 2002 Wiley Periodicals, Inc. Jrl Fut Mark 22:579–598, 2002 [source]


An observing-system experiment with ground-based GPS zenith total delay data using HIRLAM 3D-Var in the absence of satellite data

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 650 2010
Reima Eresmaa
Abstract Ground-based receiver networks of the Global Positioning System (GPS) provide observations of atmospheric water vapour with a high temporal and horizontal resolution. Variational data assimilation allows researchers to make use of zenith total delay (ZTD) observations, which comprise the atmospheric effects on microwave signal propagation. An observing-system experiment (OSE) is performed to demonstrate the impact of GPS ZTD observations on the output of the High Resolution Limited Area Model (HIRLAM). The GPS ZTD observations for the OSE are provided by the EUMETNET GPS Water Vapour Programme, and they are assimilated using three-dimensional variational data assimilation (3D-Var). The OSE covers a five-week period during the late summer of 2008. In parallel with GPS ZTD data assimilation in the regular mode, the impact of a static bias-correction algorithm for the GPS ZTD data is also assessed. Assimilation of GPS ZTD data, without bias correction of any kind, results in a systematic increase in the forecast water-vapour content, temperature and tropospheric relative topography. A slightly positive impact is shown in terms of decreased forecast-error standard deviation of lower and middle tropospheric humidity and lower tropospheric geopotential height. Moreover, verification of categorical forecasts of 12 h accumulated precipitation shows a positive impact. The application of the static bias-correction scheme is positively verified in the case of the mean forecast error of lower tropospheric humidity and when relatively high precipitation accumulations are considered. Copyright © 2010 Royal Meteorological Society [source]
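A static bias correction of the kind assessed here can be sketched as a per-station lookup of mean observation-minus-background (O-B) departures over a training period; the station names, bias magnitudes and noise levels below are invented:

```python
import numpy as np

# Hypothetical per-station true biases (mm of ZTD) and O-B training samples
rng = np.random.default_rng(4)
true_bias = {f"ST{i:02d}": 5.0 * rng.standard_normal() for i in range(5)}
training = {name: b + 2.0 * rng.standard_normal(200)
            for name, b in true_bias.items()}

# Static correction table: each station's mean O-B departure
bias_table = {name: float(np.mean(dep)) for name, dep in training.items()}

def correct_ztd(name, ztd_obs):
    """Remove the station's static bias from a new ZTD observation."""
    return ztd_obs - bias_table[name]
```

With 200 training departures per station the table recovers each systematic offset to well within the observation noise, which is the sense in which a "static" scheme can remove the systematic moistening reported above.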


Assimilation of satellite-derived soil moisture from ASCAT in a limited-area NWP model

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 648 2010
Jean-François Mahfouf
Abstract A simplified Extended Kalman Filter is developed for the assimilation of satellite-derived surface soil moisture from the Advanced Scatterometer (ASCAT) instrument (on board the polar-orbiting satellite METOP) in a limited-area NWP model in which soil water vertical transfers are described by a force–restore method. An analytic formulation of the land surface scheme Jacobians is derived to simplify the coupling between the land surface and atmospheric data assimilation systems. Various steps necessary before the assimilation of ASCAT products are defined: projection of satellite data on the model grid, screening based on various criteria, bias correction using a CDF-matching technique, and specification of model and observation errors. Three-dimensional variational data assimilation experiments are then performed during a four-week period in May 2009 over western Europe. A control assimilation is also run in which the soil moisture evolves freely. Forecasts from these analyses show that the assimilation of ASCAT data slightly reduces the daytime low-level relative humidity positive bias of the control run. Forecast skill scores with respect to other variables are rather neutral. A comparison of the control run with the operational system, in which soil moisture is corrected from short-range forecast errors of screen-level observations, shows similar but more pronounced improvements. These differences come from the fact that the number of screen-level observations from the surface network over Europe is significantly larger than the number provided by a polar-orbiting satellite. These results are consistent with those obtained at ECMWF using soil moisture products derived from other satellite instruments (the X-band radiometer TMI and the C-band scatterometer ERS). Several avenues for improving this preliminary methodology are proposed. Copyright © 2010 Royal Meteorological Society [source]
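The CDF-matching bias-correction step can be sketched as quantile mapping: each retrieval is replaced by the model-climatology value at the same cumulative rank, so the corrected retrievals inherit the model's distribution. The distribution parameters below are invented:

```python
import numpy as np

def cdf_match(values, ref):
    """Map each value onto the reference distribution by matching quantiles."""
    values = np.asarray(values)
    ranks = np.searchsorted(np.sort(values), values, side="right") / len(values)
    return np.quantile(np.sort(ref), np.clip(ranks, 0.0, 1.0))

rng = np.random.default_rng(5)
model = rng.normal(0.25, 0.05, 1000)       # model soil-moisture climatology
satellite = rng.normal(0.35, 0.10, 1000)   # biased, over-dispersed retrievals

matched = cdf_match(satellite, model)
```

After matching, the retrievals carry the model's mean and spread while preserving their own ranking in time, which is exactly what is wanted before feeding them to the filter.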


4D-Var assimilation of MERIS total column water-vapour retrievals over land

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 644 2009
Peter Bauer
Abstract Experiments with the active assimilation of total column water-vapour retrievals from Envisat MERIS observations have been performed at the European Centre for Medium-Range Weather Forecasts (ECMWF), focusing on the summer 2006 African Monsoon Multidisciplinary Analysis (AMMA) field campaign period. A mechanism for data quality control, observation error definition and variational bias correction has been developed so that the data can be safely treated within 4D-Var, like other observations that are currently assimilated in the operational system. While data density is limited due to the restriction to daylight and cloud-free conditions, a systematic impact on mean moisture analysis was found, with distinct regional and seasonal features. The impact can last 1–2 days into the forecast but has little effect on forecast accuracy in terms of both moisture and dynamics. This is mainly explained by the weak dynamic activity in the areas of largest data impact. Analysis and short-range forecast evaluation with radiosonde observations revealed a strong dependence on radiosonde type. Compared with Vaisala RS92 observations, the addition of MERIS total column water-vapour observations produced neutral to positive impact, while contradictory results were obtained when all radiosonde types were used in generating the statistics. This highlights the issue of radiosonde moisture biases and the importance of sonde humidity bias correction in numerical weather prediction (NWP). Copyright © 2009 Royal Meteorological Society [source]


Medium-range multimodel ensemble combination and calibration

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 640 2009
Christine Johnson
Abstract As part of its contribution to The Observing System Research and Predictability Experiment (THORPEX), the Met Office has developed a global, 15 day multimodel ensemble. The multimodel ensemble combines ensembles from the European Centre for Medium-Range Weather Forecasts (ECMWF), Met Office and National Centers for Environmental Prediction (NCEP) and is calibrated to give further improvements. The ensemble post-processing includes bias correction, model-dependent weights and variance adjustment, all of which are based on linear-filter estimates using past forecast-verification pairs, calculated separately for each grid point and forecast lead time. Verification shows that the multimodel ensemble gives an improvement in comparison with a calibrated single-model ensemble, particularly for surface temperature. However, the benefits are smaller for mean-sea-level pressure (mslp) and 500 hPa height. This is attributed to the higher degree of forecast-error similarity between the component ensembles for mslp and 500 hPa height than for temperature. The results also show only small improvements from the use of the model-dependent weights and the variance adjustment. This is because the component ensembles have similar levels of skill, and the multimodel ensemble variance is already generally well calibrated. In conclusion, we demonstrate that the multimodel ensemble does give benefit over a single-model ensemble. However, as expected, the benefits are small if the ensembles are similar to each other and further post-processing gives only relatively small improvements. © Crown Copyright 2009. Reproduced with the permission of HMSO. Published by John Wiley & Sons Ltd. [source]
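Two of the combination ingredients, bias correction from past forecast-verification pairs and model-dependent weights, can be sketched for a single grid point and lead time. The ensemble sizes, biases and weights below are invented, and the variance adjustment is omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(6)
truth = 15.0

# Hypothetical component ensembles (different sizes, biases and spreads)
ens = {
    "ECMWF": truth + 0.5 + 1.0 * rng.standard_normal(50),
    "MetOffice": truth - 1.0 + 1.2 * rng.standard_normal(24),
    "NCEP": truth + 2.0 + 1.5 * rng.standard_normal(20),
}
past_bias = {"ECMWF": 0.5, "MetOffice": -1.0, "NCEP": 2.0}  # from past pairs
weights = {"ECMWF": 0.5, "MetOffice": 0.3, "NCEP": 0.2}     # skill-based

# Bias-correct each component, then form the weighted multimodel mean
corrected = {m: e - past_bias[m] for m, e in ens.items()}
multimodel_mean = sum(weights[m] * corrected[m].mean() for m in ens)
```

When the component ensembles have similar skill, as the abstract notes, the weights add little over an equal-weight pool; the bias correction is what removes the systematic offsets between centres.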


Assimilation of Meteosat radiance data within the 4D-Var system at ECMWF: Assimilation experiments and forecast impact

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 601 2004
Christina Köpken
Abstract The direct assimilation of water vapour (WV) clear-sky radiances (CSRs) from geostationary satellites within the ECMWF four-dimensional variational assimilation (4D-Var) system became operational on 9 April 2002 with the assimilation of radiances from Meteosat-7. To extend the coverage provided by geostationary radiances, the derivation of a similar CSR product from the Geostationary Operational Environmental Satellites GOES-W and GOES-E was initiated, and since 14 January 2003 these data have been operationally assimilated as well. This paper discusses results from the pre-operational impact experiments using Meteosat-7 and the subsequent operational implementation of the WV radiance assimilation. The pre-operational data monitoring of the CSRs shows contamination of certain time slots caused by intruding solar stray light and a certain degree of cloud influence present in the CSRs. Data quality control is introduced to exclude affected data. When assimilated, the Meteosat WV CSRs correct the upper-tropospheric humidity field in areas of known model problems. While the analysis draws well to the Meteosat data, the fit to other conventional observations does not degrade, and the fit to other satellite observations is noticeably improved. This is visible in statistics for the assimilated HIRS-12 as well as for passive Advanced Microwave Sounding Unit B (AMSU-B) radiances, both in the pre-operational experiments and in the operational assimilation cycle. The impact on forecast quality is slightly positive to neutral for different areas of the globe. In some experiments a positive impact on upper-level wind fields (around 200 hPa) is seen, especially in the tropics. A relatively large sensitivity of the mean increments, and also of the forecast scores, to the bias correction is noted. Copyright © 2004 Royal Meteorological Society [source]


A Semiparametric Estimate of Treatment Effects with Censored Data

BIOMETRICS, Issue 3 2001
Ronghui Xu
Summary. A semiparametric estimate of an average regression effect with right-censored failure time data has recently been proposed under the Cox-type model where the regression effect β(t) is allowed to vary with time. In this article, we derive a simple algebraic relationship between this average regression effect and a measurement of group differences in K-sample transformation models when the random error belongs to the Gρ family of Harrington and Fleming (1982, Biometrika 69, 553–566), the latter being equivalent to the conditional regression effect in a gamma frailty model. The models considered here are suitable for the attenuating hazard ratios that often arise in practice. The results reveal an interesting connection among the above three classes of models as alternatives to the proportional hazards assumption and add to our understanding of the behavior of the partial likelihood estimate under nonproportional hazards. The algebraic relationship provides a simple estimator under the transformation model. We develop a variance estimator based on the empirical influence function that is much easier to compute than the previously suggested resampling methods. When there is truncation in the right tail of the failure times, we propose a method of bias correction to improve the coverage properties of the confidence intervals. The estimate, its estimated variance, and the bias correction term can all be calculated with minor modifications to standard software for proportional hazards regression. [source]


Regional Climate Models for Hydrological Impact Studies at the Catchment Scale: A Review of Recent Modeling Strategies

GEOGRAPHY COMPASS (ELECTRONIC), Issue 7 2010
Claudia Teutschbein
This article reviews recent applications of regional climate model (RCM) output for hydrological impact studies. Traditionally, simulations of global climate models (GCMs) have been the basis of impact studies in hydrology. Progress in regional climate modeling has recently made the use of RCM data more attractive, although the application of RCM simulations is challenging due to often considerable biases. The main modeling strategies used in recent studies can be classified into (i) simply constructed modeling chains with a single RCM (S-RCM approach) and (ii) highly complex and computing-power-intensive model systems based on RCM ensembles (E-RCM approach). Many examples of the S-RCM approach can be found in the literature, while comprehensive E-RCM studies that consider several sources of uncertainty, such as different greenhouse gas emission scenarios, GCMs, RCMs and hydrological models, are less common. Based on a case study using control-run simulations of fourteen different RCMs for five Swedish catchments, the biases of and the variability between different RCMs are demonstrated. We provide a short overview of possible bias-correction methods and show that inter-RCM variability also has substantial consequences for hydrological impact studies in addition to other sources of uncertainty in the modeling chain. We propose that, because of model bias and inter-model variability, the S-RCM approach is not advisable and that ensembles of RCM simulations (E-RCM) should be used. The application of bias-correction methods is recommended, although one should also be aware that the need for bias corrections adds significantly to the uncertainties in modeling climate change impacts. [source]
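One simple and widely used bias-correction method of the kind surveyed here is linear scaling against control-period observations: additive shifts for temperature and multiplicative factors for precipitation. A hedged sketch with invented numbers (real applications usually work month by month):

```python
import numpy as np

def scale_additive(rcm_control, observed, rcm_scenario):
    """Shift the scenario series by the control-period mean bias (temperature)."""
    return np.asarray(rcm_scenario) - (np.mean(rcm_control) - np.mean(observed))

def scale_multiplicative(rcm_control, observed, rcm_scenario):
    """Rescale the scenario series by the mean ratio (precipitation)."""
    return np.asarray(rcm_scenario) * (np.mean(observed) / np.mean(rcm_control))

obs_t = np.array([9.0, 10.0, 11.0])     # observed control-period temperatures
rcm_t = np.array([11.0, 12.0, 13.0])    # RCM control run: +2 K warm bias
future_t = np.array([14.0, 15.0])       # RCM scenario values
corrected_t = scale_additive(rcm_t, obs_t, future_t)   # removes the +2 K bias
```

Linear scaling corrects only the mean; quantile-mapping methods additionally adjust variability and extremes, which is one reason the choice of method itself contributes to the uncertainty discussed above.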


Development of Cu and Zn Isotope MC-ICP-MS Measurements: Application to Suspended Particulate Matter and Sediments from the Scheldt Estuary

GEOSTANDARDS & GEOANALYTICAL RESEARCH, Issue 2 2008
Jérôme C.J. Petit
The present study evaluates several critical issues related to the precision and accuracy of Cu and Zn isotopic measurements, with application to estuarine particulate materials. Calibration of reference materials (such as the IRMM 3702 Zn) against the JMC Zn and NIST Cu reference materials was performed in wet and/or dry plasma modes (Aridus I and DSN-100) on a Nu Plasma MC-ICP-MS. Different mass bias correction methods were compared. More than 100 analyses of certified reference materials suggested that the sample-calibrator bracketing correction and the empirical external normalisation methods provide the most reliable corrections, with long-term external precisions of 0.06 and 0.07‰ (2SD), respectively. Investigation of the effect of variable analyte-to-spike concentration ratios on Zn and Cu isotopic determinations indicated that the accuracy of Cu measurements in dry plasma is very sensitive to the relative Cu and Zn concentrations, with deviations of δ65Cu from −0.4‰ (Cu/Zn = 4) to +0.4‰ (Cu/Zn = 0.2). A quantitative assessment (with instrumental mass bias corrections) of spectral and non-spectral interferences (Ti, Cr, Co, Fe, Ca, Mg, Na) was performed. Titanium and Cr were the most severe interfering constituents, contributing to inaccuracies of −5.1‰ and +0.60‰ on δ68/64Zn, respectively (for 500 µg l−1 Cu and Zn standard solutions spiked with 1000 µg l−1 of Ti or Cr). Preliminary isotopic results were obtained on contrasting sediment matrices from the Scheldt estuary. Significant isotopic fractionation of zinc (from 0.21‰ to 1.13‰ for δ66Zn) and copper (from −0.38‰ to +0.23‰ for δ65Cu) suggests a control by physical mixing of continental and marine water masses characterized by distinct Cu and Zn isotopic signatures. These results provide a stepping-stone to further evaluate the use of Cu and Zn isotopes as biogeochemical tracers in estuarine environments. [source]
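The sample-calibrator bracketing correction reduces, at its core, to referencing each sample ratio to the standards measured immediately before and after it, with the result quoted as a delta value in per mil. A minimal sketch with invented 66Zn/64Zn ratios:

```python
def delta_permil(ratio_sample, ratio_std_before, ratio_std_after):
    """Delta value (per mil) against the mean of the bracketing standards,
    which cancels slowly drifting instrumental mass bias to first order."""
    ref = 0.5 * (ratio_std_before + ratio_std_after)
    return (ratio_sample / ref - 1.0) * 1000.0

# Invented measured ratios: a sample bracketed by two standard runs
d66zn = delta_permil(0.56652, 0.56630, 0.56640)
```

Because sample and standards share nearly the same instrumental bias at nearly the same time, the ratio-of-ratios removes the bias without needing its absolute magnitude; this simple sample here comes out a few tenths of a per mil heavy relative to the standard.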


Toward a consistent reanalysis of the upper stratosphere based on radiance measurements from SSU and AMSU-A

THE QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, Issue 645 2009
Shinya Kobayashi
Abstract Radiance measurements from the Stratospheric Sounding Unit (SSU) and the Advanced Microwave Sounding Unit (AMSU-A) are the primary source of information for stratospheric temperature in reanalyses of the satellite era. To improve the time consistency of the reanalyses, radiance biases need to be properly understood and accounted for in the assimilation system. The investigation of intersatellite differences between SSU and AMSU-A radiance observations shows that these differences are not accurately reproduced by the operational version of the radiative transfer model for the TIROS Operational Vertical Sounder (RTTOV-8). We found that this deficiency in RTTOV was mainly due to the treatment of the Zeeman effect (splitting of the oxygen absorption lines at 60 GHz) and to changes in the spectral response function of the SSU instrument that are not represented in RTTOV. On this basis we present a revised version of RTTOV that can reproduce SSU and AMSU-A intersatellite radiance differences more accurately. Assimilation experiments performed with the revised version of RTTOV in a four-dimensional variational analysis system (4D-Var) show some improvements in the stratospheric temperature analysis. However, significant jumps in the stratospheric temperature analysis still occur when switching satellites, which is due to the fact that systematic errors in the forecast model are only partially constrained by observations. Using a one-dimensional retrieval equation, we show that both the extent and vertical structure of the partial bias corrections must inevitably change when the nature of the radiance measurement changes with the transition from SSU to AMSU-A. Copyright © 2009 Royal Meteorological Society [source]