Estimating

[Charts not reproduced: distribution by scientific domains; distribution within Mathematics and Statistics]

Terms modified by Estimating

  • estimating abundance
  • estimating age
  • estimating equation
  • estimating equation approach
  • estimating equation models
  • estimating function
  • estimating models
  • estimating parameter
  • estimating population size
  • estimating procedure
  • estimating risk
  • estimating treatment effects

Selected Abstracts


    ESTIMATING A DOSE-RESPONSE RELATIONSHIP BETWEEN LENGTH OF STAY AND FUTURE RECIDIVISM IN SERIOUS JUVENILE OFFENDERS

    CRIMINOLOGY, Issue 3 2009
    THOMAS A. LOUGHRAN
    The effect of sanctions on subsequent criminal activity is of central theoretical importance in criminology. A key question for juvenile justice policy is the degree to which serious juvenile offenders respond to sanctions and/or treatment administered by the juvenile court. The policy question germane to this debate is finding the level of confinement within the juvenile justice system that maximizes the public safety and therapeutic benefits of institutional confinement. Unfortunately, research on this issue has been limited with regard to serious juvenile offenders. We use longitudinal data from a large sample of serious juvenile offenders from two large cities to 1) estimate a causal treatment effect of institutional placement, as opposed to probation, on the future rate of rearrest and 2) investigate the existence of a marginal effect (i.e., benefit) of longer length of stay once the institutional placement decision had been made. We accomplish the latter by determining a dose-response relationship between length of stay and future rates of rearrest and self-reported offending. The results suggest an overall null effect of placement on future rates of rearrest and self-reported offending for serious juvenile offenders. We also find that, for the group placed out of the community, little or no marginal benefit accrues to longer lengths of stay. Theoretical, empirical, and policy issues are outlined. [source]


    ESTIMATING THE USES OF CURRENCY IN AUSTRALIA

    ECONOMIC PAPERS: A JOURNAL OF APPLIED ECONOMICS AND POLICY, Issue 3 2002
    CHRISTOPHER BAJADA
    First page of article [source]


    ESTIMATING A GEOGRAPHICALLY EXPLICIT MODEL OF POPULATION DIVERGENCE

    EVOLUTION, Issue 3 2007
    L. Lacey Knowles
    Patterns of genetic variation can provide valuable insights for deciphering the relative roles of different evolutionary processes in species differentiation. However, population-genetic models for studying divergence in geographically structured species are generally lacking. Because these are the biogeographic settings where genetic drift is expected to predominate, not only are population-genetic tests of hypotheses in geographically structured species constrained, but generalizations about the evolutionary processes that promote species divergence may also be biased. Here we estimate a population-divergence model in montane grasshoppers from the sky islands of the Rocky Mountains. Because this region was directly impacted by Pleistocene glaciation, both the displacement into glacial refugia and the recolonization of montane habitats may contribute to differentiation. Building on the tradition of using information from the genealogical relationships of alleles to infer the geography of divergence, the additional consideration of the process of gene-lineage sorting is used here to obtain a quantitative estimate of population relationships and historical associations (i.e., a population tree) from the gene trees of five anonymous nuclear loci and one mitochondrial locus in the broadly distributed species Melanoplus oregonensis. Three different approaches are used to estimate a model of population divergence; this comparison allows us to evaluate specific methodological assumptions that influence the estimated history of divergence. Based on per-site likelihood scores of the multiple loci, a model of population divergence was identified that fits the data significantly better than the other approaches and that provides clues about how divergence proceeded in M. oregonensis during the dynamic Pleistocene. Unlike the approaches that either considered only the most recent coalescence (i.e., information from a single individual per population) or did not consider the pattern of coalescence in the gene genealogies, the population-divergence model that best fits the data was estimated by considering the pattern of gene-lineage coalescence across multiple individuals, as well as loci. These results indicate that sampling of multiple individuals per population is critical to obtaining an accurate estimate of the history of divergence, so that the signal of common ancestry can be separated from the confounding influence of gene flow, even though estimates suggest that gene flow is not a predominant factor structuring patterns of genetic variation across these sky island populations. They also suggest that the gene genealogies contain information about population relationships, despite the lack of complete sorting of gene lineages. What emerges from the analyses is a model of population divergence that incorporates both contemporary distributions and historical associations, and shows a latitudinal and regional structuring of populations reminiscent of population displacements into multiple glacial refugia. Because the population-divergence model itself is built upon the specific events shaping the history of M. oregonensis, it provides a framework for estimating additional population-genetic parameters relevant to understanding the processes governing differentiation in geographically structured species, and avoids the problems of relying on overly simplified and inaccurate divergence models. The utility of these approaches for estimating population relationships and historical associations relevant to genetic analyses of geographically structured species, as well as the caveats and future improvements, is discussed. [source]


    ESTIMATING THE GENERAL EQUILIBRIUM BENEFITS OF LARGE CHANGES IN SPATIALLY DELINEATED PUBLIC GOODS

    INTERNATIONAL ECONOMIC REVIEW, Issue 4 2004
    Holger Sieg
    The purpose of this article is to report a new approach for measuring the general equilibrium willingness to pay for large changes in spatially delineated public goods such as air quality. We estimate the parameters of a locational equilibrium model and compute equilibria for alternative scenarios characterizing the availability of public goods within a system of communities. Welfare measures take into consideration the adjustments of households in equilibrium to nonmarginal changes in public goods. The framework is used to analyze willingness to pay for reductions in ozone concentrations in Southern California between 1990 and 1995. [source]


    ESTIMATING THE TAX BENEFITS OF DEBT

    JOURNAL OF APPLIED CORPORATE FINANCE, Issue 1 2001
    John Graham
    The standard approach to valuing interest tax shields assumes that full tax benefits are realized on every dollar of interest deduction in every scenario. The approach presented in this paper takes account of the possibility that interest tax shields cannot be used in some scenarios, in part because of variations in the firm's profitability. Because of the dynamic nature of the tax code (e.g., tax-loss carrybacks and carryforwards), it is necessary to consider past and future taxable income when estimating today's effective marginal tax rate. The paper uses a series of numerical examples to show that (1) the incremental value of an extra dollar of interest deduction is equal to the marginal tax rate appropriate for that dollar; and (2) a firm's effective marginal tax rate (and therefore the marginal benefit of incremental interest deductions) can actually decline as the firm takes on additional debt. Based on marginal benefit functions for thousands of firms from 1980 to 1999, the author concludes that the tax benefits of debt averaged approximately 10% of firm value during the 1980s, while declining to around 8% in the 1990s. By taking maximum advantage of the interest tax shield, the average firm could have increased its value by approximately 15% over the 1980s and 1990s, suggesting that the consequences of being underlevered are significant. Surprisingly, many of the companies that appear best able to service debt (i.e., those with the lowest apparent costs of debt) use the least amount of debt, on average. Treasurers and CFOs should critically reevaluate their companies' debt policies and consider the benefits of additional leverage, even if taking on more debt causes their credit ratings to slip a notch. [source]
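The abstract's central device, a marginal benefit function whose integral up to the firm's interest expense gives the tax benefit of debt, can be illustrated numerically. A minimal sketch, with an invented declining marginal-tax-rate schedule (not Graham's estimates):

```python
import numpy as np

def tax_benefit_of_debt(interest_grid, marginal_tax_rate, interest_expense):
    """Integrate the marginal benefit curve from 0 up to the chosen interest level."""
    mask = interest_grid <= interest_expense
    x, y = interest_grid[mask], marginal_tax_rate[mask]
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))  # trapezoid rule

# Hypothetical firm: the effective marginal tax rate declines as deductions grow,
# because extra debt raises the chance that shields go unused in some states.
interest = np.linspace(0.0, 100.0, 1001)             # interest deductions ($M)
mtr = 0.35 / (1.0 + np.exp(0.08 * (interest - 60)))  # smoothly declining MTR

benefit = tax_benefit_of_debt(interest, mtr, interest_expense=40.0)
print(f"tax benefit of the first $40M of interest: ${benefit:.1f}M")
```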


    AN EVALUATION OF THE AVAILABLE TECHNIQUES FOR ESTIMATING MISSING FECAL COLIFORM DATA

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 6 2004
    Ashu Jain
    ABSTRACT: This paper presents the findings of a study aimed at evaluating the available techniques for estimating missing fecal coliform (FC) data on a temporal basis. The techniques investigated include linear and nonlinear regression analysis, interpolation functions, and artificial neural networks (ANNs). In all, seven interpolation, two regression, and one ANN model structures were investigated. This paper also tests the hypothesis that missing FC data can be estimated more accurately by developing separate models for subsets of the data corresponding to the different dynamics and trends in the FC record. The FC data (counts/100 ml) derived from the North Fork of the Kentucky River in Kentucky were employed to calibrate and validate the various models. The performance of the models was evaluated using a wide variety of standard statistical measures. The results demonstrate that ANNs are preferable to the conventional techniques for estimating missing FC data in a watershed. The regression technique was not found suitable for estimating missing FC data on a temporal basis. Further, better model performance can be achieved by first decomposing the whole data set into categories corresponding to different dynamics and then developing separate models for each category, rather than developing a single model for the composite data set. [source]


    ESTIMATING THE VALUE OF DELIVERY OPTIONS IN FUTURES CONTRACTS

    THE JOURNAL OF FINANCIAL RESEARCH, Issue 3 2005
    Jana Hranaiova
    Abstract We analyze the effect various delivery options embedded in commodity futures contracts have on the futures price. The two embedded options considered are the timing and location options. We show that early delivery is always optimal when only a timing option is present, but not so when joint options are present. The estimates of the combined options are much smaller than the comparable estimates for the timing option alone. The average value of the joint option is about 5% of the average basis on the first day of the maturity month. This suggests that joint options can increase deliverable supplies while potentially having only a small effect on basis behavior. [source]


    Corrigendum: AN EVALUATION OF NON-ITERATIVE METHODS FOR ESTIMATING THE LINEAR-BY-LINEAR PARAMETER OF ORDINAL LOG-LINEAR MODELS

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 2 2010
    Eric J. Beh
    No abstract is available for this article. [source]


    ESTIMATING A PARAMETER WHEN IT IS KNOWN THAT THE PARAMETER EXCEEDS A GIVEN VALUE

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 4 2009
    Ian R. Gordon
    Summary In some statistical problems a degree of explicit, prior information is available about the value taken by the parameter of interest, θ say, although the information is much less than would be needed to place a prior density on the parameter's distribution. Often the prior information takes the form of a simple bound, 'θ > θ1' or 'θ < θ1', where θ1 is determined by physical considerations or mathematical theory, such as positivity of a variance. A conventional approach to accommodating the requirement that θ > θ1 is to replace an estimator θ̂ of θ by the maximum of θ̂ and θ1. However, this technique is generally inadequate. For one thing, it does not respect the strictness of the inequality θ > θ1, which can be critical in interpreting results. For another, it produces an estimator that does not respond in a natural way to perturbations of the data. In this paper we suggest an alternative approach, in which bootstrap aggregation, or bagging, is used to overcome these difficulties. Bagging gives estimators that, when subjected to the constraint θ > θ1, strictly exceed θ1 except in extreme settings in which the empirical evidence strongly contradicts the constraint. Bagging also reduces estimator variability in the important case for which θ̂ is close to θ1, and more generally produces estimators that respect the constraint in a smooth, realistic fashion. [source]
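The bagging remedy described above is easy to sketch: instead of thresholding the point estimate once, average the constrained estimator max(θ̂*, θ1) over bootstrap resamples. A minimal illustration with invented data and θ1 = 0, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def bagged_constrained_mean(x, theta1=0.0, n_boot=2000):
    """Average max(bootstrap mean, theta1) over resamples of x."""
    n = len(x)
    boots = rng.choice(x, size=(n_boot, n), replace=True).mean(axis=1)
    return np.maximum(boots, theta1).mean()

x = rng.normal(loc=0.05, scale=1.0, size=50)  # true mean slightly above the bound 0
naive = max(x.mean(), 0.0)                    # hard-thresholded estimator
bagged = bagged_constrained_mean(x, theta1=0.0)
print(f"naive max(theta_hat, 0): {naive:.4f}")
print(f"bagged estimator:        {bagged:.4f}")
```

The bagged value strictly exceeds the bound unless essentially every resample contradicts the constraint, and it moves smoothly as the data are perturbed.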


    AN EVALUATION OF NON-ITERATIVE METHODS FOR ESTIMATING THE LINEAR-BY-LINEAR PARAMETER OF ORDINAL LOG-LINEAR MODELS

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 3 2009
    Eric J. Beh
    Summary Parameter estimation for association and log-linear models is an important aspect of the analysis of cross-classified categorical data. Classically, iterative procedures, including Newton's method and iterative scaling, have typically been used to calculate the maximum likelihood estimates of these parameters. An important special case occurs when the categorical variables are ordinal, and this has received a considerable amount of attention for more than 20 years. This is because models for such cases involve the estimation of a parameter that quantifies the linear-by-linear association and is directly linked with the natural logarithm of the common odds ratio. The past five years have seen the development of non-iterative procedures for estimating the linear-by-linear parameter for ordinal log-linear models. Such procedures have been shown to lead to numerically equivalent estimates when compared with iterative, maximum likelihood estimates. Such procedures also enable the researcher to avoid some of the computational difficulties that commonly arise with iterative algorithms. This paper investigates and evaluates the performance of three non-iterative procedures for estimating this parameter by considering 14 contingency tables that have appeared in the statistical and allied literature. The estimation of the standard error of the association parameter is also considered. [source]
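For intuition, the sketch below shows one simple closed-form (non-iterative) estimate of the linear-by-linear parameter: under the uniform-association model with unit-spaced scores, every local log odds ratio equals the parameter, so their average is an estimator. This moment-style version is illustrative and not necessarily among the three procedures evaluated in the paper; the table is hypothetical.

```python
import numpy as np

def loglinear_phi_noniterative(table):
    """Average the log local odds ratios of adjacent 2x2 subtables.

    Under the uniform-association (linear-by-linear) model with unit-spaced
    scores, each local log odds ratio equals the association parameter phi.
    Assumes strictly positive cell counts.
    """
    t = np.asarray(table, dtype=float)
    lors = []
    for i in range(t.shape[0] - 1):
        for j in range(t.shape[1] - 1):
            a, b = t[i, j], t[i, j + 1]
            c, d = t[i + 1, j], t[i + 1, j + 1]
            lors.append(np.log(a * d / (b * c)))
    return float(np.mean(lors))

# Hypothetical 3x3 contingency table with positive ordinal association.
table = [[40, 25, 10],
         [20, 30, 20],
         [10, 25, 40]]
print(f"non-iterative phi estimate: {loglinear_phi_noniterative(table):.3f}")
```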


    ESTIMATING THE FALSE NEGATIVE FRACTION FOR A MULTIPLE SCREENING TEST FOR BOWEL CANCER WHEN NEGATIVES ARE NOT VERIFIED

    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, Issue 4 2004
    Chris J. Lloyd
    Summary This paper aims to estimate the false negative fraction of a multiple screening test for bowel cancer, where those who give negative results for six consecutive tests do not have their true disease status verified. A subset of these same individuals is given a further screening test, for the sole purpose of evaluating the accuracy of the primary test. This paper proposes a beta heterogeneity model for the probability of a diseased individual 'testing positive' on any single test, and it examines the consequences of this model for inference on the false negative fraction. The method can be generalized to the case where selection for further testing is informative, though this did not appear to be the case for the bowel-cancer data. [source]
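Under a beta heterogeneity model of the kind described, the false negative fraction after n independent tests has a closed form: if per-test sensitivity p ~ Beta(a, b), then P(all n tests negative | diseased) = E[(1 − p)^n] = B(a, b + n)/B(a, b). A minimal sketch with invented parameters, contrasted with a homogeneous-sensitivity calculation:

```python
import numpy as np
from scipy.special import betaln

def false_negative_fraction(a, b, n_tests=6):
    """P(all n tests negative | diseased) when per-test sensitivity p ~ Beta(a, b).

    E[(1 - p)^n] = B(a, b + n) / B(a, b), computed on the log scale for stability.
    """
    return float(np.exp(betaln(a, b + n_tests) - betaln(a, b)))

# Hypothetical heterogeneity: mean sensitivity 0.6 but substantial spread.
a, b = 3.0, 2.0
print(f"beta model FNF after 6 tests:  {false_negative_fraction(a, b):.4f}")
print(f"homogeneous (1 - 0.6)^6:       {(1 - 0.6) ** 6:.4f}")
```

The comparison shows why heterogeneity matters: hard-to-detect cases keep failing the test, so the true false negative fraction is far larger than a constant-sensitivity calculation suggests.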


    ESTIMATING THE RELATIONSHIP BETWEEN IMMIGRANT AND NATIVE WORKERS IN AUSTRALIA: A PRODUCTION THEORY APPROACH

    AUSTRALIAN ECONOMIC PAPERS, Issue 1 2010
    JAAI PARASNIS
    The impact of immigration on labour markets depends, among other factors, on the substitutability or complementarity between immigrants and natives. This relationship is examined by treating migrant and native labour, along with capital, as inputs in the production process. Estimated price elasticities of substitution between immigrant and native labour suggest that, in the Australian context, an increase in the wage rate of one group of workers leads to an increased demand for the other. The estimated elasticities of substitution between immigrant and native workers and the complementary relationship between immigrants and capital provide insight into the complex effects of immigration. [source]


    How accurately can parameters from exponential models be estimated?

    CONCEPTS IN MAGNETIC RESONANCE, Issue 2 2005
    A Bayesian view
    Abstract Estimating the amplitudes and decay rate constants of exponentially decaying signals is an important problem in NMR. Understanding how the uncertainty in the parameter estimates depends on the data acquisition parameters and on the "true" but unknown values of the exponential signal parameters is an important step in designing experiments and determining the amount and quality of the data that must be gathered to make good parameter estimates. In this article, Bayesian probability theory is applied to this problem. Explicit relationships between the data acquisition parameters and the "true" but unknown exponential signal parameters are derived for the cases of data containing one and two exponential signal components. Because uniform prior probabilities are purposely employed, the results are broadly applicable to experimental parameter estimation. © 2005 Wiley Periodicals, Inc. Concepts Magn Reson Part A 27A: 73–83, 2005 [source]
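The question posed above, how estimate uncertainty depends on the acquisition parameters, can be explored numerically. The sketch below uses nonlinear least squares rather than the article's Bayesian treatment, with invented acquisition settings, to fit a one-component decay and report parameter uncertainties:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

def model(t, amplitude, rate):
    """Single-component exponential decay."""
    return amplitude * np.exp(-rate * t)

# Hypothetical acquisition: 64 samples over a 3 s window, Gaussian noise.
t = np.linspace(0.0, 3.0, 64)
true_amp, true_rate, sigma = 1.0, 1.5, 0.05
y = model(t, true_amp, true_rate) + rng.normal(0.0, sigma, t.size)

popt, pcov = curve_fit(model, t, y, p0=[0.5, 1.0])
perr = np.sqrt(np.diag(pcov))  # standard errors from the fit covariance
print(f"amplitude = {popt[0]:.3f} +/- {perr[0]:.3f}")
print(f"rate      = {popt[1]:.3f} +/- {perr[1]:.3f}")
```

Rerunning with fewer samples, a shorter window, or larger sigma shows directly how the acquisition choices inflate the reported uncertainties.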


    Significance of Specimen Databases from Taxonomic Revisions for Estimating and Mapping the Global Species Diversity of Invertebrates and Repatriating Reliable Specimen Data

    CONSERVATION BIOLOGY, Issue 2 2004
    RUDOLF MEIER
    We argue that the millions of specimen records published in thousands of taxonomic revisions over past decades are a critically important, cost-effective source of information for incorporating invertebrates into research and conservation decisions. More specifically, we demonstrate for a specimen database assembled during a revision of the robber-fly genus Euscelidia (Asilidae, Diptera) how nonparametric species richness estimators (Chao1, incidence-based coverage estimator, second-order jackknife) can be used to (1) estimate global species diversity, (2) direct future collecting to areas that are undersampled and/or likely to be rich in new species, and (3) assess whether the plant-based global biodiversity hotspots of Myers et al. (2000) contain a significant proportion of invertebrates. During the revision of Euscelidia, the number of known species more than doubled, but estimation of species richness revealed that the true diversity of the genus was likely twice as high. The same techniques applied to subsamples of the data indicated that much of the unknown diversity will be found in the Oriental region. Assessing the validity of biodiversity hotspots for invertebrates is a formidable challenge because it is difficult to decide whether species are hotspot endemics, and lists of observed species dramatically underestimate true diversity. Lastly, conservation biologists need a specimen database analogous to GenBank for collecting specimen records. Such a database has a three-fold advantage over information obtained from digitized museum collections: (1) it is shown for Euscelidia that a large proportion of unrevised museum specimens are misidentified; (2) only the specimen lists in revisionary studies cover a wide variety of private and public collections; and (3) obtaining specimen records from revisions is cost-effective. [source]
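Of the estimators named above, Chao1 is compact enough to state in full: observed richness plus a correction from the singleton and doubleton counts. A minimal sketch with invented specimen counts:

```python
def chao1(abundances):
    """Chao1 lower-bound estimate of total species richness.

    S_est = S_obs + f1^2 / (2 * f2), where f1 and f2 are the numbers of
    species observed exactly once (singletons) and twice (doubletons).
    The bias-corrected form is used when f2 == 0.
    """
    s_obs = sum(1 for n in abundances if n > 0)
    f1 = sum(1 for n in abundances if n == 1)
    f2 = sum(1 for n in abundances if n == 2)
    if f2 > 0:
        return s_obs + f1 * f1 / (2.0 * f2)
    return s_obs + f1 * (f1 - 1) / 2.0

# Hypothetical specimen counts per species from a revision database.
counts = [12, 8, 5, 3, 2, 2, 1, 1, 1, 1]
print(f"observed species: {sum(1 for n in counts if n > 0)}")
print(f"Chao1 estimate:   {chao1(counts):.1f}")
```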


    Estimating the spatiotemporal pattern of volumetric growth rate from fate maps in chick limb development

    DEVELOPMENTAL DYNAMICS, Issue 2 2009
    Yoshihiro Morishita
    Abstract Morphogenesis is achieved through volumetric growth of tissue at a rate varying over space and time. The volumetric growth rate of each piece of tissue reflects the behaviors of constituent cells such as cell proliferation and death. Hence, clarifying its spatiotemporal pattern accurately is a key to bridging between cell behaviors and organ morphogenesis. We here propose a new method to estimate the spatiotemporal pattern of volumetric growth rate from fate map data with limited resolution in space and time by using a mathematical model. We apply the method to chick wing data along the proximodistal axis, and find that the volumetric growth pattern is biphasic: it is spatially uniform in earlier stages (until stage 23), but in later stages volumetric growth occurs approximately 4.5 times faster in the distal region (within approximately 100 μm of the limb tip) than in the proximal region. Developmental Dynamics 238:415–422, 2009. © 2009 Wiley-Liss, Inc. [source]


    Estimating the mean speed of laminar overland flow using dye injection-uncertainty on rough surfaces

    EARTH SURFACE PROCESSES AND LANDFORMS, Issue 4 2001
    David Dunkerley
    Abstract A common method for estimating mean flow speeds in studies of surface runoff is to time the travel of a dye cloud across a measured flow path. Motion of the dye front reflects the surface flow speed, and a correction must be employed to derive a value for the profile mean speed, which is always lower. Whilst laminar flow conditions are widespread in the interrill zone, few data are available with which to establish the relationship linking surface and profile mean speeds, and there are virtually none for the flow range 100 < Re < 500 (Re = Reynolds number) which is studied here. In laboratory experiments on a glued sand board, mean flow speeds were estimated from both dye speeds and the volumetric flow relation v = Q/wd, with d measured using a computer-controlled needle gauge at 64 points. In order to simulate conditions applicable to many dryland soils, the board was also roughened with plant litter and with ceramic tiles (to simulate surface stone cover). Results demonstrate that in the range 100 < Re < 500, there is no consistent relation between surface flow speeds and the profile mean. The mean relationship is v = 0.56 v_surf, which departs significantly from the theoretical smooth-surface relation v = 0.67 v_surf, and exhibits a considerable scatter of values that show a dependence on flow depth. Given the inapplicability of any fixed conversion factor, and the dependence on flow depth, it is suggested that the use of dye timing as a method for estimating v be abandoned in favour of precision depth measurement and the use of the relation v = Q/wd, at least within the laminar flow range tested. Copyright © 2001 John Wiley & Sons, Ltd. [source]
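The two estimates compared in the study can be reproduced in a few lines. In the sketch below the flow values are invented; the correction factors 0.56 (empirical) and 0.67 (smooth-surface theory) come from the abstract:

```python
def profile_mean_from_dye(v_surface, factor=0.56):
    """Convert a dye-front (surface) speed to a profile mean speed."""
    return factor * v_surface

def profile_mean_volumetric(discharge, width, depth):
    """v = Q / (w d): the volumetric estimate the authors recommend."""
    return discharge / (width * depth)

# Hypothetical laminar flow on a small plot (SI units throughout).
v_dye = profile_mean_from_dye(v_surface=0.045)           # m/s
v_vol = profile_mean_volumetric(discharge=2.0e-4,        # m^3/s
                                width=0.5, depth=0.009)  # m
print(f"dye-based profile mean:  {v_dye:.4f} m/s")
print(f"volumetric profile mean: {v_vol:.4f} m/s")
```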


    Estimating the growth of a newly established moose population using reproductive value

    ECOGRAPHY, Issue 3 2007
    Bernt-Erik Sæther
    Estimating the population growth rate and environmental stochasticity of long-lived species is difficult because annual variation in population size is influenced by temporal autocorrelations caused by fluctuations in the age structure. Here we use the dynamics of the reproductive value to estimate the long-term growth rate s and the environmental variance of a moose population that recently colonized the island of Vega in northern Norway. We show that the population growth rate was high (s = 0.26). The major stochastic influences on the population dynamics were due to demographic stochasticity, whereas the environmental variance was not significantly different from 0. This supports the suggestion that population growth rates of polytocous ungulates are high, and that demographic stochasticity must be assessed when estimating the growth of small ungulate populations. [source]


    Estimating the number of alcohol-attributable deaths: methodological issues and illustration with French data for 2006

    ADDICTION, Issue 6 2010
    Grégoire Rey
    ABSTRACT Aims Computing the number of alcohol-attributable deaths requires a series of hypotheses. Using French data for 2006, the potential biases are reviewed and the sensitivity of estimates to various hypotheses evaluated. Methods Self-reported alcohol consumption data were derived from large population-based surveys. The risks of occurrence of diseases associated with alcohol consumption and relative risks for all-cause mortality were obtained through literature searches. All-cause and cause-specific population alcohol-attributable fractions (PAAFs) were calculated. In order to account for potential under-reporting, the impact of adjustment on sales data was tested. The 2006 mortality data were restricted to people aged between 15 and 75 years. Results When alcohol consumption distribution was adjusted for sales data, the estimated number of alcohol-attributable deaths, the sum of the cause-specific estimates, was 20 255. Without adjustment, the estimate fell to 7158. Using an all-cause mortality approach, the adjusted number of alcohol-attributable deaths was 15 950, while the non-adjusted estimate was a negative number. Other methodological issues, such as computation based on risk estimates for 'all countries' or only 'European countries', also influenced the results, but to a lesser extent. Discussion The estimates of the number of alcohol-attributable deaths varied greatly, depending upon the hypothesis used. The most realistic and evidence-based estimate seems to be obtained by adjusting the consumption data for national alcohol sales, and by summing the cause-specific estimates. However, interpretation of the estimates must be cautious in view of their potentially large imprecision. [source]
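The cause-specific computation described above rests on the standard population attributable fraction formula. The sketch below applies it with invented prevalences and relative risks to show how rescaling self-reported consumption toward sales data can move the estimate substantially:

```python
import numpy as np

def paaf(prevalence, relative_risk):
    """Population alcohol-attributable fraction for one cause of death.

    PAAF = sum_i p_i (RR_i - 1) / (1 + sum_i p_i (RR_i - 1)),
    with i indexing consumption categories and p_i their prevalence.
    """
    excess = np.sum(np.asarray(prevalence) * (np.asarray(relative_risk) - 1.0))
    return excess / (1.0 + excess)

# Hypothetical prevalences for three drinking levels, before and after
# rescaling self-reported consumption to match national sales data.
rr = [1.2, 1.8, 3.5]             # illustrative relative risks
p_reported = [0.30, 0.10, 0.02]
p_adjusted = [0.28, 0.14, 0.06]  # the adjustment shifts mass upward

deaths_from_cause = 5000
for label, p in [("self-report", p_reported), ("sales-adjusted", p_adjusted)]:
    frac = paaf(p, rr)
    print(f"{label:15s} PAAF = {frac:.3f} -> {frac * deaths_from_cause:.0f} deaths")
```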


    Estimating the Technology of Cognitive and Noncognitive Skill Formation

    ECONOMETRICA, Issue 3 2010
    Flavio Cunha
    This paper formulates and estimates multistage production functions for children's cognitive and noncognitive skills. Skills are determined by parental environments and investments at different stages of childhood. We estimate the elasticity of substitution between investments in one period and stocks of skills in that period to assess the benefits of early investment in children compared to later remediation. We establish nonparametric identification of a general class of production technologies based on nonlinear factor models with endogenous inputs. A by-product of our approach is a framework for evaluating childhood and schooling interventions that does not rely on arbitrarily scaled test scores as outputs and recognizes the differential effects of the same bundle of skills in different tasks. Using the estimated technology, we determine optimal targeting of interventions to children with different parental and personal birth endowments. Substitutability decreases in later stages of the life cycle in the production of cognitive skills. It is roughly constant across stages of the life cycle in the production of noncognitive skills. This finding has important implications for the design of policies that target the disadvantaged. For most configurations of disadvantage it is optimal to invest relatively more in the early stages of childhood than in later stages. [source]
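The elasticity of substitution between investments and skill stocks is typically parameterized through a CES technology in this literature. A minimal sketch with illustrative parameter values (not the paper's estimates), showing how low substitutability limits later remediation:

```python
def next_skill(skill, investment, gamma=0.5, phi=-0.5):
    """One stage of a CES skill technology, a form common in this literature.

    theta' = [gamma * I^phi + (1 - gamma) * theta^phi]^(1/phi);
    the elasticity of substitution between investment and the current skill
    stock is 1 / (1 - phi), so very negative phi means early skill deficits
    are hard to remediate with later investment. Values are illustrative.
    """
    return (gamma * investment**phi + (1.0 - gamma) * skill**phi) ** (1.0 / phi)

# A child with a low skill stock receives a large late investment:
for phi in (0.5, -0.5, -3.0):
    elasticity = 1.0 / (1.0 - phi)
    out = next_skill(skill=0.4, investment=2.0, phi=phi)
    print(f"phi={phi:5.1f}  elasticity={elasticity:.2f}  next-stage skill={out:.3f}")
```

As phi falls, the output is pinned down by the low skill stock regardless of the investment, which is the mechanism behind the paper's conclusion that early investment dominates later remediation for cognitive skills.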


    Solving, Estimating, and Selecting Nonlinear Dynamic Models Without the Curse of Dimensionality

    ECONOMETRICA, Issue 2 2010
    Viktor Winschel
    We present a comprehensive framework for Bayesian estimation of structural nonlinear dynamic economic models on sparse grids to overcome the curse of dimensionality for approximations. We apply sparse grids to a global polynomial approximation of the model solution, to the quadrature of integrals arising as rational expectations, and to three new nonlinear state space filters which speed up the sequential importance resampling particle filter. The posterior of the structural parameters is estimated by a new Metropolis-Hastings algorithm with mixing parallel sequences. The parallel extension improves the global maximization property of the algorithm, simplifies the parameterization for an appropriate acceptance ratio, and allows a simple implementation of the estimation on parallel computers. Finally, we provide all algorithms in the open source software JBendge for the solution and estimation of a general class of models. [source]


    Estimating the Effects of a Time-Limited Earnings Subsidy for Welfare-Leavers

    ECONOMETRICA, Issue 6 2005
    David Card
    In the Self Sufficiency Project (SSP) welfare demonstration, members of a randomly assigned treatment group could receive a subsidy for full-time work. The subsidy was available for 3 years, but only to people who began working full time within 12 months of random assignment. A simple optimizing model suggests that the eligibility rules created an "establishment" incentive to find a job and leave welfare within a year of random assignment, and an "entitlement" incentive to choose work over welfare once eligibility was established. Building on this insight, we develop an econometric model of welfare participation that allows us to separate the two effects and estimate the impact of the earnings subsidy on welfare entry and exit rates among those who achieved eligibility. The combination of the two incentives explains the time profile of the experimental impacts, which peaked 15 months after random assignment and faded relatively quickly. Our findings suggest that about half of the peak impact of SSP was attributable to the establishment incentive. Despite the extra work effort generated by SSP, the program had no lasting impact on wages and little or no long-run effect on welfare participation. [source]


    Estimating the Fractional Order of Integration of Yields in the Brazilian Fixed Income Market

    ECONOMIC NOTES, Issue 3 2007
    Benjamin M. Tabak
    This paper presents evidence that yields on the Brazilian fixed income market are fractionally integrated, and compares the period before and after the implementation of the Inflation Targeting (IT) regime. The paper employs the commonly used GPH estimator and recently developed wavelets-based estimator of long memory. Empirical results suggest that interest rates are fractionally integrated and that interest rate spreads are fractionally integrated, with a higher order of integration in the period after the implementation of the IT regime. These results have important implications for the development of macroeconomic models for the Brazilian economy and for long-term forecasting. Furthermore, they imply that shocks to interest rates are long-lived. [source]
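The GPH estimator mentioned above is a log-periodogram regression: regress the log periodogram at low Fourier frequencies on −log(4 sin²(ω/2)); the slope estimates the memory parameter d. A minimal sketch, checked on a series simulated from truncated ARFIMA(0, d, 0) moving-average weights:

```python
import numpy as np

def gph_estimate(x, m=None):
    """Geweke-Porter-Hudak log-periodogram estimate of the memory parameter d.

    Regress log I(w_j) on -log(4 sin^2(w_j / 2)) at the first m Fourier
    frequencies; the slope estimates d. m defaults to sqrt(n).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    if m is None:
        m = int(np.sqrt(n))
    w = 2.0 * np.pi * np.arange(1, m + 1) / n
    fourier = np.fft.fft(x - x.mean())[1:m + 1]
    periodogram = (np.abs(fourier) ** 2) / (2.0 * np.pi * n)
    regressor = -np.log(4.0 * np.sin(w / 2.0) ** 2)
    slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
    return slope

# Simulate an ARFIMA(0, d, 0) series via truncated MA(inf) weights
# psi_k = Gamma(k + d) / (Gamma(d) Gamma(k + 1)), psi_0 = 1.
rng = np.random.default_rng(2)
d_true, n, lags = 0.3, 4096, 1000
k = np.arange(1, lags + 1)
psi = np.concatenate(([1.0], np.cumprod((k - 1 + d_true) / k)))
eps = rng.normal(size=n + lags)
x = np.convolve(eps, psi, mode="valid")
print(f"true d = {d_true}, GPH estimate = {gph_estimate(x):.3f}")
```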


    Estimating the Impact of a Policy Reform on Benefit Take-up: The 2001 extension to the Minimum Income Guarantee for UK Pensioners

    ECONOMICA, Issue 306 2010
    FRANCESCA ZANTOMIO
    In 2001 the Minimum Income Guarantee for UK pensioners was reformed, changing the structure and level of benefits. We evaluate the behavioural response to this reform, using nonparametric analysis comparing a sample of pensioners interviewed before and another interviewed after the reform, matching their simulated pre- and post-reform entitlements and other characteristics. We compare the results with conventional parametric methods and also ex ante matching, and we consider the effect of measurement error in simulated entitlements. The response of take-up to the reform is found to be significant and positive, with evidence of larger impacts from the nonparametric analysis. [source]


    Estimating the Intergenerational Correlation of Incomes: An Errors-in-Variables Framework

    ECONOMICA, Issue 273 2002
    Ramses H. Abul Naga
    Because the permanent incomes of parents and children are typically unobservable, the permanent income of the parent family is taken to be a latent variable, but it is assumed that a model for its determinants is known to the researcher. I propose two related estimators for the intergenerational correlation: a 2SLS procedure and a more efficient MIMIC estimator. MIMIC also provides estimates of the variance parameters required to evaluate the bias of the OLS estimator. Using US data, I provide estimates for the intergenerational correlation ranging between 0.30 and 0.78. The bias of the OLS estimator is calculated to be in the order of 40%. [source]
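The OLS bias evaluated in the paper is the classical errors-in-variables attenuation: OLS shrinks the true coefficient by the reliability ratio λ = var(signal)/var(signal + noise). A simulation sketch with invented variances, chosen so the bias lands near the 40% order reported:

```python
import numpy as np

rng = np.random.default_rng(3)

n, beta_true = 100_000, 0.5
permanent = rng.normal(size=n)                        # latent parental permanent income
observed = permanent + rng.normal(scale=0.8, size=n)  # noisy current-income proxy
child = beta_true * permanent + rng.normal(scale=0.7, size=n)

beta_ols = np.cov(child, observed)[0, 1] / np.var(observed)
reliability = 1.0 / (1.0 + 0.8**2)                    # lambda = 1 / (1 + noise var)
print(f"OLS estimate:                      {beta_ols:.3f}")
print(f"predicted attenuation beta*lambda: {beta_true * reliability:.3f}")
print(f"implied OLS bias:                  {1 - beta_ols / beta_true:.1%}")
```

With these invented variances the reliability ratio is about 0.61, so OLS understates the true coefficient by roughly 40%, which is the magnitude of bias the abstract reports correcting via 2SLS and MIMIC.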


    Estimating the burden of disease attributable to illicit drug use and mental disorders: what is 'Global Burden of Disease 2005' and why does it matter?

    ADDICTION, Issue 9 2009
    Louisa Degenhardt
    ABSTRACT Background The estimated impact of illicit drug use and mental disorders upon population health needs to be understood because there is evidence that they produce substantial loss of life and disability, and information is needed on the comparative population health impact of different diseases and risk factors to help focus policy, service and research planning and execution. Aims To provide an overview of a global project, running since the end of 2007: Global Burden of Disease (GBD) 2005. Methods The new GBD aims to update comprehensively the findings of the first GBD exercise. It aims to provide regional and global estimates of the burden of disease attributable to hundreds of diseases, injuries and their risk factors. Groups have been assembled to provide expert advice on the parameters needed to inform these estimates; here, we provide a brief summary of the broad range of work being undertaken by the group examining illicit drug use and mental disorders. Discussion The estimates of the contribution of mental disorders and illicit drugs to GBD will inform and potentially shape the focus of researchers, clinicians and governments in the years to come. We hope that interested readers might be encouraged to submit new data or feedback on the work completed thus far, as well as the work that is still under way and yet to be completed. [source]


    Estimating the probability of bird mortality from pesticide sprays on the basis of the field study record

    ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 7 2002
    Pierre Mineau
    Abstract The outcome of avian field studies was examined to model the likelihood of mortality. The data were divided into clusters reflecting the type of pesticide application and bird guilds present on site. Logistic regression was used to model the probability of a bird kill. Four independent variables were tested for their explanatory power: a variable reflecting acute oral toxicity and application rate; a variable reflecting the relative oral to dermal toxicity of the pesticides; Henry's law constant; and a variable reflecting possible avoidance of contaminated food items, the hazard factor (HF). All variables except for HF significantly improved model prediction. The relative dermal to oral toxicity, especially, was shown to have a major influence on field outcome and clearly must be incorporated into future avian risk assessments. The probability of avian mortality could be calculated from a number of current pesticide applications and the conclusion was made that avian mortality occurs regularly and frequently in agricultural fields. [source]
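The modelling strategy, a logistic regression of a kill/no-kill field outcome on toxicity-related covariates, can be sketched as follows. The data and coefficients are simulated, and the covariates only loosely mirror the variables named in the abstract:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

# Simulated field-study records (all invented): one row per application.
n = 400
tox_rate = rng.normal(size=n)     # acute oral toxicity combined with application rate
dermal_oral = rng.normal(size=n)  # relative dermal-to-oral toxicity
henry = rng.normal(size=n)        # log Henry's law constant
logit = -0.5 + 1.2 * tox_rate + 0.9 * dermal_oral + 0.4 * henry
kill = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))  # bird kill observed?

X = sm.add_constant(np.column_stack([tox_rate, dermal_oral, henry]))
fit = sm.Logit(kill, X).fit(disp=False)
print(fit.params)          # recovered coefficients
print(fit.predict(X[:3]))  # predicted kill probabilities for three applications
```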


    Estimating the number of ozone peaks in Mexico City using a non-homogeneous Poisson model

    ENVIRONMETRICS, Issue 5 2008
    Jorge A. Achcar
    Abstract In this paper, we consider the problem of estimating the number of times an air quality standard is exceeded in a given period of time. A non-homogeneous Poisson model is proposed to analyse this issue. The rate at which the Poisson events occur is given by a rate function λ(t), t ≥ 0. This rate function also depends on some parameters that need to be estimated. Two forms of λ(t), t ≥ 0 are considered: one of the Weibull form and the other of the exponentiated-Weibull form. Parameter estimation is performed using a Bayesian formulation based on the Gibbs sampling algorithm. The prior distributions for the parameters are assigned in two stages. In the first stage, non-informative prior distributions are considered. Using the information provided by the first stage, more informative prior distributions are used in the second one. The theoretical development is applied to data provided by the monitoring network of Mexico City. The rate function that best fits the data varies according to the region of the city and/or the threshold considered. In some cases the best fit is the Weibull form and in other cases the best option is the exponentiated-Weibull. Copyright © 2007 John Wiley & Sons, Ltd. [source]
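A non-homogeneous Poisson process with a Weibull rate is straightforward to simulate by thinning, and the expected number of exceedances is the integrated rate. The sketch below uses invented parameters and does not reproduce the paper's Gibbs-sampling estimation:

```python
import numpy as np

rng = np.random.default_rng(5)

def weibull_rate(t, alpha, beta):
    """lambda(t) = (beta / alpha) * (t / alpha)^(beta - 1), t >= 0."""
    return (beta / alpha) * (t / alpha) ** (beta - 1.0)

def simulate_nhpp(rate_fn, t_max, rate_max):
    """Ogata thinning: simulate event times of a non-homogeneous Poisson process."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate_max)  # candidate from a dominating HPP
        if t > t_max:
            return np.array(times)
        if rng.uniform() < rate_fn(t) / rate_max:  # accept with prob lambda(t)/max
            times.append(t)

# Hypothetical parameters: increasing exceedance rate over one year (beta > 1,
# so the rate is maximal at t_max, which gives a valid dominating rate).
alpha, beta, t_max = 30.0, 1.4, 365.0
events = simulate_nhpp(lambda t: weibull_rate(t, alpha, beta), t_max,
                       rate_max=weibull_rate(t_max, alpha, beta))
expected = (t_max / alpha) ** beta  # integral of lambda(t) over [0, t_max]
print(f"expected exceedances: {expected:.1f}, simulated: {events.size}")
```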


    Estimating the unknown change point in the parameters of the lognormal distribution

    ENVIRONMETRICS, Issue 2 2007
    V. K. Jandhyala
    Abstract We develop change-point methodology for identifying dynamic trends in the parameters of a two-parameter lognormal distribution. The methodology primarily considers the asymptotic distribution of the maximum likelihood estimate of the unknown change point. Among other things, the asymptotic distribution enables one to construct confidence interval estimates for the unknown change point. The methodology is applied to identify changes in the monthly water discharges of the Nacetinsky Creek in the German part of the Erzgebirge Mountains. Copyright © 2006 John Wiley & Sons, Ltd. [source]
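The maximum likelihood change point can be located by profiling the split lognormal likelihood over candidate change points. A minimal sketch on simulated discharges (not the Nacetinsky Creek data), omitting the asymptotic-distribution machinery the paper develops:

```python
import numpy as np

rng = np.random.default_rng(6)

def lognormal_loglik(logx):
    """Profiled Gaussian log-likelihood of log-data (constants dropped)."""
    n = logx.size
    sigma2 = logx.var()
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)

def change_point_mle(x, min_seg=5):
    """Maximize the split log-likelihood over all admissible change points."""
    logx = np.log(x)
    n = logx.size
    scores = [lognormal_loglik(logx[:k]) + lognormal_loglik(logx[k:])
              for k in range(min_seg, n - min_seg)]
    return min_seg + int(np.argmax(scores))

# Simulated monthly discharges: both lognormal parameters shift after month 120.
x = np.concatenate([rng.lognormal(mean=1.0, sigma=0.4, size=120),
                    rng.lognormal(mean=1.4, sigma=0.6, size=120)])
print(f"estimated change point: month {change_point_mle(x)}")
```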


    Estimating the Costs of Epilepsy: An International Comparison of Epilepsy Cost Studies

    EPILEPSIA, Issue 5 2001
    Irene A. W. Kotsopoulos
    Summary: Purpose: To compare systematically the national and per capita estimates of the cost of epilepsy in different countries. Methods: Studies for this literature review were selected by conducting a Medline literature search from January 1966 to March 2000. Key methodologic, country-related, and monetary issues of the selected epilepsy cost studies were evaluated to compare their direct cost estimates and to explore their distribution. The results of the selected studies were made comparable by converting them with different types of conversion factors and expressing them as a proportion of the national expenditure on health care. Results: Ten epilepsy cost studies were reviewed. The proportion of national health care expenditure on epilepsy shows a range of 0.12–1.12% or 0.12–1.05%, depending on the type of conversion factor. The list of cost components included in the estimation of the direct costs of epilepsy differs from study to study. A comprehensive list is associated with a decrease in the contribution of drug and hospital costs to the total direct costs of epilepsy. Conclusions: This study highlights the importance of studying the economic consequences of epilepsy and of interpreting the results on the international level. The results of epilepsy cost studies can provide insight into the distribution of the costs of epilepsy and the impact of epilepsy on the national expenditure on health care. [source]


    Estimating the minimum distance of large-block turbo codes using iterative multiple-impulse methods

    EUROPEAN TRANSACTIONS ON TELECOMMUNICATIONS, Issue 5 2007
    Stewart Crozier
    A difficult problem for turbo codes is the efficient and accurate determination of the distance spectrum, or even just the minimum distance, for specific interleavers. This is especially true for large blocks, with many thousands of data bits, if the distance is high. This paper compares a number of recent distance estimation techniques and introduces a new approach, based on using specific event impulse patterns and iterative processing, that is specifically tailored to handle long interleavers with high spread. The new method is as reliable as two previous iterative multiple-impulse methods, but with much lower complexity. A minimum distance of 60 has been estimated for a rate-1/3, 8-state turbo code with a dithered relative prime (DRP) interleaver of length K = 65,536. Copyright © 2007 John Wiley & Sons, Ltd. [source]