Normal Distribution (normal + distribution)

Kinds of Normal Distribution

  • multivariate normal distribution


  • Selected Abstracts


    The Mean of the Inverse of a Punctured Normal Distribution and Its Application

    BIOMETRICAL JOURNAL, Issue 4 2004
    C. D. Lai
    Abstract The fundamental properties of a punctured normal distribution are studied. The results are applied to three issues concerning X/Y where X and Y are independent normal random variables with means μX and μY respectively. First, estimation of μX/μY as a surrogate for E(X/Y) is justified; then the reason for preferring a weighted average over an arithmetic average as an estimator of μX/μY is given. Finally, an approximate confidence interval for μX/μY is provided. A grain yield data set is used to illustrate the results. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
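A quick way to see why μX/μY is used as a surrogate for E(X/Y) is to simulate the ratio directly. The sketch below uses made-up means and standard deviations (not values from the paper) and plain NumPy.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters (not from the paper): X ~ N(10, 1), Y ~ N(5, 0.5)
mu_x, sigma_x = 10.0, 1.0
mu_y, sigma_y = 5.0, 0.5

x = rng.normal(mu_x, sigma_x, size=100_000)
y = rng.normal(mu_y, sigma_y, size=100_000)

ratio = x / y
print("ratio of means mu_X/mu_Y:", mu_x / mu_y)   # 2.0
print("mean of sampled ratios  :", ratio.mean())  # close to 2.0 here,
# but E(X/Y) is not finite in general because Y has positive density near 0;
# a "punctured" normal for Y (zero excluded) is what makes the surrogate workable.
```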


    Punctuated Equilibrium in Comparative Perspective

    AMERICAN JOURNAL OF POLITICAL SCIENCE, Issue 3 2009
    Frank R. Baumgartner
    We explore the impact of institutional design on the distribution of changes in outputs of governmental processes in the United States, Belgium, and Denmark. Using comprehensive indicators of governmental actions over several decades, we show that in each country the level of institutional friction increases as we look at processes further along the policy cycle. Assessing multiple policymaking institutions in each country allows us to control for the nature of the policy inputs, as all the institutions we consider cover the full range of social and political issues in the country. We find that all distributions exhibit high kurtosis values, significantly higher than the Normal distribution, which would be expected if changes in government attention and activities were proportionate to changes in social inputs. Further, in each country, those institutions that impose higher decision-making costs show progressively higher kurtosis values. The results suggest general patterns that we hypothesize to be related to boundedly rational behavior in a complex social environment. [source]
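The kurtosis comparison behind this argument is easy to reproduce on synthetic data. The sketch below contrasts normally distributed changes with a heavy-tailed alternative, using scipy's excess-kurtosis convention (0 for the normal distribution); the data are illustrative, not the authors' series.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)

# Hypothetical year-to-year changes in some governmental output series.
normal_changes = rng.normal(0, 1, 10_000)        # proportionate response to inputs
leptokurtic = rng.standard_t(df=3, size=10_000)  # heavy-tailed "friction" response

# scipy reports excess kurtosis: 0 for the normal distribution, > 0 for fat tails.
print(kurtosis(normal_changes))  # ~0
print(kurtosis(leptokurtic))     # well above 0: many tiny changes plus rare large punctuations
```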


    New reference for the age at childhood onset of growth and secular trend in the timing of puberty in Swedish

    ACTA PAEDIATRICA, Issue 6 2000
    YX Liu
    The objectives of the present work were to present a new reference for the age at childhood onset of growth and to investigate the secular trend in the timing of puberty in a community-based normal population in Sweden. A total of 2432 children with longitudinal length/height data from birth to adulthood were used to determine the two measures by visual inspection of the measured attained length/height and the change in growth velocity displayed on a computer-generated infancy-childhood-puberty (ICP) based growth chart. The series represents a sample of normal full-term children born around 1974 in Göteborg, Sweden. We found about 10% of children were delayed (>12 mo of age) in the childhood onset of growth based on the previously reported normal range, i.e. 14% in boys and 8% in girls. Distribution of the age at childhood onset of growth was skewed. The medians were 10 and 9 mo for boys and girls, respectively. After natural logarithmic transformation, the mean and standard deviation (SD) were 2.29 (anti-log 9.9 mo) and 0.226 for boys, 2.23 (anti-log 9.3 mo) and 0.220 for girls, respectively. The 95% normal ranges were 6.3-15.4 mo and 6.0-14.3 mo for boys and girls, respectively. The distribution of the timing of peak height velocity (PHV) was close to the normal distribution. The mean values were 13.5 y for boys and 11.6 y for girls with 1 y SD for both sexes. Conclusion: A downward secular trend in the onset of puberty was clearly shown in the population. The age at childhood onset of growth did not correlate with the timing of puberty (r = -0.01 and 0.05, p > 0.7 and 0.1 in boys and girls, respectively). Normal ranges of the age at childhood onset of growth are in need of revision, as this study indicates. The new reference presented here could be a reliable indicator in further studies. [source]
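The reported log-scale means and SDs reproduce the published medians and 95% ranges under the stated log-normal assumption, as this short check shows (values taken from the abstract; the 1.96 multiplier is the usual two-sided normal quantile).

```python
import numpy as np

# Reported log-scale statistics (natural log of age in months at childhood onset of growth).
stats = {"boys": (2.29, 0.226), "girls": (2.23, 0.220)}

for sex, (mean_log, sd_log) in stats.items():
    median = np.exp(mean_log)  # anti-log of the mean
    lo, hi = np.exp(mean_log - 1.96 * sd_log), np.exp(mean_log + 1.96 * sd_log)
    print(f"{sex}: median {median:.1f} mo, 95% range {lo:.1f}-{hi:.1f} mo")

# boys : median 9.9 mo, 95% range 6.3-15.4 mo
# girls: median 9.3 mo, 95% range 6.0-14.3 mo   (matches the reported ranges)
```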


    Experimental study on the extraction and distribution of textual domain keywords

    CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 16 2008
    Xiangfeng Luo
    Abstract Domain keywords of text play a primary role in text classification, clustering and personalized services. This paper proposes a term frequency inverse document frequency (TFIDF) based method called TDDF (TFIDF direct document frequency of domain) to extract domain keywords from multi-texts. First, we discuss the optimal parameters of TFIDF, which are used to extract textual keywords and domain keywords. Second, TDDF is proposed to extract domain keywords from multi-texts, taking the document frequency of the domain into account. Finally, the distribution of domain keywords over scientific texts is studied. Experiments and applications show that TDDF is more effective than the optimal TFIDF in the extraction of domain keywords. Domain keywords follow a normal distribution within a single text once the ubiquitous domain keywords are removed. Copyright © 2008 John Wiley & Sons, Ltd. [source]


    The Distribution of Follicular Units in the Chinese Scalp: Implications for Reconstruction of Natural-Appearing Hairlines in Orientals

    DERMATOLOGIC SURGERY, Issue 6 2002
    Ren-Yeu Tsai MD
    Background. Follicular transplantation using hair in its naturally occurring groups, called follicular units (FUs), has become the most popular technique in hair restoration surgery. Recently follicular transplantation was performed with a qualitative and quantitative concept to achieve the best clinical result. The characteristics and distribution of FUs are well studied in Caucasians and widely applied in hair transplantation surgery. Objective. In order to understand the normal distribution of FUs in the Chinese scalp, we counted the number of hairs and FUs in normal Chinese scalp to provide general information for surgical planning and design in bald Chinese patients. Methods. A total of 50 normal and 50 bald Chinese adults were enrolled to count the hairs on their scalp. One hundred bald patients receiving hairline reconstruction were also prospectively and quantitatively evaluated. Results. In normal Chinese scalp, an average of 71.78 FUs/cm2 and 137.08 hairs/cm2 were calculated, with a follicular density of 1.91 hairs/FU. Two-hair FUs were the predominant group (50.29%). In bald patients, an average of 68.07 FUs/cm2 was found, which was less than that of the occipital scalp in normal nonbald patients. In reconstruction of the frontal hairline, a total of 700-1000 FUs were implanted with an average density of 30 FUs/cm2. Conclusion. We found the average number of FUs (0.72 FU/mm2) was less than that in Caucasian patients (1 FU/mm2). The average density of 30 FUs/cm2 implanted was suitable to reconstruct a natural frontal hairline in bald Chinese patients, which can achieve about 40% of normal hair density. Our results could provide the hair surgeon with general information about hair distribution on the Chinese scalp for surgical planning and design in their patients. [source]
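The headline densities in this abstract are internally consistent, as a few lines of arithmetic confirm (numbers taken from the abstract; the 40% figure is the implanted density divided by the normal FU density).

```python
fu_per_cm2 = 71.78       # follicular units per cm^2 in normal Chinese scalp
hairs_per_fu = 1.91      # follicular density
implant_density = 30.0   # FUs/cm^2 implanted at the frontal hairline

print(fu_per_cm2 * hairs_per_fu)     # ~137.1 hairs/cm^2, matching the reported 137.08
print(implant_density / fu_per_cm2)  # ~0.42, i.e. roughly 40% of the normal FU density
print(fu_per_cm2 / 100.0)            # 0.72 FU/mm^2 (1 cm^2 = 100 mm^2)
```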


    Small-scale variability in surface moisture on a fine-grained beach: implications for modeling aeolian transport

    EARTH SURFACE PROCESSES AND LANDFORMS, Issue 10 2009
    Brandon L. Edwards
    Abstract Small-scale variations in surface moisture content were measured on a fine-grained beach using a Delta-T Theta probe. The resulting data set was used to examine the implications of small-scale variability for estimating aeolian transport potential. Surface moisture measurements were collected on a 40 cm × 40 cm grid at 10 cm intervals, providing a total of 25 measurements for each grid data set. A total of 44 grid data sets were obtained from a representative set of beach sub-environments. Measured moisture contents ranged from about 0% (dry) to 25% (saturated), by weight. The moisture content range within a grid data set was found to vary from less than 1% to almost 15%. The magnitude of within-grid variability varied consistently with the mean moisture content of the grid sets, following an approximately normal distribution. Both very wet and very dry grid data sets exhibited little internal variability in moisture content, while intermediate moisture contents were associated with higher levels of variability. Thus, at intermediate moisture contents it was apparent that some portions of the beach surface could be dry enough to allow aeolian transport (i.e. moisture content is below the critical threshold), while adjacent portions are too wet for transport to occur. To examine the implications of this finding, cumulative distribution functions were calculated to model the relative proportions of beach surface area expected to be above or below specified threshold moisture levels (4%, 7%, and 14%). It was found that the implicit inclusion of small-scale variability in surface moisture levels typically resulted in changes of less than 1% in the beach area available for transport, suggesting that this parameter can be ignored at larger spatial scales. Copyright © 2009 John Wiley & Sons, Ltd. [source]
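The cumulative-distribution step described in the abstract amounts to evaluating a normal CDF at each moisture threshold. A minimal sketch, using a hypothetical grid mean and SD rather than the paper's measurements:

```python
from scipy.stats import norm

# Hypothetical grid: mean moisture 6% with within-grid SD of 2% (illustrative values only).
mean_moisture, sd_moisture = 6.0, 2.0

for threshold in (4.0, 7.0, 14.0):  # thresholds used in the study (% by weight)
    frac_below = norm.cdf(threshold, loc=mean_moisture, scale=sd_moisture)
    print(f"area with moisture below {threshold:>4.1f}%: {frac_below:.1%}")
```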


    Multi-phase evolution of gnammas (weathering pits) in a Holocene deglacial granite landscape, Minnesota (USA)

    EARTH SURFACE PROCESSES AND LANDFORMS, Issue 2 2008
    David Domínguez-Villar
    Abstract The morphometry of 85 gnammas (weathering pits) from Big Stone County in western Minnesota allows the assessment of the relative ages of the gnamma population. The ratio between maximum and minimum depths is independent of the initial size of the cavity and only depends on the weathering evolution. Therefore, the distribution of depth ratios can be used to assess the gnamma population age and the history of weathering. The asymmetrical distribution of depth ratios measured in Big Stone County forms three distinct populations. When these sets are analyzed independently, the correlation (r2) between maximum and minimum depths is greater than 0·95. Each single population has a normal distribution of depth ratios, and the average depth ratios for the three populations are 1·60 ± 0·05, 2·09 ± 0·04 and 2·42 ± 0·08, respectively. The initiation of gnamma formation followed the exhumation of the granite in the region. This granite was covered by till and saprolite upon retreat of the ice from the Last Glacial Maximum. Nearby outcrops are striated, but the study site remained buried until it was exhumed by paleofloods issuing from a proglacial lake. These Holocene-aged gnammas in western Minnesota were compared with gnammas of other ages from around the world. Our new results are in accordance with the hypothesis that these average depth ratios represent the evolution of gnammas with time under temperate- to cold-climate dynamics. Phases of the formation of new gnammas may result from changes in weathering processes related to climate changes. Copyright © 2007 John Wiley & Sons, Ltd. [source]


    Decision Theory Applied to an Instrumental Variables Model

    ECONOMETRICA, Issue 3 2007
    Gary Chamberlain
    This paper applies some general concepts in decision theory to a simple instrumental variables model. There are two endogenous variables linked by a single structural equation; k of the exogenous variables are excluded from this structural equation and provide the instrumental variables (IV). The reduced-form distribution of the endogenous variables conditional on the exogenous variables corresponds to independent draws from a bivariate normal distribution with linear regression functions and a known covariance matrix. A canonical form of the model has parameter vector (β, ρ, ω), where β is the parameter of interest and is normalized to be a point on the unit circle. The reduced-form coefficients on the instrumental variables are split into a scalar parameter ρ and a parameter vector ω, which is normalized to be a point on the (k-1)-dimensional unit sphere; ρ measures the strength of the association between the endogenous variables and the instrumental variables, and ω is a measure of direction. A prior distribution is introduced for the IV model. The parameters β, ρ, and ω are treated as independent random variables. The distribution for β is uniform on the unit circle; the distribution for ω is uniform on the unit sphere with dimension k-1. These choices arise from the solution of a minimax problem. The prior for ρ is left general. It turns out that given any positive value for ρ, the Bayes estimator of β does not depend on ρ; it equals the maximum-likelihood estimator. This Bayes estimator has constant risk; because it minimizes average risk with respect to a proper prior, it is minimax. The same general concepts are applied to obtain confidence intervals. The prior distribution is used in two ways. The first way is to integrate out the nuisance parameter ω in the IV model. This gives an integrated likelihood function with two scalar parameters, β and ρ. Inverting a likelihood ratio test, based on the integrated likelihood function, provides a confidence interval for β. This lacks finite sample optimality, but invariance arguments show that the risk function depends only on β and not on ρ or ω. The second approach to confidence sets aims for finite sample optimality by setting up a loss function that trades off coverage against the length of the interval. The automatic uniform priors are used for β and ω, but a prior is also needed for the scalar ρ, and no guidance is offered on this choice. The Bayes rule is a highest posterior density set. Invariance arguments show that the risk function depends only on β and not on ρ or ω. The optimality result combines average risk and maximum risk. The confidence set minimizes the average, with respect to the prior distribution for ρ, of the maximum risk, where the maximization is with respect to β and ω. [source]


    Heart Rate Response to Intravenous Catheter Placement

    ACADEMIC EMERGENCY MEDICINE, Issue 9 2003
    Joel M. Bartfield MD
    Abstract Objective: To investigate the relationship between change in heart rate and pain and anxiety caused by intravenous catheter (IV) placement. Methods: An observational study was performed in a university-based tertiary care emergency department. Patients who required IV placement as part of their management were considered as possible subjects. Heart rates were recorded at the following times: baseline, tourniquet placement, and IV placement. Immediately after IV placement, subjects recorded pain and anxiety scores using 100-mm visual analog scales. Percentage change in heart rate (compared with baseline) was calculated at time of tourniquet placement (anxiety) and IV placement (pain). Simple linear regression analyses were performed comparing pain scores with percent change in heart rate at the time of IV and tourniquet placement. Significance was defined as p < 0.05. Results: Ninety subjects were enrolled. Subjects had a mean age of 48 years, and 54% were women. There was a normal distribution of heart rate changes, with greater than 80% of all subjects having a 10% or less change in heart rates. The results of the analysis of pain scores versus percentage change in heart rate at IV placement yielded a Pearson correlation coefficient of 0.13 (p = 0.2). The results of the analysis of anxiety scores versus percentage change in heart rate at tourniquet placement yielded a Pearson correlation coefficient of 0.014 (p = 0.9). Conclusions: Changes in heart rate do not correlate with pain and anxiety associated with IV placement. [source]


    Skew-symmetric distributions generated by the distribution function of the normal distribution

    ENVIRONMETRICS, Issue 4 2007
    Héctor W. Gómez
    Abstract In this paper we study a general family of skew-symmetric distributions which are generated by the cumulative distribution of the normal distribution. For some distributions, moments are computed, which allows the asymmetry and kurtosis coefficients to be obtained. It is shown that the range for the asymmetry and kurtosis parameters is wider than for the family of models introduced by Nadarajah and Kotz (2003). For the skew-t-normal model, we discuss approaches for obtaining maximum likelihood estimators and derive the Fisher information matrix, discussing some of its properties and special cases. We report results of an application to a real data set related to nickel concentration in soil samples. Copyright © 2006 John Wiley & Sons, Ltd. [source]
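The generating mechanism named in the title, multiplying a symmetric density by the normal distribution function, is compact enough to sketch. The code below shows the basic skew-normal member of the family; my understanding is that the paper's skew-t-normal variant swaps the normal kernel for a Student-t density, so treat that reading as an assumption rather than a quoted formula.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def skew_normal_pdf(x, shape):
    """Skew-symmetric density generated by the normal cdf: 2 * phi(x) * Phi(shape * x)."""
    return 2.0 * norm.pdf(x) * norm.cdf(shape * x)

x = np.linspace(-4, 4, 9)
print(skew_normal_pdf(x, shape=0.0))  # shape 0 recovers the standard normal density
print(skew_normal_pdf(x, shape=3.0))  # positive shape pushes mass to the right

# The density integrates to 1 for any shape parameter:
print(quad(skew_normal_pdf, -np.inf, np.inf, args=(3.0,))[0])  # ~1.0
```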


    Maximum likelihood estimators of population parameters from doubly left-censored samples

    ENVIRONMETRICS, Issue 8 2006
    Abou El-Makarim A. Aboueissa
    Abstract Left-censored data often arise in environmental contexts with one or more detection limits, DLs. Estimators of the parameters are derived for left-censored data having two detection limits, DL1 and DL2, assuming an underlying normal distribution. Two different approaches for calculating the maximum likelihood estimates (MLE) are given and examined. These methods also apply to lognormally distributed environmental data with two distinct detection limits. The performance of the new estimators is compared utilizing many simulated data sets. Examples are given illustrating the use of these methods utilizing a computer program given in the Appendix. Copyright © 2006 John Wiley & Sons, Ltd. [source]
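A minimal sketch of the likelihood being maximized: detected values contribute normal densities, while non-detects contribute the probability of falling below their detection limit. The simulated data, detection limits, and optimizer here are illustrative choices, not the estimators derived in the paper.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Simulated concentrations (lognormal data would work the same after taking logs).
true_mu, true_sigma = 5.0, 2.0
x = rng.normal(true_mu, true_sigma, 200)

dl1, dl2 = 3.0, 4.0                 # two detection limits (illustrative values)
censored1 = x[:100] < dl1           # first half of the sample censored at DL1
censored2 = x[100:] < dl2           # second half censored at DL2
observed = np.concatenate([x[:100][~censored1], x[100:][~censored2]])
n_cens = np.array([censored1.sum(), censored2.sum()])
dls = np.array([dl1, dl2])

def neg_log_lik(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)       # keep sigma positive
    ll = norm.logpdf(observed, mu, sigma).sum()              # detected values
    ll += (n_cens * norm.logcdf((dls - mu) / sigma)).sum()   # non-detects: P(X < DL)
    return -ll

fit = minimize(neg_log_lik, x0=[observed.mean(), np.log(observed.std())])
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(mu_hat, sigma_hat)            # should land near the true 5.0 and 2.0
```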


    Parametric estimation for the location parameter for symmetric distributions using moving extremes ranked set sampling with application to trees data

    ENVIRONMETRICS, Issue 7 2003
    Mohammad Fraiwan Al-Saleh
    Abstract A modification of ranked set sampling (RSS) called moving extremes ranked set sampling (MERSS) is considered parametrically, for the location parameter of symmetric distributions. A maximum likelihood estimator (MLE) and a modified MLE are considered and their properties are studied. Their efficiency with respect to the corresponding estimators based on simple random sampling (SRS) is compared for the case of the normal distribution. The method is studied under both perfect and imperfect ranking (with error in ranking). It appears that these estimators can be real competitors to the MLE based on SRS. The procedure is illustrated using tree data. Copyright © 2003 John Wiley & Sons, Ltd. [source]


    Meshing noise effect in design of experiments using computer experiments

    ENVIRONMETRICS, Issue 5-6 2002
    J. P. Caire
    Abstract This work is intended to show the influence of grid length and meshing technique on the empirical modeling of current distribution in an industrial electroplating reactor. This study confirms the interest of usual DOEs for computer experiments. Any 2D mesh generator induced, in this sensitive case, a significant noise, although it represented less than 5 per cent of the response. The 'experimental error' obeys a normal distribution, and the associated replicate SDs represent 20 per cent of the global residual standard deviation. The geometry also seems to influence the corresponding noise. If the current density uniformity could be considered a severe test, it is obvious that the noise generated by meshing would be amplified for the 3D grids that will be in common use in future years. Copyright © 2002 John Wiley & Sons, Ltd. [source]


    Predictive distributions in risk analysis and estimation for the triangular distribution

    ENVIRONMETRICS, Issue 7 2001
    Yongsung Joo
    Abstract Many Monte Carlo simulation studies have been done in the field of risk analysis. This article demonstrates the importance of using predictive distributions (the estimated distributions of the explanatory variable accounting for uncertainty in point estimation of parameters) in the simulations. We explore different types of predictive distributions for the normal distribution, the lognormal distribution and the triangular distribution. The triangular distribution poses particular problems, and we found that estimation using quantile least squares was preferable to maximum likelihood. Copyright © 2001 John Wiley & Sons, Ltd. [source]
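Quantile least squares for the triangular distribution can be sketched by matching the closed-form quantile function to the sorted sample. The plotting positions and optimizer below are my own choices, not necessarily those of the paper.

```python
import numpy as np
from scipy.optimize import minimize

def triangular_quantile(p, a, c, b):
    """Quantile function of the triangular distribution with min a, mode c, max b."""
    p = np.asarray(p)
    fc = (c - a) / (b - a)
    return np.where(p <= fc,
                    a + np.sqrt(p * (b - a) * (c - a)),
                    b - np.sqrt((1 - p) * (b - a) * (b - c)))

def fit_triangular_qls(data):
    """Quantile least squares: match theoretical quantiles to the sorted sample."""
    x = np.sort(data)
    p = (np.arange(1, len(x) + 1) - 0.5) / len(x)   # plotting positions
    def sse(theta):
        a, c, b = theta
        if not (a < c < b):
            return 1e12                             # keep the parameters ordered
        return np.sum((triangular_quantile(p, a, c, b) - x) ** 2)
    start = (x.min(), np.median(x), x.max())
    return minimize(sse, start, method="Nelder-Mead").x

rng = np.random.default_rng(7)
sample = rng.triangular(left=1.0, mode=3.0, right=8.0, size=500)
print(fit_triangular_qls(sample))   # roughly (1, 3, 8)
```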


    Probabilistic approach to voltage stability analysis with load uncertainty considered

    EUROPEAN TRANSACTIONS ON ELECTRICAL POWER, Issue 2 2009
    B. Wu
    Abstract This paper proposes a new probabilistic assessment algorithm of voltage stability based on Monte-Carlo simulation and modal analysis considering uncertainty of the load level and load parameters. By Monte-Carlo sampling, the bus load level is determined according to the forecasted bus load curve of a research period. The coefficients of the load polynomials, which include induction motors, are treated as random variables with normal distribution. A technique of normal distribution sampling is utilized to simulate these coefficients uncertainty. Voltage stability is evaluated in the form of indices such as the expected maximum loadability and the statistics of system participations, which are obtained from modal analysis near the point of collapse. A case study of the IEEE 118-node system is given to demonstrate the validity of the proposed algorithm, and the effects of load uncertainty and the proportion of motors on probabilistic assessment of voltage stability are investigated. Copyright © 2007 John Wiley & Sons, Ltd. [source]


    Vectorial summation of probabilistic current harmonics in power systems: From a bivariate distribution model towards a univariate probability function

    EUROPEAN TRANSACTIONS ON ELECTRICAL POWER, Issue 1 2000
    Y. J. Wang
    This paper extends the investigation into the bivariate normal distribution (BND) model which has been widely used to study the asymptotic behaviour of the sum of a sufficiently large number of randomly-varying harmonic phasors (of the same frequency). Although the BND model is effective and applicable to most problems involving harmonic summation, its main drawback resides in the computation time required to extract the probability density function of the harmonic magnitude from the two-dimensional BND model. This paper proposes a novel approach to the problem by assimilating the generalized Gamma distribution (GGD) model to the marginal distribution (the magnitude) of the BND using the method of moments. The proposed method can accurately estimate the parameters of the GGD model without time-consuming calculation. A power system containing ten harmonic sources is taken as an example where the comparison of the Monte-Carlo simulation, the BND model and the GGD model is given and discussed. The comparison shows that the GGD model approximates the BND model very well. [source]


    Earnings-Based Bonus Compensation

    FINANCIAL REVIEW, Issue 4 2009
    António Câmara
    JEL classification: G39; M52. Abstract This article studies the cost of contingent earnings-based bonus compensation. We assume that the firm has normal and abnormal earnings. The normal earnings result from normal firm activities and are modeled as an arithmetic Brownian motion. The abnormal earnings result from surprising activities (e.g., introduction of an unexpected new product, an unexpected strike) and are modeled as a compound Poisson process where the earnings jump sizes have a normal distribution. We investigate, in a simple general equilibrium model, how normal and abnormal earnings affect the cost of contingent bonus compensation to the firm. [source]
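A small simulation of the assumed earnings process (arithmetic Brownian motion plus compound-Poisson jumps with normally distributed jump sizes) makes the model concrete; all parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative parameters (not from the article).
mu, sigma = 2.0, 0.5             # drift and volatility of normal earnings (per year)
lam = 0.3                        # expected number of abnormal events per year
jump_mu, jump_sigma = 0.0, 1.5   # normal distribution of earnings jump sizes
T, n_steps, n_paths = 5.0, 60, 4
dt = T / n_steps

earnings = np.zeros((n_paths, n_steps + 1))
for t in range(1, n_steps + 1):
    brownian = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    n_jumps = rng.poisson(lam * dt, n_paths)   # abnormal events in this step
    # Sum of k normal jumps is again normal: mean k*jump_mu, sd sqrt(k)*jump_sigma.
    jumps = rng.normal(n_jumps * jump_mu, jump_sigma * np.sqrt(n_jumps))
    earnings[:, t] = earnings[:, t - 1] + brownian + jumps

print(earnings[:, -1])           # cumulative earnings after T years for each path
```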


    Semiparametric variance-component models for linkage and association analyses of censored trait data

    GENETIC EPIDEMIOLOGY, Issue 7 2006
    G. Diao
    Abstract Variance-component (VC) models are widely used for linkage and association mapping of quantitative trait loci in general human pedigrees. Traditional VC methods assume that the trait values within a family follow a multivariate normal distribution and are fully observed. These assumptions are violated if the trait data contain censored observations. When the trait pertains to age at onset of disease, censoring is inevitable because of loss to follow-up and limited study duration. Censoring also arises when the trait assay cannot detect values below (or above) certain thresholds. The latent trait values tend to have a complex distribution. Applying traditional VC methods to censored trait data would inflate type I error and reduce power. We present valid and powerful methods for the linkage and association analyses of censored trait data. Our methods are based on a novel class of semiparametric VC models, which allows an arbitrary distribution for the latent trait values. We construct appropriate likelihood for the observed data, which may contain left or right censored observations. The maximum likelihood estimators are approximately unbiased, normally distributed, and statistically efficient. We develop stable and efficient numerical algorithms to implement the corresponding inference procedures. Extensive simulation studies demonstrate that the proposed methods outperform the existing ones in practical situations. We provide an application to the age at onset of alcohol dependence data from the Collaborative Study on the Genetics of Alcoholism. A computer program is freely available. Genet. Epidemiol. 2006. © 2006 Wiley-Liss, Inc. [source]


    An analysis of P times reported in the Reviewed Event Bulletin for Chinese underground explosions

    GEOPHYSICAL JOURNAL INTERNATIONAL, Issue 2 2005
    A. Douglas
    SUMMARY Analysis of variance is used to estimate the measurement error and path effects in the P times reported in the Reviewed Event Bulletins (REBs, produced by the provisional International Data Center, Arlington, USA) and in times we have read, for explosions at the Chinese Test Site. Path effects are those differences between traveltimes calculated from tables and the true times that result in epicentre error. The main conclusions of the study are: (1) the estimated variance of the measurement error for P times reported in the REB at large signal-to-noise ratio (SNR) is 0.04 s2, the bulk of the readings being analyst-adjusted automatic-detections, whereas for our times the variance is 0.01 s2, and (2) the standard deviation of the path effects for both sets of observations is about 0.6 s. The study shows that measurement error is about twice (~0.2 s rather than ~0.1 s) and path effects about half the values assumed for the REB times. However, uncertainties in the estimated epicentres are poorly described by treating path effects as a random variable with a normal distribution. Only by estimating path effects and using these to correct onset times can reliable estimates of epicentre uncertainty be obtained. There is currently an international programme to do just this. The results imply that with P times from explosions at three or four stations with good SNR (so that the measurement error is around 0.1 s) and well distributed in azimuth, then with correction for path effects the area of the 90 per cent coverage ellipse should be much less than 1000 km2 (the area allowed for an on-site inspection under the Comprehensive Test Ban Treaty) and should cover the true epicentre with the given probability. [source]


    Multiple pathology and tails of disability: Space-time structure of disability in longevity

    GERIATRICS & GERONTOLOGY INTERNATIONAL, Issue 4 2003
    Satoru Matsushita
    Disability and the resulting lowered quality of life are serious issues accompanying increased longevity. Curiously, despite its potential contribution to aging theory, complete statistical and etiological structures of this common and unwelcome aging phenotype before death have not been well identified. Another neglected issue in aging and disability is the principles of phylogenesis and morphogenesis, which contemporary life science invariably starts with. In the present review these two related subjects are addressed, with an introduction of an analysis on patients and published data. Statistically rigorous log-normal and normal distributions distinguish disability for its duration and age-wise distribution, respectively. Multiple pathology and diverse effects of various endogenous diseases on disability are confirmed. The robust long-tailed log-normal distribution for various phases of disability validates the fact that patients in disability undergo series of stochastic subprocesses of many independent endogenous diseases until death. For 60% of patients, the log-normal distribution is mimicked by a random walk model. Diseases of core organs are major causes of the long tails. A declining force of natural selection after reproduction and trade-off of life history through pleiotropy of the genes are considered to be the roots of aging. The attenuated selection pressure and the resulting decrease of genetic constraints produce an increased opportunity for chance and stochastics. Elucidated stochastic behaviors of disability underscore the key role of chance in aging. Evolutionary modifications in the development of the structure tend to favor developmentally later stages first. Distal parts are developmentally last, therefore most subject to modification. The rate of molecular evolution of the genes is also found to be relatively slow at the core and rapid at the edge of cells and organs. Therefore, systems at the core must be relatively slow and inactive to comply with pleiotropy and trade-offs in comparison with systems at the edge. Hence, against flat and probabilistic aging, the core organs must be moulded to be more robust with a lower threshold for dysfunction, to age relatively slowly, and should have less of a disease quota in aging. The principle of core protective aging assures possibilities not only to reduce disability but also to accomplish the Third Age as well. Finally, it must also be acknowledged that the principle is a double-edged sword. Paradoxically, the developed biological and societal organization provides protection for the injured core, and so develops long tails of disability. The principle of core protective aging re-emphasizes the key role of prevention in order to reduce the amount of disability. [source]


    Uncertainties in interpretation of isotope signals for estimation of fine root longevity: theoretical considerations

    GLOBAL CHANGE BIOLOGY, Issue 7 2003
    Yiqi Luo
    Abstract This paper examines uncertainties in the interpretation of isotope signals when estimating fine root longevity, particularly in forests. The isotope signals are depleted δ13C values from elevated CO2 experiments and enriched δ14C values from bomb 14C in atmospheric CO2. For the CO2 experiments, I explored the effects of six root mortality patterns (on-off, proportional, constant, normal, left skew, and right skew distributions), five levels of nonstructural carbohydrate (NSC) reserves, and increased root growth on root δ13C values after CO2 fumigation. My analysis indicates that fitting a linear equation to δ13C data provides unbiased estimates of longevity only if root mortality follows an on-off model, without dilution of isotope signals by pretreatment NSC reserves, and under a steady state between growth and death. If root mortality follows the other patterns, the linear extrapolation considerably overestimates root longevity. In contrast, fitting an exponential equation to δ13C data underestimates longevity with all the mortality patterns except the proportional one. With either linear or exponential extrapolation, dilution of isotope signals by pretreatment NSC reserves could result in overestimation of root longevity by several-fold. Root longevity is underestimated if elevated CO2 stimulates fine root growth. For the bomb 14C approach, I examined the effects of four mortality patterns (on-off, proportional, constant, and normal distribution) on root δ14C values. For a given δ14C value, the proportional pattern usually provides a shorter estimate of root longevity than the other patterns. Overall, we have to improve our understanding of root growth and mortality patterns and to measure NSC reserves in order to reduce uncertainties in estimated fine root longevity from isotope data. [source]


    A study on un-fully developed slug flow in a vertical tube

    HEAT TRANSFER - ASIAN RESEARCH (FORMERLY HEAT TRANSFER-JAPANESE RESEARCH), Issue 4 2005
    Zhijia Yu
    Abstract Gas-liquid co-current vertical slug flow was studied in a vertical Plexiglas tube. Taylor bubbles and liquid slug lengths and their rising velocities were measured by means of a pair of conductivity probes under un-fully developed flow conditions. The influences of the superficial velocity of gas and liquid on slug flow parameters were examined. Using statistical analysis on the length of Taylor bubbles, the probability distribution of the length of the Taylor bubbles was obtained, which obeyed a normal distribution at a significance level of α = 0.05. © 2005 Wiley Periodicals, Inc. Heat Trans Asian Res, 34(4): 235-242, 2005; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/htj.20063 [source]
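Testing length data for normality at α = 0.05 is a one-liner with scipy. The abstract does not name the specific test used, so the Shapiro-Wilk test and the simulated bubble lengths below are stand-ins.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Stand-in data: measured Taylor-bubble lengths (illustrative units and values).
lengths = rng.normal(loc=8.0, scale=1.5, size=120)

alpha = 0.05
stat, p_value = stats.shapiro(lengths)   # one common normality test
print(f"p = {p_value:.3f}")
if p_value > alpha:
    print("no evidence against normality at the 5% significance level")
else:
    print("reject normality at the 5% significance level")
```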


    Temporal and spatial variation of annual rainfall on the island of Crete, Greece

    HYDROLOGICAL PROCESSES, Issue 10 2003
    S. Naoum
    Abstract Annual rainfall records from the island of Crete in Greece were used with the aid of a geographical information system (GIS) to study the temporal and spatial rainfall characteristics. The GIS was used to produce a digital elevation model, delineate watersheds and estimate the areal rainfall from a network of raingauges by using different interpolation schemes. The rainfall-elevation correlation was significant, suggesting an orographic type of precipitation for the island. The rainfall records for the majority of the stations were found to fit the normal distribution. Deviation from normal for the rest of the records was attributed to the wettest year of 1977-1978. The year 1989-1990 was the driest, and most rainfall records showed a decrease in rainfall over 30 years with higher negative rainfall gradients at the higher elevations. Frequency analysis of the rainfall records was used to estimate areal rainfall for the island of Crete and its main watersheds for return periods of 2, 5 and 10 years. Copyright © 2003 John Wiley & Sons, Ltd. [source]
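Frequency analysis of annual rainfall under a fitted normal distribution reduces to reading off quantiles at the non-exceedance probability implied by each return period; the fitted mean and SD below are placeholders, not Crete values, and the wet-year convention is my assumption.

```python
from scipy.stats import norm

# Illustrative fitted normal for annual areal rainfall over a watershed (mm).
mean_rain, sd_rain = 900.0, 180.0

for T in (2, 5, 10):
    p_wet = 1 - 1 / T   # non-exceedance probability of the T-year wet event
    print(f"{T}-year rainfall: {norm.ppf(p_wet, mean_rain, sd_rain):.0f} mm")
    # for dry-year analysis the quantile at p = 1/T would be used instead
```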


    An argument-dependent approach to determining OWA operator weights based on the rule of maximum entropy

    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 2 2007
    Jian Wu
    The methods for determining OWA operator weights have aroused wide attention. We first review the main existing methods for determining OWA operator weights. We next introduce the principle of maximum entropy for setting up probability distributions on the basis of partial knowledge and prove that Xu's normal distribution-based method obeys the principle of maximum entropy. Finally, we propose an argument-dependent approach based on normal distribution, which assigns very low weights to these "false" or "biased" opinions and can relieve the influence of the unfair arguments. A numerical example is provided to illustrate the application of the proposed approach. © 2007 Wiley Periodicals, Inc. Int J Int Syst 22: 209-221, 2007. [source]


    An overview of methods for determining OWA weights

    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 8 2005
    Zeshui Xu
    The ordered weighted aggregation (OWA) operator has received more and more attention since its appearance. One key point in the OWA operator is to determine its associated weights. In this article, I first briefly review existing main methods for determining the weights associated with the OWA operator, and then, motivated by the idea of normal distribution, I develop a novel practical method for obtaining the OWA weights, which is distinctly different from the existing ones. The method can relieve the influence of unfair arguments on the decision results by weighting these arguments with small values. Some of its desirable properties have also been investigated. © 2005 Wiley Periodicals, Inc. Int J Int Syst 20: 843-865, 2005. [source]
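One common reading of the normal distribution-based idea is to weight the i-th ordered argument by a Gaussian centred on the middle position, so extreme (possibly unfair) arguments receive small weights. The sketch below follows that reading; the exact normalization in Xu's paper may differ.

```python
import numpy as np

def normal_owa_weights(n):
    """OWA weights from a Gaussian centred on the middle ordered position.

    A sketch of the normal distribution-based idea: positions near the middle of the
    ordered argument list get large weights, extreme positions get small weights.
    """
    i = np.arange(1, n + 1)
    mu = (n + 1) / 2.0                       # middle position
    sigma = np.sqrt(np.mean((i - mu) ** 2))  # spread of the positions
    w = np.exp(-((i - mu) ** 2) / (2 * sigma ** 2))
    return w / w.sum()

print(normal_owa_weights(5).round(4))   # symmetric, peaked at the 3rd position, sums to 1
```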


    Distributional Shapes and Validity Transport: A comparison of lower bounds

    INTERNATIONAL JOURNAL OF SELECTION AND ASSESSMENT, Issue 1 2008
    Jennifer L. Kisamore
    Organizations use validity transport to assess whether adoption of a particular selection measure may be of value. Current validity transport methodology assumes a normal distribution of validity parameters. Research over the past two decades has questioned this assumption, investigating the types of decision errors likely with various non-normal parameter distributions. The current paper demonstrates that the variance of the parameter distribution and the choice of lower bound have a greater impact on the true lower bound of the parameter distribution than does its shape. The current practice of using 80% credibility intervals provides a reasonable compromise in terms of invariance to parameter distribution shape while also limiting the probability of erroneously concluding validity transport is reasonable in a specific case. [source]
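Under the normality assumption discussed here, the lower bound of an 80% credibility interval is simply the 10th percentile of the parameter distribution; the mean and SD below are hypothetical.

```python
from scipy.stats import norm

# Hypothetical meta-analytic estimates for a selection measure (illustrative values only).
mean_rho, sd_rho = 0.30, 0.10   # mean and SD of the validity parameter distribution

lower_80 = mean_rho + norm.ppf(0.10) * sd_rho   # 10th percentile = lower end of an 80% credibility interval
print(round(lower_80, 3))                        # 0.172: transport looks reasonable if this stays above zero
```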


    The Early History of the Cumulants and the Gram-Charlier Series

    INTERNATIONAL STATISTICAL REVIEW, Issue 2 2000
    Anders Hald
    Summary The early history of the Gram-Charlier series is discussed from three points of view: (1) a generalization of Laplace's central limit theorem, (2) a least squares approximation to a continuous function by means of Chebyshev-Hermite polynomials, (3) a generalization of Gauss's normal distribution to a system of skew distributions. Thiele defined the cumulants in terms of the moments, first by a recursion formula and later by an expansion of the logarithm of the moment generating function. He devised a differential operator which adjusts any cumulant to a desired value. His little known 1899 paper in Danish on the properties of the cumulants is translated into English in the Appendix. [source]
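Thiele's recursion between raw moments and cumulants is short enough to state in code; the sketch below recovers the cumulants of a normal distribution (all cumulants beyond the second are zero) from its raw moments.

```python
from math import comb

def cumulants_from_moments(m):
    """Raw moments m[1..n] -> cumulants k[1..n] via Thiele's recursion
    m_n = sum_{j=1}^{n} C(n-1, j-1) * k_j * m_{n-j}, with m_0 = 1."""
    n = len(m) - 1
    m = [1.0] + list(m[1:])          # enforce m_0 = 1
    k = [0.0] * (n + 1)
    for order in range(1, n + 1):
        k[order] = m[order] - sum(comb(order - 1, j - 1) * k[j] * m[order - j]
                                  for j in range(1, order))
    return k

# Raw moments of N(mu, sigma^2) with mu = 2, sigma^2 = 9; cumulants should be 2, 9, 0, 0.
mu, var = 2.0, 9.0
moments = [1.0, mu, var + mu**2, mu**3 + 3*mu*var, mu**4 + 6*mu**2*var + 3*var**2]
print(cumulants_from_moments(moments))   # [0.0, 2.0, 9.0, 0.0, 0.0]
```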


    Persistent sensory dysfunction in pain-free herniotomy

    ACTA ANAESTHESIOLOGICA SCANDINAVICA, Issue 3 2010
    E. K. AASVANG
    Background: Persistent post-herniotomy pain may be a neuropathic pain state based on the finding of a persistent sensory dysfunction. However, the lack of detailed information on the normal distribution of sensory function in pain-free post-herniotomy patients hinders identification of exact pathogenic mechanisms. Therefore, we aimed to establish normative data on sensory function in pain-free patients >1 year after a groin herniotomy. Methods: Sensory thresholds were assessed in 40 pain-free patients by standardized quantitative sensory testing (QST). Secondary endpoints included comparison of sensory function between the operated and the naïve side, and correlation between sensory function modalities. Results: QST showed that on the operated side, thermal data were normally distributed, but mechanical pressure and pinch thresholds were normalized only after log-transformation, and cold pain and pressure tolerance could not be normalized. Comparison of QST results revealed significant (P<0.01) cutaneous hypoesthesia/hyperalgesia, but also significant pressure hyperalgesia (P<0.01) and decreased pressure tolerance (P=0.02) on the operated vs. the naïve side. Wind-up was seen in 6 patients (15%) but with a low pain intensity. Conclusion: Persistent sensory dysfunction is common in pain-free post-herniotomy patients. Future studies of sensory function in persistent post-herniotomy pain should compare the findings to the present data in order to characterize individual patients and potentially identify subgroups, which may aid in allocation of patients to pharmacological or surgical treatment. [source]


    Marginal maximum likelihood estimation of item response theory (IRT) equating coefficients for the common-examinee design

    JAPANESE PSYCHOLOGICAL RESEARCH, Issue 2 2001
    Haruhiko Ogasawara
    A method of estimating item response theory (IRT) equating coefficients by the common-examinee design with the assumption of the two-parameter logistic model is provided. The method uses the marginal maximum likelihood estimation, in which individual ability parameters in a common-examinee group are numerically integrated out. The abilities of the common examinees are assumed to follow a normal distribution but with an unknown mean and standard deviation on one of the two tests to be equated. The distribution parameters are jointly estimated with the equating coefficients. Further, the asymptotic standard errors of the estimates of the equating coefficients and the parameters for the ability distribution are given. Numerical examples are provided to show the accuracy of the method. [source]
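The core of marginal maximum likelihood is integrating the individual abilities out of the likelihood, typically with Gauss-Hermite quadrature. The sketch below shows that marginalization step for the two-parameter logistic model with made-up item parameters; in the common-examinee design, the equating coefficients and the unknown ability mean and SD would be estimated by maximizing such a likelihood.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

def two_pl_prob(theta, a, b):
    """2PL item response function P(correct | theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def marginal_log_likelihood(responses, a, b, mu=0.0, sigma=1.0, n_nodes=21):
    """Log-likelihood of 0/1 response patterns with ability integrated out numerically.

    Ability is assumed N(mu, sigma); Gauss-Hermite quadrature replaces the integral."""
    nodes, weights = hermgauss(n_nodes)
    theta = mu + np.sqrt(2.0) * sigma * nodes      # change of variable to N(mu, sigma)
    weights = weights / np.sqrt(np.pi)
    p = two_pl_prob(theta[:, None], a, b)          # nodes x items
    ll = 0.0
    for resp in responses:                         # one response vector per examinee
        like_at_nodes = np.prod(np.where(resp, p, 1 - p), axis=1)
        ll += np.log(np.sum(weights * like_at_nodes))
    return ll

# Tiny illustration: 3 items with assumed discriminations/difficulties, 2 examinees.
a = np.array([1.0, 1.5, 0.8]); b = np.array([-0.5, 0.0, 1.0])
responses = np.array([[1, 1, 0], [0, 1, 0]], dtype=bool)
print(marginal_log_likelihood(responses, a, b))
```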


    Empowering surgical nurses improves compliance rates for antibiotic prophylaxis after caesarean birth

    JOURNAL OF ADVANCED NURSING, Issue 11 2009
    Zvi Shimoni
    Abstract Title: Empowering surgical nurses improves compliance rates for antibiotic prophylaxis after caesarean birth. Aim: This paper is a report of a study of the effect of empowering surgical nurses to ensure that patients receive antibiotic prophylaxis after caesarean birth. Background: Despite the consensus that single-dose antibiotic prophylaxis is beneficial for women having either elective or non-elective caesarean delivery, hospitals need methods to increase compliance rates. Method: In a study in Israel in 2007, surgical nurses were empowered to ensure that a single dose of cefazolin was given to the mother after cord clamping. A computerized system was used to identify women having caesarean births, cultures sent and culture results. Compliance was determined by chart review. Rates of compliance, suspected wound infections, and confirmed wound infections in 2007 were compared to rates in 2006 before the policy change. Relative risks were calculated by dividing the 2007 rates by those in 2006, and 95% confidence intervals were calculated using Taylor's series, which does not assume a normal distribution. Statistical significance was assessed using the chi-square test. Findings: The compliance rate was increased from 25% in 2006 to 100% in 2007 (chi-square test, P < 0·001). Suspected wound infection rates decreased from 16·8% (186/1104) to 12·6% (137/1089) after the intervention (relative risk 0·75, 95% confidence interval 0·61-0·92). Conclusion: Surgical nurses can ensure universal compliance for antibiotic prophylaxis in women after caesarean birth, leading to a reduction in wound infections. [source]
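The reported relative risk and confidence interval can be reproduced from the counts in the abstract with the standard Taylor-series (log relative risk) formula, which is presumably the method referred to:

```python
import numpy as np

# Suspected wound infections before and after the intervention (counts from the abstract).
a, n_2007 = 137, 1089   # 2007: events / caesarean births
b, n_2006 = 186, 1104   # 2006: events / caesarean births

rr = (a / n_2007) / (b / n_2006)
se_log_rr = np.sqrt(1/a - 1/n_2007 + 1/b - 1/n_2006)   # Taylor-series (delta-method) SE of ln(RR)
lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")        # RR = 0.75, 95% CI 0.61-0.92
```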