Results 1–10 of 83
Nonparametric function estimation for clustered data when the predictor is measured without/with error
Journal of the American Statistical Association, 2000
Cited by 70 (16 self)
We consider local polynomial kernel regression with a single covariate for clustered data using estimating equations. We assume that at most m < ∞ observations are available on each cluster. In the case of random regressors, with no measurement error in the predictor, we show that it is generally the best strategy to ignore entirely the correlation structure within each cluster, and instead to pretend that all observations are independent. In the further special case of longitudinal data on individuals with fixed common observation times, we show that the pooled data approach is equivalent to fitting separate nonparametric regressions at each observation time and constructing an optimal weighted average. We also consider what happens when the predictor is measured with error. Using the SIMEX approach to correct for measurement error, we construct an asymptotic theory for both the pooled and weighted average estimators. Surprisingly, for the same amount of smoothing, the weighted average estimators typically have smaller variances than the pooling strategy. We apply the proposed methods to the analysis of the AIDS Costs and Services Utilization Survey.
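The pooled strategy this abstract argues for amounts to running an ordinary kernel smoother on all observations at once, ignoring cluster membership ("working independence"). As a rough illustration only (a generic local linear smoother, not the authors' estimating-equation machinery), a minimal Python sketch applied to pooled clustered data:

```python
import numpy as np

def local_linear(x, y, x0, h):
    """Local linear estimate of E[Y | X = x0] with a Gaussian kernel and
    bandwidth h.  All observations are pooled; any within-cluster
    correlation is deliberately ignored (working independence)."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)           # kernel weights
    sw = np.sqrt(w)
    X = np.column_stack([np.ones_like(x), x - x0])   # local design: intercept, slope
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta[0]                                   # intercept = fitted value at x0

# Clustered data: 50 clusters of size 4 sharing a random effect.
rng = np.random.default_rng(0)
b = np.repeat(rng.normal(0.0, 0.2, 50), 4)           # cluster random effects
x = rng.uniform(0.0, 2.0, 200)
y = x ** 2 + b + rng.normal(0.0, 0.1, 200)
print(local_linear(x, y, 1.0, 0.2))                  # roughly 1.0 = E[Y | X = 1]
```

One reason local linear fitting is a common default here: it reproduces an exactly linear trend with zero bias, whatever the kernel weights.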
Bayesian Smoothing and Regression Splines for Measurement Error Problems
Journal of the American Statistical Association, 2001
Cited by 41 (7 self)
In the presence of covariate measurement error, estimating a regression function nonparametrically is extremely difficult, the problem being related to deconvolution. Various frequentist approaches exist for this problem, but to date there has been no Bayesian treatment. In this paper we describe Bayesian approaches to modeling a flexible regression function when the predictor variable is measured with error. The regression function is modeled with smoothing splines and regression P-splines. Two methods are described for exploration of the posterior. The first is called iterative conditional modes (ICM) and is only partially Bayesian. ICM uses a componentwise maximization routine to find the mode of the posterior. It also serves to create starting values for the second method, which is fully Bayesian and uses Markov chain Monte Carlo techniques to generate observations from the joint posterior distribution. Using the MCMC approach has the advantage that interval estimates that directly model and adjust for the measurement error are easily calculated. We provide simulations with several nonlinear regression functions and provide an illustrative example. Our simulations indicate that the frequentist mean squared error properties of the fully Bayesian method are better than those of ICM and also of previously proposed frequentist methods, at least in the examples we have studied.
KEY WORDS: Bayesian methods; Efficiency; Errors in variables; Functional method; Generalized linear models; Kernel regression; Measurement error; Nonparametric regression; P-splines; Regression splines; SIMEX; Smoothing splines; Structural modeling.
Short title: Nonparametric Regression with Measurement Error.
Author affiliations: Scott M. Berry (Email: scott@berryconsultants.com) is Statistical Scientist,...
Nonparametric Regression in the Presence of Measurement Error
Biometrika, 1999
Cited by 35 (5 self)
This paper develops the two ideas of approximately consistent and regression spline estimation in the presence of measurement error. In
Extracting Systematic Social Science Meaning from Text
American Journal of Political Science, 2007
Fitting population dynamic models to time-series data by gradient matching
Ecology, 2002
Asymptotics for the SIMEX Estimator in Structural Measurement Error Models
1994
Cited by 19 (7 self)
Cook & Stefanski (1994) describe a computer-intensive method, the SIMEX method, for approximately consistent estimation in regression problems with additive measurement error. In this paper, we derive the asymptotic distribution of their estimators and show how to compute estimated standard errors. These standard error estimators can either be used alone or as prepivoting devices in a bootstrap analysis. We also give theoretical justification to some of the phenomena observed by Cook & Stefanski in their simulations.
Key words: Asymptotics; Bootstrap; Computationally intensive methods; Measurement error models.
Authors' affiliations: R. J. Carroll is Professor of Statistics at Texas A&M University, College Station, TX 77843-3143. F. Lombard is Professor of Statistics, Rand Afrikaans University, P.O. Box 524, Johannesburg 2000, South Africa. H. Küchenhoff is Wissenschaftlicher Assistent, Seminar für Ökonometrie und Statistik, Universität München, Akademiestrasse 1, D-80799 Münch...
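For readers unfamiliar with SIMEX: Cook & Stefanski's idea is to *add* extra simulated measurement error at several multiples λ of the known error variance, refit the naive estimator at each λ, and extrapolate the fitted trend back to λ = −1, the error-free case. A minimal sketch for a linear-regression slope (a toy version for intuition, not the estimators whose asymptotics this paper derives):

```python
import numpy as np

def simex_slope(w, y, sigma_u, lambdas=(0.5, 1.0, 1.5, 2.0), n_sim=200, seed=0):
    """SIMEX sketch for the slope of y on w = x + u, u ~ N(0, sigma_u^2),
    with sigma_u known.  For each lambda, extra noise of variance
    lambda * sigma_u^2 is added to w, the naive slope is re-estimated
    (averaged over n_sim simulations), and a quadratic in lambda is
    extrapolated back to lambda = -1 (the error-free case)."""
    rng = np.random.default_rng(seed)
    lams = [0.0, *lambdas]
    slopes = [np.polyfit(w, y, 1)[0]]                  # naive fit at lambda = 0
    for lam in lambdas:
        sims = [np.polyfit(w + rng.normal(0.0, np.sqrt(lam) * sigma_u, len(w)),
                           y, 1)[0]
                for _ in range(n_sim)]
        slopes.append(np.mean(sims))
    quad = np.polyfit(lams, slopes, 2)                 # quadratic extrapolant
    return np.polyval(quad, -1.0)

# Attenuation demo: true slope 2; measurement error biases the naive fit down.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 2000)
w = x + rng.normal(0.0, 0.5, 2000)                     # reliability 1/1.25 = 0.8
y = 2.0 * x + rng.normal(0.0, 0.5, 2000)
print(np.polyfit(w, y, 1)[0])                          # naive: about 1.6
print(simex_slope(w, y, 0.5))                          # SIMEX: close to 2
```

The quadratic extrapolant is only an approximation to the true attenuation curve, which is one source of the residual bias that an asymptotic theory such as this paper's has to account for.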
A Design-Adaptive Local Polynomial Estimator for the Errors-in-Variables Problem
Cited by 19 (1 self)
Local polynomial estimators are popular techniques for nonparametric regression estimation and have received great attention in the literature. Their simplest version, the local constant estimator, can be easily extended to the errors-in-variables context by exploiting its similarity with the deconvolution kernel density estimator. The generalization of the higher-order versions of the estimator, however, is not straightforward and has remained an open problem for the last 15 years, since the publication of Fan and Truong (1993). We propose an innovative local polynomial estimator of any order in the errors-in-variables context, derive its design-adaptive asymptotic properties and study its finite-sample performance on simulated examples. We not only provide a solution to a long-standing open problem, but also make methodological contributions to errors-in-variables regression, including local polynomial estimation of derivative functions.
Testing Regions with Nonsmooth Boundaries via Multiscale Bootstrap
2006
Cited by 18 (6 self)
Consider a binary response function of data, for example, whether hierarchical clustering produces a particular dendrogram of interest. An arbitrary-shaped region of the parameter space may represent the null hypothesis defined by the binary response to the population. The bootstrap probability is a widely used p-value, and its calibration has been attempted in the literature; the test bias is estimated as curvature of smooth boundary surfaces of the region. However, boundaries are nonsmooth for regions of practical importance such as cones. To treat such singularities, the Fourier transforms of surfaces are employed in this paper. Computation requires only the binary responses to bootstrap samples of size n′ generated from data of size n. Our method first computes bootstrap probabilities for several values of n′ around n, and then extrapolates them, after some transformation, back to n′ = −n. This gives corrected p-values related to the bootstrap iteration.
Segmented Regression with Errors in Predictors: Semiparametric and Parametric Methods
1995
Cited by 17 (4 self)
In this paper, we discuss methods for estimation in the threshold segmented regression model where X is unobservable but is instead subject to additive measurement error, i.e. W = X +
Nonlinear and Nonparametric Regression and Instrumental Variables
Journal of the American Statistical Association, 2004
Cited by 16 (5 self)
We consider regression when the predictor is measured with error and an instrumental variable (IV) is available. The regression function can be modeled linearly, nonlinearly, or nonparametrically. Our major new result shows that the regression function and all parameters in the measurement error model are identified under relatively weak conditions, much weaker than those previously known to imply identifiability. In addition, we exploit a characterization of the IV estimator as a classical “correction for attenuation” method based on a particular estimate of the variance of the measurement error. This estimate of the measurement error variance allows us to construct functional nonparametric regression estimators making no assumptions about the distribution of the unobserved predictor, and structural estimators that use parametric assumptions about this distribution. The functional estimators use simulation extrapolation or deconvolution kernels, and the structural method uses Bayesian Markov chain Monte Carlo. The Bayesian estimator is found to significantly outperform the functional approach.
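The “correction for attenuation” characterization can be illustrated in the simplest linear case. With a second error-prone measurement T serving as the instrument, cov(W, T) estimates var(X), which yields both an implied measurement-error variance and a corrected slope. The sketch below is this textbook special case under that replicate-instrument assumption, not the paper's nonparametric machinery:

```python
import numpy as np

def iv_attenuation(w, y, t):
    """Classical correction for attenuation driven by an IV-based estimate of
    the measurement-error variance.  w = x + u is the error-prone predictor;
    t is an instrument correlated with x but not with u or the equation error."""
    var_w = np.var(w, ddof=1)
    cov_wt = np.cov(w, t)[0, 1]                         # estimates var(x)
    sigma_u2 = var_w - cov_wt                           # implied error variance
    beta_naive = np.cov(w, y)[0, 1] / var_w             # attenuated OLS slope
    beta_iv = beta_naive * var_w / (var_w - sigma_u2)   # = cov(w, y) / cov(w, t)
    return beta_iv, sigma_u2

# Demo: true slope 2; naive regression of y on w is attenuated toward zero.
rng = np.random.default_rng(5)
x = rng.normal(0.0, 1.0, 5000)
w = x + rng.normal(0.0, 0.6, 5000)          # measurement error, variance 0.36
t = x + rng.normal(0.0, 0.6, 5000)          # replicate measurement as instrument
y = 2.0 * x + rng.normal(0.0, 0.5, 5000)
beta_iv, sigma_u2 = iv_attenuation(w, y, t)
print(beta_iv, sigma_u2)                    # near 2.0 and 0.36
```

Dividing the naive slope by the estimated reliability cov(W, T)/var(W) is algebraically the IV estimator, which is the characterization the abstract refers to.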