Basic Exam Syllabus
From NCSU Statistics Graduate Handbook
This exam covers ST 511-512, ST 521-522, and ST 552. It is required for the Master of Statistics degree (Master's level pass required), the Doctor of Philosophy degree (PhD level pass required) and for the Doctor of Philosophy-Co-Major degree (PhD level pass required).
ST 511-512: Statistical Methods
Note that ST 511 is a prerequisite for the Master's program.
Representative Texts
- Rao, P.V. Statistical Research Methods in the Life Sciences, Brooks/Cole.
- Ott, R. Lyman and Longnecker, Michael T. Introduction to Statistical Methods and Data Analysis.
- Damon, Jr., Richard A. and Harvey, Walter R. Experimental Design, ANOVA, and Regression, Harper and Row, Publishers, 1987. 508 pp.
- Neter, John, Wasserman, William, and Kutner, Michael H. Applied Linear Statistical Models, 3rd Ed., Richard D. Irwin, Inc., 1990. 1182 pp.
- Ostle, Bernard and Mensing, Richard W. Statistics in Research, 3rd Ed., Iowa State University Press.
- Snedecor, George W. and Cochran, William G. Statistical Methods, 7th Ed., Iowa State University Press.
- Steel, Robert G.D. and Torrie, James H. Principles and Procedures of Statistics: A Biometrical Approach, 2nd Ed., McGraw-Hill.
- SAS Institute Inc., SAS/STAT User’s Guide, Release 6.03 Ed., Cary, NC: SAS Institute Inc., 1988. 1028 pp.
- SAS Institute Inc., SAS Language: Reference, Version 6, First Ed., Cary, NC: SAS Institute Inc., 1990. 1042 pp.
Topics to Review
- Definition and computation of elementary descriptive statistics
- Populations and samples
- Sampling distributions and the Central Limit Theorem
- Use of the Z, t, Chi Square, and F tables
- Logical basis of confidence and tolerance intervals and tests of hypotheses
- Inference on mean, variance and proportion of one population
- Inference on means and proportions from two populations – independent and paired samples
- Inference on variances from two populations
- Power and sample size calculations
- Basic concepts of experimental design including the concept of experimental unit, experimental error, replication, relative efficiency, blocking, covariance, and randomization.
- For each of the following experimental designs:
- Completely Randomized Design (CRD)
- Randomized Complete Block Design (RCBD)
- Split Plot and Repeated Measures designs
- Latin Square design
- Know the models (including alternate parameterizations) and related distributional assumptions.
- Be able to recognize which design is appropriate for a given experiment and understand the advantages and disadvantages of each design.
- Know how to construct the AOV table with degrees of freedom, sums of squares and mean squares.
- Know what hypotheses can be tested and how to test them, including use of expected mean squares to find appropriate denominator.
- Know how to estimate and place confidence intervals on meaningful linear combinations of the fixed effects such as treatment contrasts, treatment means and orthogonal polynomials.
- Know how to estimate the variance components in the model and how to use them to obtain variances for linear combinations of (estimated) fixed effects (and understand the correlation structure as a function of these variance components.)
- Be able to recognize replication and subsampling, and account for them in the model, AOV table and analysis.
- Know how to make multiple comparisons of treatment means using the Tukey (T) and Fisher Least Significant Difference (LSD) procedures and of contrasts using the Scheffe (S) and Bonferroni procedures.
- Know how to account for one or more covariates.
- Know how to use SAS to carry out the above analyses (see the sketch following this list).
- Basic concepts of treatment designs including:
- Treatments and treatment combinations
- Control versus experimental treatments
- Factorial treatment designs
- Factors and their levels
- Main effects and interactions (1st order, 2nd order, etc.)
- Nested designs
- Nested-Factorial designs
- For balanced data, know how to partition the treatment sum of squares in the AOV table for each of the treatment designs under fixed, random, and mixed models, and give the expected mean squares.
- Multiple regression using matrix notation, including
- Model and assumptions
- Normal equations and parameter estimators
- Properties of the estimators
- Inference in multiple regression, including comparison of full and reduced (subset) models
- Lack of fit and pure error
- Residuals, residual plots and standard regression diagnostics
- Correlation and Fisher’s Z transformation
- Analysis of Covariance
- Analysis of categorical data:
- Chi-square goodness-of-fit tests
- Homogeneity of multinomial populations
- Independence of two categorical random variables
- Goodness of fit for samples from normal, Poisson, and other distributions, with or without estimated parameters
- Nonindependent samples in 2x2 tables (McNemar test)
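The sketch below shows one plausible way the RCBD and categorical-data analyses named above are often coded in SAS; it is illustrative only, not the analysis prescribed by the exam. The dataset names (yield, survey) and variable names (block, trt, y, group, response) are hypothetical, and the contrast coefficients assume a treatment factor with four levels.

  /* Hypothetical RCBD: block and treatment effects, Tukey-adjusted
     pairwise comparisons of treatment means, and a single-df contrast. */
  proc glm data=yield;
    class block trt;
    model y = block trt;
    lsmeans trt / pdiff adjust=tukey cl;           /* Tukey multiple comparisons with CIs */
    contrast 'control vs others' trt 3 -1 -1 -1;   /* assumes trt has 4 levels */
    estimate 'control vs others' trt 3 -1 -1 -1 / divisor=3;
  run;

  /* Hypothetical chi-square test of independence for two categorical variables. */
  proc freq data=survey;
    tables group*response / chisq;
  run;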
ST 521-522: Statistical Theory
Representative Texts
- Casella, G. and Berger, R.L. Statistical Inference, 2nd Ed., Wadsworth/Brooks Cole, Pacific Grove, CA, 2001.
- Hogg, R.V., and Craig, A.T. Introduction to Mathematical Statistics, 4th Ed., MacMillan.
- Rohatgi, V.K. An Introduction to Probability Theory and Mathematical Statistics, John Wiley & Sons, New York, 1976.
Topics
- Basic probability calculus
- Random variables, probability distributions, density functions and distribution functions
- Discrete probability models, e.g., binomial, Poisson, geometric, negative binomial, hypergeometric
- Continuous probability models, e.g., uniform, exponential, beta, gamma, normal, Weibull, Cauchy, extreme value, log-normal
- Multivariate probability models: multinomial, bivariate normal
- Expected value, variance, covariance, correlation, moments (about zero and about the mean) and moment-generating functions
- Moments of functions of random variables
- Conditional distributions and expectations
- Distributions of functions of random variables, order statistics
- Chebyshev’s, Markov’s and Jensen’s inequalities
- Normal theory: joint distribution of the sample mean and variance, central and noncentral distributions for Student t, Chi square and F
- Convergence in probability and the weak law of large numbers
- Convergence in distribution, the central limit theorem, asymptotic normality, Slutsky’s theorem and the “delta method” (a worked example follows this list)
- Bayesian inference: prior and posterior probability distributions, Bayes estimators, loss functions, mean squared error and minimax approach; hierarchical models
- Point estimators and confidence intervals
- Properties of estimators
- Method of moments, maximum likelihood and minimum chi-square
- Consistency
- Cramer-Rao lower bound
- Exponential families, sufficient statistics, completeness and Basu’s theorem
- Unbiased estimation, UMVU estimation and the Rao-Blackwell theorem
- Confidence interval construction by inversion of hypothesis tests
- Logical basis for and properties of hypothesis tests
- Type I and II error, level of significance and power
- Simple and composite hypotheses
- Unbiased tests
- Neyman-Pearson lemma and UMP and UMPU tests
- Likelihood ratio test
- Chi-square tests and contingency tables
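As a small worked illustration of the central limit theorem and delta method items above (standard material, not itself part of the syllabus), suppose X_1, ..., X_n are i.i.d. Poisson(lambda) and interest is in the log of the sample mean; in LaTeX notation:

  \[
  \sqrt{n}\,(\bar{X}_n - \lambda) \xrightarrow{d} N(0,\lambda),
  \]
  and with $g(x) = \log x$, so that $g'(\lambda) = 1/\lambda$, the delta method gives
  \[
  \sqrt{n}\,\bigl(\log \bar{X}_n - \log\lambda\bigr) \xrightarrow{d}
  N\!\bigl(0,\,[g'(\lambda)]^2\,\lambda\bigr) = N(0,\,1/\lambda).
  \]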
ST 552: Linear Models
Representative Texts
- Graybill, F.A. Theory and Applications of the Linear Model, Duxbury, N. Scituate, Mass, 1976.
- Monahan, J.F. A Primer on Linear Models, CRC Press, 2008.
- Searle, S.R. Linear Models, John Wiley, New York, 1971.
- Seber, G.A.F. Linear Regression Analysis, Wiley, 2003.
Topics
- Review of linear systems of equations, generalized inverses and projection matrices, vector spaces and subspaces
- Linear statistical models and reparameterization
- Least squares theory and computation, including normal equations and partition of sums of squares (see the sketch following this list)
- Estimability and estimable linear functions, restricted models
- Gauss-Markov theorem, BLUE, and generalized least squares
- Theory and application of the multivariate normal distribution and related distributions of quadratic forms: central and noncentral chi-squared, central and noncentral F; Cochran's theorem
- Testing the general linear hypothesis and Likelihood Ratio Tests
- Joint distribution of several BLUEs under normality
- Confidence intervals and sets for parameters and predictions
- Random effects, mixed models, and variance component estimation
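As a compact reminder of the least squares, estimability, and Gauss-Markov items above (standard results, stated here only as a sketch), in LaTeX notation: for the model

  \[
  y = X\beta + \varepsilon, \qquad E(\varepsilon) = 0, \qquad \mathrm{Var}(\varepsilon) = \sigma^2 I,
  \]
  the normal equations and a least squares solution are
  \[
  X^\top X \hat{\beta} = X^\top y, \qquad \hat{\beta} = (X^\top X)^{-} X^\top y,
  \]
  where $(X^\top X)^{-}$ is any generalized inverse of $X^\top X$. The fitted values $X\hat{\beta}$ do not depend on the choice of generalized inverse, and for any estimable function $\lambda^\top\beta$ the Gauss-Markov theorem gives $\lambda^\top\hat{\beta}$ as the best linear unbiased estimator (BLUE).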