What are the 20 basic statistical terms used in research?

Many statistical terms are used in research for development, fabrication, optimization, and related work.

Statistics is an important subject and plays a key role in research. Good statistical knowledge helps researchers take a systematic approach to data collection and handling, and decisions about the adopted process and parameters can be made more easily with statistics. A few important statistical terms used in research are given below:

Mean:

Adding all the numbers together and dividing by the number of items in the set gives the mean. For example: (10 + 20 + 30 + 40 + 50) / 5 = 30.
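As a minimal sketch (using the same made-up numbers as the example above), the mean can be computed in a few lines of Python:

```python
# Minimal sketch: computing the mean of a small, made-up data set.
values = [10, 20, 30, 40, 50]

mean = sum(values) / len(values)  # (10 + 20 + 30 + 40 + 50) / 5
print(mean)  # 30.0
```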

 

Sum of Squares:

The sum of the squared differences from the overall average; in the ANOVA table it represents the amount of variation explained by that row's source.

 

Degrees of Freedom (df):

The number of estimated parameters used to compute the source’s sum of squares.

 

Mean Square (Variance):

The sum of squares divided by its degrees of freedom; it estimates the variance associated with that source.
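A hedged one-line sketch with hypothetical ANOVA-table numbers (the values below are invented purely for illustration):

```python
# Minimal sketch with hypothetical ANOVA-table numbers:
# a mean square is simply the sum of squares divided by its degrees of freedom.
ss_model = 120.0   # hypothetical sum of squares for the model source
df_model = 4       # hypothetical degrees of freedom for that source

ms_model = ss_model / df_model  # mean square (variance estimate)
print(ms_model)  # 30.0
```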

 

Model:

How much variation in the response is explained by the model, together with the overall test of the model's significance.

 

Terms:

The model is separated into its individual terms, and each term is tested independently. The input parameters are terms such as A, B, C, and D.

 

Linear Model:

The sequential sum of squares for the linear terms such as A, B, C, and D.

 

2FI Model:

The sequential sum of squares for the two-factor interaction terms such as AB, BC, CD, AD, BD, and AC.

 

Quadratic Model:

The sequential sum of squares for the quadratic terms such as A², B², C², etc.

 

Cubic Model:

The sequential sum of squares for the cubic terms such as A³, A²B, ABC, etc.

Note: In the above models, the F-value tests the significance of adding the terms of that order to the model. A p-value (Prob > F) smaller than 0.05 indicates that adding those terms has improved the model.
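To illustrate the idea behind these sequential tests, here is a hedged Python sketch (the data and variable names are made up): it fits a linear and a quadratic polynomial to the same data and uses a partial F-test to ask whether the added quadratic term significantly improves the fit. SciPy's F distribution supplies the p-value.

```python
import numpy as np
from scipy import stats

# Made-up single-factor data; y curves upward, so a quadratic term should help.
x = np.array([0, 1, 2, 3, 4, 5, 6, 7], dtype=float)
y = np.array([1.1, 2.0, 3.4, 5.1, 7.6, 10.2, 13.9, 17.8])
n = len(y)

def sse(degree):
    """Residual sum of squares for a polynomial fit of the given degree."""
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    return np.sum(residuals ** 2)

sse_linear = sse(1)     # reduced model: intercept + A
sse_quadratic = sse(2)  # full model: intercept + A + A^2

extra_terms = 1          # one extra parameter (the A^2 term)
df_resid_full = n - 3    # n minus the number of parameters in the full model

# Partial F-test: does the quadratic term explain enough extra variation?
f_value = ((sse_linear - sse_quadratic) / extra_terms) / (sse_quadratic / df_resid_full)
p_value = stats.f.sf(f_value, extra_terms, df_resid_full)

print(f"F = {f_value:.2f}, Prob > F = {p_value:.4f}")
# A small p-value (< 0.05) suggests the quadratic term improves the model.
```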

 

Residual Error:

It shows how much variation in the response is still unexplained. It is also called error variance or unexplained variance. Mathematically,

Residual = Experimental value – Predicted value
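A minimal sketch with made-up observed and predicted values:

```python
import numpy as np

# Minimal sketch with made-up numbers: residual = experimental value - predicted value.
observed  = np.array([12.1, 14.8, 17.9, 21.2])  # experimental responses
predicted = np.array([12.0, 15.0, 18.0, 21.0])  # values predicted by the model

residuals = observed - predicted
print(residuals)                 # per-run residuals
print(np.sum(residuals ** 2))    # residual sum of squares (unexplained variation)
```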

 

Pure Error:

It reflects the variability of the observations within each treatment; in other words, it is the amount of difference between replicate runs.

 

Lack of Fit:

The lack-of-fit (LoF) report gives details about whether the model fits the data well. The report appears only when it is possible to conduct the test.

  • The difference between the error sum of squares (SSE) from the model and the pure error sum of squares (SSPE) is called the lack-of-fit sum of squares (SSLF).

SSE = SSPE + SSLF

  • If the model is inadequate, the lack-of-fit variation can be significantly greater than the pure-error variation.
  • LoF is the amount by which the model predictions miss the observations.
  • SSE (the sum of squares due to error) is defined as:

SSE = SSPE + SSLF

where SSPE = sum of squares due to pure error and SSLF = sum of squares due to lack of fit.

SSPE measures the inherent variability of y, which cannot be explained by any model. (Without repeated measurements, SSPE = 0, and hence the LoF test cannot be conducted.)

SSLF represents the variability of y that cannot be explained by the given model. This value may be reduced if a better model is used.
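The partition SSE = SSPE + SSLF can be illustrated with a hedged Python sketch. The replicate data below are invented, and the candidate model is a simple straight line:

```python
import numpy as np

# Hedged sketch with made-up replicate data: three factor settings, two replicates each.
# The SSE from a fitted model splits into pure error (replicate scatter) and lack of fit.
x = np.array([1.0, 1.0, 2.0, 2.0, 3.0, 3.0])
y = np.array([2.1, 2.3, 4.8, 5.0, 9.7, 9.9])

# Fit a straight line (the candidate model) and get its residual sum of squares.
slope, intercept = np.polyfit(x, y, 1)
sse = np.sum((y - (slope * x + intercept)) ** 2)

# Pure error: squared deviations of replicates from their own group mean.
sspe = 0.0
for level in np.unique(x):
    group = y[x == level]
    sspe += np.sum((group - group.mean()) ** 2)

sslf = sse - sspe  # lack-of-fit sum of squares, since SSE = SSPE + SSLF
print(f"SSE = {sse:.3f}, SSPE = {sspe:.3f}, SSLF = {sslf:.3f}")
```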

 

 

Sum of Squares (SS):

It can be used to identify the dispersion of the data as well as how well the data fit the model. There are several types of SS, as follows (a short numerical sketch follows the list):

  • The total sum of squares (TSS): It shows the variation of the values of the dependent variable around its mean:

TSS = Σ (Yi − Ȳ)²

where Yi is the observed value and Ȳ is the mean.

 

  • Regression sum of squares (SSR): It shows how well the regression model represents the modeled data. A larger SSR relative to the total sum of squares means the model explains more of the variation. It is also called the sum of squares due to regression or the explained sum of squares. It can be calculated as:

SSR = Σ (Ŷi − Ȳ)²

where Ŷi is the value predicted by the model.

  • Residual sum of squares (SSE): It measures the amount of variation in the data that is not explained by the regression model. In other words, it measures the variation of the modeling errors; it is also known as the sum of squared errors of prediction:

SSE = Σ (Yi − Ŷi)²
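As indicated above, here is a hedged numerical sketch (made-up data, straight-line model) that computes TSS, SSR, and SSE and checks that TSS = SSR + SSE for a least-squares fit with an intercept:

```python
import numpy as np

# Hedged sketch with made-up data: compute TSS, SSR, and SSE for a straight-line fit
# and check the identity TSS = SSR + SSE.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.2, 3.9, 6.1, 8.2, 9.8])

slope, intercept = np.polyfit(x, y, 1)  # least-squares line
y_hat = slope * x + intercept           # predicted values
y_bar = y.mean()                        # overall mean

tss = np.sum((y - y_bar) ** 2)      # total variation around the mean
ssr = np.sum((y_hat - y_bar) ** 2)  # variation explained by the regression
sse = np.sum((y - y_hat) ** 2)      # residual (unexplained) variation

print(f"TSS = {tss:.3f}, SSR = {ssr:.3f}, SSE = {sse:.3f}")
print(f"SSR + SSE = {ssr + sse:.3f}")  # matches TSS for a least-squares fit with intercept
```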

Cor Total:

It is also called the “Corrected Total Sum of Squares” (CTSS). It shows the total amount of variation around the mean of the observations; the model explains part of it, and the residual accounts for the rest.

F value:

It is used to decide whether a source is statistically significant. The F-value compares the source’s mean square to the residual mean square and is calculated by dividing the two mean squares (a short numerical sketch follows the P-value bullets below).

P-value:

It is known as the probability value. It is a statistical measure used to decide whether the null hypothesis should be rejected. The p-value is a number that lies between 0 and 1.

  • A small p-value (less than 0.05) calls for rejection of the null hypothesis (there are no factor effects).
  • If the p-value (Prob > F) is less than 0.05, the source is tested as significant.
  • Significant model terms are likely to have a real effect on the response.
  • A significant Lack of fit (LoF) shows that the model does not fit the data within the observed replicate variation.
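As a hedged sketch of the F-value and p-value calculations mentioned above (the ANOVA-table numbers below are hypothetical), SciPy's F distribution supplies Prob > F:

```python
from scipy import stats

# Hedged sketch with hypothetical ANOVA-table values:
# F = (mean square of the source) / (residual mean square),
# and the p-value (Prob > F) is the upper-tail probability of that F.
ms_model = 45.0    # hypothetical mean square for the model source
ms_residual = 3.0  # hypothetical residual mean square
df_model = 4       # hypothetical model degrees of freedom
df_residual = 10   # hypothetical residual degrees of freedom

f_value = ms_model / ms_residual
p_value = stats.f.sf(f_value, df_model, df_residual)  # Prob > F

print(f"F = {f_value:.2f}, Prob > F = {p_value:.4f}")
# p < 0.05 -> reject the null hypothesis of no factor effects (source is significant).
```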

Modeling Statistics

Standard Deviation, Coefficient of Variation, and PRESS

Standard Deviation (Std. Dev.):

The square root of the residual mean square; it estimates the standard deviation of the experimental error.

Coefficient of Variation (C.V.%):

The standard deviation expressed as a percentage of the mean of the response; smaller values indicate better reproducibility.

PRESS (Predicted Residual Error Sum of Squares):

A measure of how well the model predicts each observation when that run is left out of the fit; smaller values indicate better predictive ability.
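A hedged sketch of these three statistics for a simple straight-line fit (made-up data; PRESS is computed with the usual hat-matrix shortcut for leave-one-out residuals):

```python
import numpy as np

# Hedged sketch with made-up data: standard deviation, C.V.%, and PRESS
# for a simple straight-line model.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([3.1, 4.9, 7.2, 8.8, 11.1, 12.9])
n, p = len(y), 2                      # runs and model parameters (intercept + slope)

X = np.column_stack([np.ones(n), x])  # design matrix
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta

ms_resid = np.sum(residuals ** 2) / (n - p)  # residual mean square
std_dev = np.sqrt(ms_resid)                  # "Std. Dev." in the fit summary
cv_percent = 100.0 * std_dev / y.mean()      # coefficient of variation (%)

H = X @ np.linalg.inv(X.T @ X) @ X.T                     # hat matrix
press = np.sum((residuals / (1.0 - np.diag(H))) ** 2)    # leave-one-out prediction errors

print(f"Std. Dev. = {std_dev:.3f}, C.V.% = {cv_percent:.2f}, PRESS = {press:.3f}")
```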
R-squared (R²)

The coefficient of determination: the fraction of the total variation in the response (around its mean) that is explained by the model, i.e. R² = 1 − SSE/TSS. Values closer to 1 indicate a better fit.

Figure: The coefficient of determination (R²) is one of the most important statistical terms used in research.
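A minimal sketch of R² for a straight-line fit to made-up data:

```python
import numpy as np

# Hedged sketch with made-up data: R-squared as the fraction of total
# variation around the mean that the fitted line explains.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.9])

slope, intercept = np.polyfit(x, y, 1)
y_hat = slope * x + intercept

sse = np.sum((y - y_hat) ** 2)      # unexplained variation
tss = np.sum((y - y.mean()) ** 2)   # total variation (corrected total)

r_squared = 1.0 - sse / tss
print(f"R-squared = {r_squared:.4f}")  # close to 1 -> the model fits these data well
```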

Adj R-squared and Pred R-squared

Adjusted R-squared (Adj R²):

R² adjusted for the number of terms in the model relative to the number of runs; unlike R², it decreases when unnecessary terms are added.

Predicted R-squared (Pred R²):

A measure of how well the model predicts the response for new observations, computed from PRESS as Pred R² = 1 − PRESS/TSS. Adj R² and Pred R² should be in reasonable agreement (typically within about 0.2 of each other).
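A hedged sketch computing R², Adj R², and Pred R² for a straight-line fit (made-up data; Pred R² uses the PRESS statistic as described above):

```python
import numpy as np

# Hedged sketch with made-up data: adjusted and predicted R-squared for a line fit.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
y = np.array([2.3, 3.8, 6.2, 7.9, 10.2, 11.8, 14.1])
n, p = len(y), 2                              # runs and model parameters

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta

sse = np.sum(residuals ** 2)
tss = np.sum((y - y.mean()) ** 2)             # corrected total sum of squares
r2 = 1.0 - sse / tss

# Adjusted R-squared penalises extra model terms.
adj_r2 = 1.0 - (sse / (n - p)) / (tss / (n - 1))

# Predicted R-squared uses PRESS (leave-one-out prediction errors).
H = X @ np.linalg.inv(X.T @ X) @ X.T
press = np.sum((residuals / (1.0 - np.diag(H))) ** 2)
pred_r2 = 1.0 - press / tss

print(f"R2 = {r2:.4f}, Adj R2 = {adj_r2:.4f}, Pred R2 = {pred_r2:.4f}")
```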
Adequate Precision (Adeq Precision):

It measures the signal-to-noise ratio. A ratio greater than 4 is desirable; when Adequate Precision exceeds 4, the model can be used to navigate the design space.

 


 


 

