In statistics, the coefficient of determination, also known as $R^2$, measures the goodness of fit of a model: how well the data fit the regression model. Specifically, $R^2$ is the proportion of the variance in the dependent variable that is predictable from the independent variable(s). It typically ranges from 0 to 1 and indicates how much of the variation in the dependent variable is explained by the independent variable(s) in a regression model.
To calculate $R^2$, you need the residual sum of squares and the total sum of squares. The residual sum of squares is the sum of the squared differences between the actual and predicted values, and the total sum of squares is the sum of the squared distances of the data from their mean. The formula for $R^2$ is:
$$R^2=1-\frac{\text{residual sum of squares } (SS_{\text{res}})}{\text{total sum of squares } (SS_{\text{tot}})}$$
$$R^2=1-\frac{\sum_i(y_i-\hat{y}_i)^2}{\sum_i(y_i-\bar{y})^2}$$
where $\hat{y}_i$ is the predicted value of the dependent variable, $y_i$ is the actual value of the dependent variable, and $\bar{y}$ is the mean of the dependent variable.
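The formula above can be sketched directly in code. This is a minimal illustration using NumPy; the data and the predicted values are made up for the example.

```python
import numpy as np

y = np.array([2.0, 4.1, 6.2, 7.9, 10.1])      # actual values y_i (toy data)
y_hat = np.array([2.1, 4.0, 6.0, 8.0, 10.0])  # model predictions (toy data)

ss_res = np.sum((y - y_hat) ** 2)       # residual sum of squares
ss_tot = np.sum((y - np.mean(y)) ** 2)  # total sum of squares
r2 = 1 - ss_res / ss_tot
print(round(r2, 3))  # close to 1, since predictions track the data
```

Because the residuals here are small relative to the spread of the data around its mean, the ratio in the formula is near zero and $R^2$ is close to 1.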
An $R^2$ value of 1 means that the model explains all of the variation in the dependent variable, while an $R^2$ value of 0 means that the model explains none of it. A higher $R^2$ value indicates that more of the variation in the dependent variable is explained by the independent variable(s) in the model. However, a high $R^2$ does not necessarily mean the model is a good fit for the data (it may, for instance, reflect overfitting), and a low $R^2$ does not necessarily mean the model is a poor fit, since the underlying relationship may simply be noisy.
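The two endpoints described above can be checked numerically: a model with perfect predictions scores $R^2 = 1$, while a baseline that always predicts the mean scores $R^2 = 0$. This sketch uses a small helper function of our own (`r_squared`) on toy data.

```python
import numpy as np

def r_squared(y, y_hat):
    """Compute R^2 from actual values y and predictions y_hat."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1 - ss_res / ss_tot

y = np.array([1.0, 3.0, 5.0, 7.0])  # toy data

print(r_squared(y, y))                          # perfect predictions -> 1.0
print(r_squared(y, np.full_like(y, y.mean())))  # mean-only baseline -> 0.0
```

The mean-only baseline makes the residual sum of squares equal to the total sum of squares, so the ratio in the formula is exactly 1 and $R^2$ is 0.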