
Published Journal Articles

2023

A Classification of Outliers in Transformed Variables

2023-04
Journal of University of Duhok (Volume: 28, Issue: 1)
The detection of outliers is essential because they are responsible for serious interpretative problems in both linear and nonlinear regression analysis. A great deal of work has been done on identifying outliers in linear regression, but far less in nonlinear regression. In practice the assumptions of linear regression are often violated, for instance when highly influential outliers are present in the dataset, which adversely affects the validity of the statistical analysis. Finding outliers matters because they lead to invalid inferences and inaccurate predictions, having a disproportionate impact on the computed values of many estimators. Outliers should be separated into vertical outliers (VO), good leverage points (GLP), and bad leverage points (BLP), since only vertical outliers and bad leverage points have an undue effect on parameter estimation. We compare several outlier detection techniques that use a robust diagnostic plot to correctly classify good and bad leverage points and vertical outliers, reducing both masking and swamping effects for the untransformed and the transformed variables. The main idea is to detect outliers both before transformation (on the original data) and after transformation. The results of the simulation study and the numerical examples indicate that the modified generalized DFFITS (difference in fits) against the Diagnostic Robust Generalized Potential (MGDFF-DRGP) successfully detects outliers in the data.
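An illustrative sketch of the VO/GLP/BLP classification behind a robust diagnostic plot is given below. It uses generic building blocks (Huber M-estimator residuals from statsmodels and MCD-based robust Mahalanobis distances from scikit-learn) with the usual textbook cutoffs; it is not the paper's MGDFF-DRGP procedure, and the data are simulated.

```python
# Minimal sketch: classify points as regular, vertical outliers (VO),
# good leverage points (GLP) or bad leverage points (BLP) from
# robust residuals and robust distances of the predictors.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
n, p = 100, 2
beta = np.array([2.0, -1.0])
X = rng.normal(size=(n, p))
y = 1.0 + X @ beta + rng.normal(scale=0.5, size=n)
y[:5] += 8                              # vertical outliers: bad y, ordinary x
X[5:8] += 6                             # bad leverage: x shifted, y left unchanged
X[8:11] += 6
y[8:11] = 1.0 + X[8:11] @ beta          # good leverage: y still follows the model

# Robust standardized residuals from a Huber M-estimator fit
rlm = sm.RLM(y, sm.add_constant(X), M=sm.robust.norms.HuberT()).fit()
std_resid = rlm.resid / rlm.scale

# Robust leverage via MCD-based Mahalanobis distances of the predictors
rd = np.sqrt(MinCovDet(random_state=0).fit(X).mahalanobis(X))
cut_lev, cut_res = np.sqrt(chi2.ppf(0.975, df=p)), 2.5

labels = np.full(n, "regular", dtype=object)
labels[(np.abs(std_resid) > cut_res) & (rd <= cut_lev)] = "VO"
labels[(np.abs(std_resid) <= cut_res) & (rd > cut_lev)] = "GLP"
labels[(np.abs(std_resid) > cut_res) & (rd > cut_lev)] = "BLP"
print({lab: int((labels == lab).sum()) for lab in ("regular", "VO", "GLP", "BLP")})
```

Points that are extreme in both coordinates of this residual-versus-distance plot are the bad leverage points that distort the fit, which is why separating them from good leverage points matters.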
2022

Estimating Regression Coefficients Using Robust Bootstrap with Application to Covid-19 Data

2022-08
General Letters in Mathematics (Volume: 12, Issue: 2)
The linear regression model is often used by researchers and data analysts for predictive, descriptive, and inferential purposes. When working with empirical data, the model rests on a set of assumptions that are not always satisfied. One answer in this situation is to use more complicated regression methods that do not rely on the same assumptions. Transformations, however, provide a simpler way of improving the validity of the model assumptions while allowing the user to keep working with the familiar linear regression model. The main objective of this work is to provide transformations for the response and predictor variables of the linear model, together with parameter estimation methods before and after transformation. The bootstrap approach has been used effectively for many statistical estimation and inference problems.
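For illustration, the sketch below shows a plain case-resampling (pairs) bootstrap for the coefficients of a simple linear model. It uses simulated data and ordinary least squares rather than the robust estimators and the Covid-19 series studied in the paper, so it conveys the bootstrap idea only.

```python
# Minimal sketch: pairs (case-resampling) bootstrap for regression coefficients.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.uniform(0, 10, size=n)
y = 3.0 + 1.5 * x + rng.normal(scale=2.0, size=n)
X = np.column_stack([np.ones(n), x])            # design matrix with intercept

def ols(X, y):
    """Least-squares coefficient estimates."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

beta_hat = ols(X, y)

B = 2000
boot = np.empty((B, X.shape[1]))
for b in range(B):
    idx = rng.integers(0, n, size=n)            # resample (x_i, y_i) pairs with replacement
    boot[b] = ols(X[idx], y[idx])

ci = np.percentile(boot, [2.5, 97.5], axis=0)   # percentile bootstrap 95% intervals
print("estimates:", beta_hat)
print("intercept CI:", ci[:, 0], " slope CI:", ci[:, 1])
```

Replacing the OLS fit inside the loop with a robust estimator gives one common form of robust bootstrap, since the resampling scheme itself does not change.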

Nonlinear Transformation Methods Using Covid-19 Data in the Kurdistan Region

2022-04
2022 International Conference on Computer Science and Software Engineering (CSASE) (Issue: 6)
Ordinary least squares (OLS) is the most widely used method for estimating the parameters of linear and nonlinear regression models, both by tradition and because of its optimal properties. Nevertheless, in the presence of outliers in the data, OLS estimates become inefficient, and even a single unusual point can have a significant impact on parameter estimation. One remedy in the presence of outliers is to use robust estimators rather than OLS; another is to find a suitable nonlinear transformation that reduces anomalies such as non-additivity, heteroscedasticity, and non-normality in multiple nonlinear regression. It can be beneficial to transform the response variable, the predictor variables, or both, so that the equation takes a simple functional form that is linear in the transformed variables. To identify the best transformation function, we compare the squared correlation coefficient (coefficient of determination), the Breusch-Pagan test, and the Shapiro-Wilk test across the candidate transformation functions.
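A small sketch of this kind of comparison is shown below: candidate response transformations are fitted by OLS and judged by the coefficient of determination, the Breusch-Pagan test, and the Shapiro-Wilk test. The data are simulated from a multiplicative-error model (so the log transformation should come out best); it illustrates the comparison criteria only, not the paper's Covid-19 analysis.

```python
# Minimal sketch: compare candidate response transformations by R-squared,
# the Breusch-Pagan heteroscedasticity test and the Shapiro-Wilk normality test.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan
from scipy.stats import shapiro

rng = np.random.default_rng(2)
n = 150
x = rng.uniform(1, 10, size=n)
y = np.exp(0.8 + 0.3 * x + rng.normal(scale=0.2, size=n))   # multiplicative errors

X = sm.add_constant(x)
transforms = {"none": lambda v: v, "log": np.log, "sqrt": np.sqrt}

for name, f in transforms.items():
    fit = sm.OLS(f(y), X).fit()
    bp_p = het_breuschpagan(fit.resid, X)[1]                 # H0: homoscedastic errors
    sw_p = shapiro(fit.resid).pvalue                         # H0: normally distributed errors
    print(f"{name:>5}: R2={fit.rsquared:.3f}  BP p={bp_p:.3f}  SW p={sw_p:.3f}")
```

A transformation is preferred when it yields a high coefficient of determination together with large p-values on both tests, indicating that the transformed model no longer shows evidence of heteroscedasticity or non-normal errors.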
