UNCERTAINTY MANAGEMENT IN THE KRIGING SURROGATE MODEL

dc.contributor.advisorKim, Nam-Ho
dc.contributor.authorAlzahrani, Muhammed
dc.date.accessioned2025-04-15T05:52:28Z
dc.date.issued2025
dc.description.abstractKriging is a surrogate modeling technique widely used for interpolation and prediction based on a given set of training samples. These samples can be either deterministic or noisy, where the noise may result from measurement errors or other external factors. The kriging algorithm begins by selecting an appropriate model to capture the underlying behavior of the data; common choices include constant trend functions (ordinary kriging), linear trend functions, and quadratic trend functions. A fitting process is then conducted to optimize the model by estimating hyperparameters such as the decay rate, regression coefficients, and process variance. These hyperparameters are then used for prediction. Uncertainty in kriging surrogate models significantly affects the accuracy of predictions and the quality of outcomes, and it stems from three main sources. The first is aleatory uncertainty, the inherent variability within the collected data, which is irreducible due to its random nature. The second and third sources fall under epistemic uncertainty, which arises from limited knowledge and can therefore be reduced. One source of epistemic uncertainty is the model form, reflecting the assumptions and structure used to describe the underlying global function; an inappropriate model introduces uncertainty into the prediction. The other source is the fitting process, which involves estimating the hyperparameters of the model. Epistemic uncertainty can be reduced through techniques such as gathering additional knowledge, using alternative modeling approaches, or employing more robust fitting processes. Each of these three sources of uncertainty is analyzed extensively in a dedicated chapter, examining how to manage it to enhance the accuracy and overall quality of predictions. Data uncertainty, which is a form of aleatory uncertainty, often requires tools to estimate variability and filter noise from the data.
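As a hedged illustration of the prediction step described above, the NumPy sketch below implements ordinary kriging (constant trend) with a Gaussian correlation model. The function names, the fixed decay rate, and the sample values are illustrative choices, not code from the dissertation.

```python
import numpy as np

def corr(X1, X2, theta):
    # Gaussian (squared-exponential) correlation with decay rate theta
    return np.exp(-theta * (X1[:, None] - X2[None, :]) ** 2)

def ordinary_kriging(X, y, theta, x_new):
    # Constant trend (ordinary kriging): estimate the trend coefficient
    # beta by generalized least squares, then correlate the residuals
    # back to the prediction locations.
    R = corr(X, X, theta) + 1e-10 * np.eye(len(X))  # small nugget for stability
    Rinv = np.linalg.inv(R)
    one = np.ones(len(X))
    beta = (one @ Rinv @ y) / (one @ Rinv @ one)
    r = corr(x_new, X, theta)
    return beta + r @ Rinv @ (y - beta * one)

X = np.array([0.0, 0.3, 0.6, 1.0])
y = np.sin(2 * np.pi * X)
# With noise-free data, kriging interpolates: predictions at the
# training samples reproduce the observed values.
print(np.allclose(ordinary_kriging(X, y, 10.0, X), y, atol=1e-6))  # True
```

In practice the decay rate `theta` would itself be estimated during the fitting process, which is exactly where the fitting-process uncertainty discussed in the abstract enters.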
When the noise in the data is unknown, kriging serves as an effective filtration method to separate the noise and estimate its standard deviation. However, separating process noise from data noise within the covariance matrix is challenging. Accurate estimation of the data noise standard deviation requires the correlation matrix to be far from the identity, with a determinant that is significantly small, to enable clear separation of process noise and data noise; if the correlation matrix approaches the identity matrix, this separation becomes impossible. The correlation is affected by the decay rate, a hyperparameter influenced by the distance between samples. Larger distances between samples lead to an increased decay rate, causing the correlation matrix to quickly approach the identity. To overcome this, a sufficient number of samples with reduced spacing is necessary to decrease the decay rate, ensuring that the correlation matrix stays away from the identity and its determinant approaches zero. This enhances kriging's ability to filter out noise accurately. Once the noise is effectively filtered, the refined data can be used in kriging to obtain reliable predictions. Therefore, maintaining proper sample density and spacing is crucial for accurately estimating the data noise standard deviation and improving the reliability of kriging predictions. Another source of uncertainty arises from the fitting process, where multiple estimation methods are available to fit the data and determine the hyperparameters. This study introduces a proposed method, referred to as local departure normalization, designed to minimize the difference between the ideal local departure and the actual local departure. The method aims to satisfy a fundamental assumption of kriging: that local departures follow a Gaussian distribution with zero mean and the process variance.
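The spacing effect described above can be checked numerically: for a fixed decay rate, widely spaced samples yield a correlation matrix near the identity (determinant near 1), while closely spaced samples yield a strongly correlated matrix (determinant near 0). The decay rate and sample layouts below are an illustrative sketch, not values from the dissertation.

```python
import numpy as np

def corr_matrix(X, theta):
    # Gaussian correlation: entries decay with squared distance
    return np.exp(-theta * (X[:, None] - X[None, :]) ** 2)

theta = 5.0
dense  = np.linspace(0, 1, 11)   # close spacing over a short interval
sparse = np.linspace(0, 10, 11)  # same count, spread over a wide interval

det_dense  = np.linalg.det(corr_matrix(dense, theta))
det_sparse = np.linalg.det(corr_matrix(sparse, theta))

# dense spacing: determinant near 0 -> noise separation is possible
# sparse spacing: determinant near 1 (matrix ~ identity) -> separation fails
print(det_dense, det_sparse)
```

The determinant here acts as a scalar summary of how far the correlation matrix is from the identity, matching the criterion stated in the abstract.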
Although correlation exists, at a sample location the correlation vector cancels the correlation matrix at the associated data point; as a result, the correlation has no effect on the local departure at the sample location. An algorithm was implemented to minimize these differences, ensuring the estimated local departures closely match a Gaussian distribution. To evaluate the performance of the proposed method, a test was conducted using a dataset generated from a true model following a Gaussian distribution. The experiment examined the method's ability to estimate hyperparameters and assessed how closely the estimates aligned with the predefined true parameters. Additionally, the results obtained using the proposed method were compared with hyperparameters estimated through maximum likelihood estimation (MLE). The findings showed that the proposed local departure normalization method achieved higher accuracy than the MLE approach; furthermore, the hyperparameters it estimated matched the true predefined parameters used to generate the data. These results demonstrate the effectiveness of local departure normalization in accurately estimating hyperparameters while maintaining the Gaussian distribution assumption of kriging. The third source of uncertainty is the model form, as different forms yield varying outcomes, particularly when trend functions differ. This study optimized trend function selection to minimize uncertainty using a sensitivity analysis process. Uncertainty, influenced by the regression coefficients, was reduced by systematically identifying and eliminating the least effective coefficient. The algorithm was repeated iteratively until minimum uncertainty was achieved, enhancing the trend function's flexibility, increasing prediction accuracy, and narrowing the range of outcomes.
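The cancellation noted at the start of this paragraph can be verified directly: the correlation vector at sample i is the i-th row of the correlation matrix R, so multiplying it by R⁻¹ collapses to the i-th unit vector, leaving the local departure at that sample unaffected by correlation. The sample layout and decay rate below are illustrative values only.

```python
import numpy as np

# Five evenly spaced samples and an illustrative decay rate
X = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
theta = 4.0
R = np.exp(-theta * (X[:, None] - X[None, :]) ** 2)
Rinv = np.linalg.inv(R)

# At sample i, the correlation vector r_i equals the i-th row of R,
# so r_i^T R^{-1} reduces to the i-th unit vector.
i = 2
r_i = R[i]
print(np.allclose(r_i @ Rinv, np.eye(len(X))[i]))  # True
```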
The proposed method was compared with ordinary kriging, universal kriging, and backward elimination. Unlike backward elimination, which removes components that increase the variance between the trend function and the data, the proposed method eliminates those with the least impact on uncertainty. Results showed that the method significantly reduced uncertainty, especially when sample sizes were small; as sample sizes increased, the proposed uncertainty-sensitivity trend optimization (USTOK) method and backward elimination performed similarly. Overall, USTOK proves effective in reducing uncertainty and delivering reliable predictions, particularly when limited data are available.
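As a rough sketch of the iterative-elimination idea, the toy loop below repeatedly drops the trend term whose removal least degrades the fit. The residual-variance criterion used here is an assumed stand-in for illustration, not necessarily the uncertainty metric USTOK actually employs.

```python
import numpy as np

def resid_var(F, y):
    # Least-squares residual variance for a candidate trend basis F
    beta, *_ = np.linalg.lstsq(F, y, rcond=None)
    r = y - F @ beta
    return r @ r / len(y)

x = np.linspace(0.0, 1.0, 12)
y = 3.0 - x  # underlying trend is linear, so x^2 should be eliminated

basis = {"1": np.ones_like(x), "x": x, "x^2": x**2}
active = list(basis)
while len(active) > 1:
    full = resid_var(np.column_stack([basis[k] for k in active]), y)
    # Score each term by the variance after removing it; the term whose
    # removal changes the fit the least is the "least effective" one.
    scores = {k: resid_var(np.column_stack([basis[j] for j in active if j != k]), y)
              for k in active}
    least = min(scores, key=scores.get)
    if scores[least] > full + 1e-6:
        break  # every remaining term matters; stop eliminating
    active.remove(least)

print(active)  # ['1', 'x']
```

The loop removes the quadratic term (which contributes nothing here) and then stops, since dropping either remaining term would visibly worsen the fit.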
dc.format.extent103
dc.identifier.citationChicago Manual of Style
dc.identifier.urihttps://hdl.handle.net/20.500.14154/75192
dc.language.isoen_US
dc.publisherUniversity of Florida
dc.subjectKriging
dc.titleUNCERTAINTY MANAGEMENT IN THE KRIGING SURROGATE MODEL
dc.typeThesis
sdl.degree.departmentDepartment of Mechanical & Aerospace Engineering
sdl.degree.disciplineMechanical Engineering
sdl.degree.grantorUniversity of Florida
sdl.degree.namePh.D

Files

Original bundle

Name: SACM-Dissertation .pdf
Size: 5.26 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.61 KB
Format: Item-specific license agreed to upon submission

Copyright owned by the Saudi Digital Library (SDL) © 2025