Researchers from the University at Buffalo have used explainable artificial intelligence in a study to predict lung and bronchus cancer (LBC) mortality rates across the United States. The system is capable of making accurate predictions of LBC mortality rates.
It is the first study to use ensemble machine learning with an explainable algorithm to visualize and understand the spatial heterogeneity of the relationships between LBC mortality and its risk factors.
The new study, authored by Zia U. Ahmed, Kang Sun, Michael Shelly, and Lina Mu, uses explainable artificial intelligence, or XAI, to identify key risk factors for LBC mortality.
XAI was used with a stack-ensemble machine learning framework to examine and display the spatial distribution of known risk factors' contributions to LBC death rates across the contiguous United States.
Researchers say that, among the risk factors studied, smoking prevalence, poverty, and a community's elevation were the most important predictors of LBC mortality rates. However, the contributions of these risk factors to LBC mortality were found to vary geographically.
The study is titled "Explainable artificial intelligence for exploring spatial variability of lung and bronchus cancer mortality rates in the contiguous USA."
Researchers used five base learners, namely the generalized linear model (GLM), random forest (RF), gradient boosting machine (GBM), extreme gradient boosting (XGBoost), and deep neural network (DNN), to develop the stack-ensemble models. Because AI algorithms generally perform better with more data and multiple models, the stack-ensemble model is more effective than any single base model.
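The stacking idea can be illustrated with a short sketch. The code below is a hedged, illustrative example built with scikit-learn and the xgboost package, not the authors' actual pipeline: the synthetic data, hyperparameters, and the Ridge meta-learner are assumptions chosen for brevity, a plain linear model stands in for the GLM, and a small multilayer perceptron stands in for the deep neural network.

```python
# Illustrative stacked ensemble with the five kinds of base learners named in
# the article. NOT the study's code: data, settings, and meta-learner are
# stand-ins chosen only to show how stacking works.
from sklearn.datasets import make_regression
from sklearn.ensemble import (GradientBoostingRegressor, RandomForestRegressor,
                              StackingRegressor)
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from xgboost import XGBRegressor  # requires the separate xgboost package

# Synthetic stand-in for county-level risk factors and LBC mortality rates.
X, y = make_regression(n_samples=500, n_features=8, noise=0.3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base_learners = [
    ("glm", LinearRegression()),                         # generalized linear model
    ("rf", RandomForestRegressor(random_state=0)),       # random forest
    ("gbm", GradientBoostingRegressor(random_state=0)),  # gradient boosting machine
    ("xgb", XGBRegressor(random_state=0)),               # extreme gradient boosting
    ("dnn", MLPRegressor(hidden_layer_sizes=(64, 32),    # small MLP as a DNN stand-in
                         max_iter=2000, random_state=0)),
]

# The stack ensemble trains a meta-learner on the base learners' predictions.
stack = StackingRegressor(estimators=base_learners, final_estimator=Ridge())
stack.fit(X_train, y_train)
print("Held-out R^2:", stack.score(X_test, y_test))
```

The meta-learner sees cross-validated predictions from all five base models, which is why a stacked ensemble typically outperforms any one of them on its own.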
“The results matter because the U.S. is a spatially heterogeneous environment. There is a wide variety in socioeconomic factors and education levels — essentially, one size does not fit all. Here local interpretation of machine learning models is more important than global interpretation,” said Ahmed.
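SHAP values are one common way to produce the kind of local, per-observation interpretation Ahmed describes; the sketch below is an assumed illustration, not necessarily the study's tooling, and it uses a single random forest on synthetic data rather than the full stacked ensemble. It contrasts a global importance summary with the explanation for a single record, such as one county.

```python
# Local vs. global interpretation with SHAP (assumed tooling for illustration;
# requires the shap package). A random forest on synthetic data stands in for
# the study's stacked ensemble on county-level predictors.
import shap
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for county-level predictors of LBC mortality.
X, y = make_regression(n_samples=300, n_features=6, noise=0.3, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X, y)

# TreeExplainer computes SHAP values for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Global view: mean absolute SHAP value per feature across all records.
print("Global importance:", abs(shap_values).mean(axis=0).round(2))

# Local view: how each feature pushes the prediction for one record
# (e.g., one county), which can differ sharply from the global ranking.
print("Local contributions for record 0:", shap_values[0].round(2))
```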