dc.description.abstract |
Agriculture is a pillar of the economy in many countries across the globe, particularly in developing
countries such as Ethiopia, besides being a source of daily food and employment. However,
ensuring food security, building a stable economy, and increasing farmers' individual prosperity
remain serious challenges in the country. This is partly due to the limited integration of
emerging technologies into agriculture, even though such technologies can play a significant role in
overcoming food insecurity. Machine learning and deep learning are emerging technologies within artificial intelligence (AI).
Therefore, this study aims to predict the yield of five commonly cultivated cereal crops (wheat,
barley, maize, teff, and sorghum) from land suitability, helping farmers estimate their expected yield
before cultivation. To achieve this goal, soil, climate, topographic, and yield data were collected
from the Engineering Corporation of Oromia and the Jimma Agricultural Research Center. The dataset was
labeled based on FAO guidelines with the involvement of agricultural professionals. It underwent several
preprocessing steps, such as handling missing values with a mean imputation strategy, encoding
categorical values with a label encoder, and normalizing the features. The preprocessed dataset was then
split randomly into training and test sets to prepare it for deep learning model training. Accordingly,
three models, namely an artificial neural network (ANN), a deep neural network (DNN),
and a linear regression (LR) model, were built on this dataset to predict cereal crop yield. Performance
evaluation metrics such as loss, mean squared error (MSE), mean absolute error
(MAE), and mean squared logarithmic error (MSLE) were used to evaluate the models and identify
the best-performing one. Accordingly, the DNN model trained with MSLE outperformed the others on
validation loss, so the DNN is the recommended model for predicting cereal crop yield.
The findings of this study are intended to help enhance cereal crop yield through yield prediction based on
land suitability ratings. |
en_US |
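
As an illustrative sketch only (the abstract does not name specific libraries), the preprocessing steps described above, namely mean imputation of missing values, label encoding of categorical values, feature normalization, and a random train/test split, could be implemented with pandas and scikit-learn roughly as follows; the file name and column names are hypothetical placeholders, not the study's actual dataset schema.

import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import LabelEncoder, MinMaxScaler
from sklearn.model_selection import train_test_split

# Hypothetical file and column names; the real dataset combines soil, climate,
# topographic, and yield attributes collected for the study.
data = pd.read_csv("cereal_land_suitability.csv")
X = data.drop(columns=["yield"])
y = data["yield"]

# Handle missing values in numeric features with the mean imputation strategy.
num_cols = X.select_dtypes(include="number").columns
X[num_cols] = SimpleImputer(strategy="mean").fit_transform(X[num_cols])

# Encode categorical values (e.g. soil type) with a label encoder.
for col in X.select_dtypes(include="object").columns:
    X[col] = LabelEncoder().fit_transform(X[col])

# Normalize all features to a common scale.
X[X.columns] = MinMaxScaler().fit_transform(X)

# Randomly split the preprocessed data into training and test sets.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)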
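
Similarly, a minimal sketch of the best-performing configuration reported above (a DNN regressor trained with an MSLE loss and tracked with MSE and MAE) could look like the following, assuming a Keras/TensorFlow implementation and reusing the split arrays from the preprocessing sketch; the layer widths, optimizer, and training settings are illustrative assumptions, not the thesis's actual architecture.

from tensorflow import keras

# Illustrative DNN regressor; layer widths and hyperparameters are assumptions.
model = keras.Sequential([
    keras.layers.Input(shape=(X_train.shape[1],)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),  # single continuous output: predicted yield
])

# MSLE loss (mean squared logarithmic error); MSE and MAE tracked as metrics.
model.compile(optimizer="adam", loss="msle", metrics=["mse", "mae"])

history = model.fit(
    X_train, y_train,
    validation_data=(X_test, y_test),
    epochs=100,
    batch_size=32,
)

# Validation loss (MSLE) is the criterion used in the study to compare models.
val_msle, val_mse, val_mae = model.evaluate(X_test, y_test)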