artificial neural network bias
Hi all,
I have developed an artificial neural network with a single hidden layer using the 'fitnet' function. The aim is to predict, as a probability, when a cliff will fail given wave, sea level and precipitation data. However, I am still very new to neural networks and don't fully understand the parameters behind them.
The target is a binary column with cliff failures as 1 and non-failures as 0. There are approximately 30 failures across 262 days, so the classes are heavily imbalanced. The network reproduces the test data reasonably well, but it struggles on the validation data: because the training data is mostly zeros (most days experience no cliff failure), the network is heavily biased towards reproducing non-failures and outputs more zeros than it should, so it predicts no failures at all on the validation set.
I am aware that a grid search could be used to optimise the hyperparameters, but I am unsure how to do this. How can I correct this bias so the network produces more representative results?
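For reference, my setup looks roughly like this (a sketch only; the variable names and the hidden-layer size are placeholders):

```matlab
% X is 3 x 262: wave, sea level and precipitation readings per day
% t is 1 x 262: binary target, 1 = cliff failure, 0 = no failure
net = fitnet(10);                 % single hidden layer, 10 neurons (placeholder size)
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
net = train(net, X, t);           % default training, no class weighting
y = net(X);                       % outputs cluster near 0 because zeros dominate t
```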
Any help would be greatly appreciated.