How does dlgradient handle two output layers?
I have a deep learning image analysis network with a custom training loop and loss function. In this loss function I take the output layer's output and the second-to-last layer's output from the net, compute the MSE of each (using the built-in mse function), and sum the results. My output layer is a custom layer, so I have control over its backward function, but I cannot see the automatic backward pass in the other layers.
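Roughly, my loss function looks like the sketch below (the layer names, targets, and variable names are placeholders, not my actual code):

function [loss, gradients] = modelLoss(net, X, TFinal, TPenultimate)
    % One forward pass returns both the custom output layer's output and
    % the second-to-last layer's output (layer names are placeholders)
    [YFinal, YPenultimate] = forward(net, X, ...
        'Outputs', ["customOutputLayer", "penultimateLayer"]);

    % Single scalar loss: sum of the two mean-squared-error terms
    loss = mse(YFinal, TFinal) + mse(YPenultimate, TPenultimate);

    % One dlgradient call on the summed loss, taken with respect to all
    % learnable parameters of the network
    gradients = dlgradient(loss, net.Learnables);
end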
When I use dlgradient on this combined loss, how does MATLAB handle the two simultaneous rounds of differentiation? Does it sum the gradients for each learnable, or does one gradient calculation override the other?
I have tried both Adam and SGDM methods of updating the gradients, but since my network is not learning as expected, I suspect I need to adjust the backward function in my custom layer to ensure proper gradient backpropagation.
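The corresponding update step in my training loop looks something like this (Adam variant shown; the SGDM version swaps in sgdmupdate), again with placeholder variable names:

% Evaluate the loss and gradients inside dlfeval so that dlgradient
% can trace the computation for automatic differentiation
[loss, gradients] = dlfeval(@modelLoss, net, X, TFinal, TPenultimate);

% Adam update of all learnables using the gradients from the summed loss
[net, averageGrad, averageSqGrad] = adamupdate(net, gradients, ...
    averageGrad, averageSqGrad, iteration);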