How to get a dlgradient result for a blackbox function
I’m working on a tomography reconstruction code using a neural representation, which I’d like to train. After some debugging I discovered that:
(1) you can’t pass dlarrays as input to the radon() function;
(2) dlgradient() will not work with untraced variables, so every variable in the loss computation must be a traced dlarray;
(3) converting the input of radon() to double with extractdata() breaks the trace, so dlgradient() subsequently fails.
So now I have a conundrum. I need to compute that gradient, and dlgradient() won’t do the job. What’s the best alternative that still produces gradients in the format adamupdate() requires?
My loss function is shown below. As written, it fails at the Sinogram = radon(…) line because Intensity is a dlarray. As I said above, if you fix that by extracting the data, then dlgradient() fails instead.
Thoughts would be appreciated.
function [loss, gradients] = modelLoss(net, XY, Mask, SinogramRef, theta)
    Intensity = predict(net, XY);
    Intensity = reshape(Intensity, size(Mask));
    Intensity(Mask) = 0;
    Sinogram = radon(Intensity, theta);   % fails here: radon() does not accept dlarray input
    Sinogram = dlarray(Sinogram);
    loss = sum((Sinogram(~Mask) - SinogramRef(~Mask)).^2);
    loss = loss / sum(~Mask(:));
    gradients = dlgradient(loss, net.Learnables);
end
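For illustration, here is one possible workaround sketch (an assumption on my part, not a tested solution): radon() is linear in its image argument, so it could be precomputed once, outside the traced code, as an explicit matrix A by probing radon() with unit basis images. Inside modelLoss, radon(Intensity, theta) would then be replaced by A * Intensity(:); dlgradient() can trace matrix multiplication, so the derivative chain stays intact. The function name buildRadonMatrix is hypothetical:

```matlab
function A = buildRadonMatrix(imgSize, theta)
% Sketch: build an explicit matrix representation of the Radon transform
% by probing radon() with one unit basis image per pixel. Done once,
% outside the traced loss function, so no dlarrays are involved here.
    probe = zeros(imgSize);
    probe(1) = 1;
    s = radon(probe, theta);           % probe once to learn the sinogram size
    A = zeros(numel(s), prod(imgSize));
    probe(1) = 0;
    for k = 1:prod(imgSize)
        probe(k) = 1;                  % k-th unit basis image
        s = radon(probe, theta);
        A(:, k) = s(:);                % k-th column = sinogram response to that pixel
        probe(k) = 0;
    end
end
```

In modelLoss, A would be passed in as a constant and the radon call replaced by something like Sinogram = reshape(A * Intensity(:), sinogramSize). Note the build costs one radon() call per pixel, so this is only practical for small images; for realistic sizes a sparse system matrix from a dedicated tomography toolbox would scale much better.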
deep learning, neural network, radon
MATLAB Answers — New Questions