How to get a dlgradient result for a blackbox function

PuTI / 2025-01-18
Matlab News

I’m working on a tomography reconstruction code using a neural representation, which I’d like to train. After some debugging I discovered that:
(1) you can’t use dlarrays as input to the radon() function;
(2) dlgradient will not work with untraced variables, so all variables must be dlarrays;
(3) using extractdata() to convert the input of radon() to double also fails, because dlgradient then loses its trace (a minimal illustration follows this list).
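
For reference, a minimal sketch of point (3) that is not part of the original post: extractdata() converts a dlarray back to a plain double, so anything computed from that double has no trace back to the inputs and dlgradient() errors out.

x = dlarray(2.0);
g = dlfeval(@(v) dlgradient(sum(v.^2), v), x);   % traced: returns 4
% The variant below errors, because sum(extractdata(v).^2) is a plain
% double with no trace back to v:
% g = dlfeval(@(v) dlgradient(sum(extractdata(v).^2), v), x);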

So now I have a conundrum. I need to compute that gradient, but dlgradient() won’t do the job on its own. What’s the best alternative way to compute it while keeping the format that adamupdate() requires?

Below is what my loss function looks like. As written, it fails at the Sinogram = radon(…) line, because Intensity is a dlarray. As I said above, if you fix that by extracting the data, then dlgradient() fails instead.

Thoughts would be appreciated.

function [loss, gradients] = modelLoss(net, XY, Mask, SinogramRef, theta)
Intensity = predict(net,XY);
Intensity = reshape(Intensity, size(Mask));
Intensity(Mask)=0;

Sinogram=radon(Intensity, theta);
Sinogram=dlarray(Sinogram);

loss = sum((Sinogram(~Mask)-SinogramRef(~Mask)).^2);
loss = loss / sum(~Mask(:));

gradients = dlgradient(loss,net.Learnables);
end
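
One possible workaround, sketched below rather than taken from the original post: the Radon transform is linear in the image, so for a fixed image size and angle vector it can be precomputed once as an explicit matrix A whose product A * image(:) reproduces the flattened output of radon(image, theta). A plain matrix product is supported for dlarray operands, so dlgradient can trace through it, and the gradients come back in the same Learnables table format that adamupdate() expects. The helper names buildRadonMatrix and modelLossTraced are illustrative, and the column-by-column construction is only practical for modest image sizes.

function A = buildRadonMatrix(imageSize, theta)
% Sketch: build the Radon transform as an explicit matrix by probing it
% with single-pixel basis images (feasible for small images only).
e = zeros(imageSize);
e(1) = 1;
s0 = radon(e, theta);                       % probe once to get the sinogram size
e(1) = 0;
A = zeros(numel(s0), prod(imageSize));
for k = 1:prod(imageSize)
    e(k) = 1;
    s = radon(e, theta);                    % response to pixel k alone
    A(:, k) = s(:);
    e(k) = 0;
end
end

function [loss, gradients] = modelLossTraced(net, XY, Mask, SinogramRef, A)
% Same structure as the loss in the question, but radon() is replaced by
% the precomputed matrix A, so every operation stays on traced dlarrays.
% Note: the original post masked the sinogram comparison with the
% image-domain mask; here the whole sinogram is compared for simplicity.
Intensity = predict(net, XY);
Intensity = reshape(Intensity, size(Mask));
Intensity = Intensity .* double(~Mask);     % zero masked pixels without indexed assignment
Sinogram  = A * Intensity(:);               % traceable linear Radon operator
residual  = Sinogram - SinogramRef(:);
loss      = sum(residual.^2) / numel(residual);
gradients = dlgradient(loss, net.Learnables);   % table format adamupdate() expects
end

A training iteration would then follow the usual custom-loop pattern, for example:

A = buildRadonMatrix(size(Mask), theta);
[loss, gradients] = dlfeval(@modelLossTraced, net, XY, Mask, SinogramRef, A);
[net, avgGrad, avgSqGrad] = adamupdate(net, gradients, avgGrad, avgSqGrad, iteration);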

deep learning, neural network, radon
MATLAB Answers — New Questions

Tags: matlab
