Ensuring Non-Negativity and Constraints Satisfaction in Regression with ReLU in MATLAB
Hello, MathWorks community,
I am currently working on a deep learning neural network for a regression problem using MATLAB R2023a. My dataset consists of three inputs and four outputs. The data has been split into training (80%) and testing (20%) subsets.
I am using the ReLU activation function in my network, and training is performed with the trainNetwork function. For predictions, I use the predict function to generate the four outputs from the test data.
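For reference, a minimal sketch of the setup described above (the layer sizes, training options, and variable names XTrain/YTrain/XTest are placeholders, not my actual configuration):

```matlab
% Placeholder network: 3 inputs -> hidden ReLU layer -> 4 regression outputs
layers = [
    featureInputLayer(3)
    fullyConnectedLayer(32)
    reluLayer
    fullyConnectedLayer(4)
    regressionLayer];

options = trainingOptions("adam", MaxEpochs=100, Verbose=false);

% XTrain: N-by-3 inputs, YTrain: N-by-4 targets
net = trainNetwork(XTrain, YTrain, layers, options);
YPred = predict(net, XTest);   % N-by-4 predicted outputs
```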
Two points should be noted:
The actual (target) outputs for both training and testing datasets are always non-negative (i.e., ≥0).
A specific relationship must hold between the inputs and actual outputs for every instance in the dataset, given as:
(Input #3 + Output #1 + Output #4) − (Input #1 + Input #2 + Output #2 + Output #3) = 0 (think of it as a balance constraint)
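As a sanity check, the balance constraint can be verified on the dataset itself (assuming X is an N-by-3 input matrix and Y an N-by-4 target matrix, with columns ordered as numbered above):

```matlab
% Residual of the balance constraint for every instance (should be ~0)
residual = (X(:,3) + Y(:,1) + Y(:,4)) ...
         - (X(:,1) + X(:,2) + Y(:,2) + Y(:,3));

assert(all(abs(residual) < 1e-9), "Balance constraint violated in the data")
```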
My questions are:
How can I ensure that the predicted outputs generated by the neural network are always non-negative?
How can I guarantee that the predicted outputs strictly satisfy the above relationship with the inputs?
Any suggestions, insights, or example implementations would be greatly appreciated.
Thank you in advance for your help!
Best regards,