Zero mean intensity normalization #172

Open
MarkusDrange opened this issue Apr 28, 2023 · 7 comments

@MarkusDrange

As a pre-processing step, I do zero-mean intensity normalization over all three RGB channels, with the same mean and standard deviation for all patches:

    from torchvision import transforms

    # Channel-wise (RGB) mean and std computed over the dataset
    ADP_MEAN = [0.81233799, 0.64032477, 0.81902153]
    ADP_STD = [0.18129702, 0.25731668, 0.16800649]

    [...]
    transforms.Normalize(ADP_MEAN, ADP_STD)

I understand this is not supported in FAST-Pathology pre-processing; would it be possible to add this functionality?

@andreped
Contributor

AFAIK, the NeuralNetwork PO has a setMeanAndStandardDeviation method, but it is not possible to set the mean and std values through an Attribute (see here).

It also looks like the mean and std values are single floats, so it does not support multi-channel input, i.e., applying a different mean and std to each channel individually.
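For illustration, a minimal sketch of what the current single-float API looks like from the Python side (the model path and the exact create() arguments are assumptions, not taken from FAST's docs):

    import fast

    # Placeholder model path; the create() arguments are assumed here
    nn = fast.NeuralNetwork.create("model.onnx")

    # Single floats only: the same mean and std are applied to every
    # channel, so per-channel RGB normalization is not expressible
    nn.setMeanAndStandardDeviation(0.75, 0.20)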

@smistad
Owner

smistad commented Apr 28, 2023

Multi-channel values are not supported at the moment.
But if the values are fixed, you can just add the normalization as a layer to your model, so that the first layer in your network performs this normalization on the input image before passing it on to the rest of the layers.
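For example, a minimal PyTorch sketch of that idea (the wrapper class and names are illustrative, not part of FAST or this issue):

    import torch
    import torch.nn as nn

    class NormalizedBackbone(nn.Module):
        """Prepends fixed per-channel normalization to an existing backbone."""
        def __init__(self, backbone, mean, std):
            super().__init__()
            self.backbone = backbone
            # Buffers are saved with the model, so the constants travel
            # along when the model is exported (e.g. to ONNX)
            self.register_buffer("mean", torch.tensor(mean).view(1, -1, 1, 1))
            self.register_buffer("std", torch.tensor(std).view(1, -1, 1, 1))

        def forward(self, x):
            # x is assumed to be a float tensor scaled to [0, 1]
            x = (x - self.mean) / self.std
            return self.backbone(x)

The exported model then accepts raw [0, 1] intensities and performs the normalization itself, so no per-channel support is needed on the FAST side.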

@smistad
Owner

smistad commented May 3, 2023

This functionality can be added to the already existing ZeroMeanUnitVariance PO: https://github.com/smistad/FAST/blob/master/source/FAST/Algorithms/IntensityNormalization/ZeroMeanUnitVariance.hpp

@andreped
Contributor

andreped commented May 3, 2023

> This functionality can be added to the already existing ZeroMeanUnitVariance PO: https://github.com/smistad/FAST/blob/master/source/FAST/Algorithms/IntensityNormalization/ZeroMeanUnitVariance.hpp

Yes, I saw this one, but I believe it has the same problem as the NeuralNetwork PO: it is only designed for single-channel images, as per this. One would also need to add support for setting the mean and std, which the NeuralNetwork PO already supports, so wouldn't it make more sense to add it to the NeuralNetwork PO? Or both?

@andreped
Contributor

andreped commented May 3, 2023

> But if the values are fixed, you can just add the normalization as a layer to your model, so that the first layer in your network performs this normalization on the input image before passing it on to the rest of the layers.

This I have done for TF models before. Works wonders.
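In Keras this can be done with the built-in Normalization layer, which takes fixed statistics directly (a sketch using the values from this issue; note it expects variance rather than std, and `backbone` stands in for an existing Keras model):

    import numpy as np
    import tensorflow as tf

    ADP_MEAN = [0.81233799, 0.64032477, 0.81902153]
    ADP_STD = [0.18129702, 0.25731668, 0.16800649]

    # Statistics are fixed, so no adapt() call is needed
    norm = tf.keras.layers.Normalization(
        axis=-1, mean=ADP_MEAN, variance=np.square(ADP_STD))

    model = tf.keras.Sequential([norm, backbone])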

Have you tried the same in PyTorch, @MarkusDrange? Did it resolve the issue?

@MarkusDrange
Author

Yes! Worked perfectly after adding it to the forward function of my backbone.

@smistad
Owner

smistad commented May 3, 2023

> > This functionality can be added to the already existing ZeroMeanUnitVariance PO: https://github.com/smistad/FAST/blob/master/source/FAST/Algorithms/IntensityNormalization/ZeroMeanUnitVariance.hpp
>
> Yes, I saw this one, but I believe it has the same problem as the NeuralNetwork PO: it is only designed for single-channel images, as per this. One would also need to add support for setting the mean and std, which the NeuralNetwork PO already supports, so wouldn't it make more sense to add it to the NeuralNetwork PO? Or both?

I meant that this functionality can be implemented and added to the ZeroMeanUnitVariance PO.
