
About the file size? #13

Closed
ZhiyeTang opened this issue Oct 14, 2024 · 6 comments

Comments

ZhiyeTang commented Oct 14, 2024

I ran your code on the BungeeNeRF dataset, and the resulting PLY files are hundreds of MB. For instance, point_cloud_quantised_half.ply for the Amsterdam scene is 367 MB, and codebook.pt is 331.86 MB. Judging by other methods, a model trained on BungeeNeRF should not be 10× larger than one trained on MipNeRF360 (reported as 29 MB in your manuscript). Have you tested your method on BungeeNeRF? Am I making a mistake in calculating the file size? How do you calculate the file sizes reported in your manuscript?
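
For reference, here is how I measured those sizes (a minimal sketch; the paths are from my run, and the iteration_30000 checkpoint directory is an assumption based on the default training length):

# Sketch: print the two file sizes in MB.
# Paths are from my run; adjust the -m output directory and iteration as needed.
model=outputs/BungeeNeRF/amsterdam
for f in point_cloud_quantised_half.ply codebook.pt; do
    path="$model/point_cloud/iteration_30000/$f"
    size=$(stat -c %s "$path")   # GNU stat; use `stat -f %z` on macOS
    awk -v s="$size" -v p="$path" 'BEGIN { printf "%s: %.2f MB\n", p, s / (1024 * 1024) }'
done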

PanagiotisP (Collaborator) commented Oct 14, 2024

Hello!
Unfortunately, I haven't tested the method on this dataset, but I expect it to work. Have you passed the necessary command-line arguments? Running with the default arguments trains the baseline (I might need to change that). Once that is resolved, the file should have the final, reduced size (the sizes we report in the paper).
This may be related to the now-resolved issue #10.

ZhiyeTang commented Oct 15, 2024

I re-ran your commands exactly as in the script full_eval.py, and the results are as follows:

File sizes (MB):

scene        high variant    low variant    full final
amsterdam    134.39          324.14         59.05
bilbao       117.96          296.14         58.09
hollywood    132.29          351.27         71.94
pompidou
quebec       119.09          311.08         50.40
rome         139.08          362.30         64.57

The file sizes I recorded are for point_cloud_quantised_half.ply; is this the correct way to measure your method?

The commands I used are:

# For the low variant
for scene in "${scenes[@]}"
do
    python train.py -s /data/share/NVS-Datasets/BungeeNeRF/$scene -m outputs/BungeeNeRF/$scene/low --store_grads --lambda_sh_sparsity 0.01 --cull_SH 15000 --std_threshold=0.01 --cdist_threshold=1 --mercy_type=redundancy_opacity_opacity --eval
done

# For the full final variant
for scene in "${scenes[@]}"
do
    python train.py -s /data/share/NVS-Datasets/BungeeNeRF/$scene -m outputs/BungeeNeRF/$scene/final --store_grads --lambda_sh_sparsity=0.1 --cull_SH 15000 --std_threshold=0.04 --mercy_points --prune_dead_points --lambda_alpha_regul=0.001 --cdist_threshold=6 --mercy_type=redundancy_opacity_opacity --eval
done

# For the high variant
for scene in "${scenes[@]}"
do
    python train.py -s /data/share/NVS-Datasets/BungeeNeRF/$scene -m outputs/BungeeNeRF/$scene/high --store_grads --lambda_sh_sparsity 0.01 --cull_SH 15000 --std_threshold=0.06 --cdist_threshold=8 --mercy_type=redundancy_opacity_opacity --eval
done

Note that training on the Pompidou scene crashed with a CUDA out-of-memory error, so I left its row blank.
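
The sizes in the table were gathered with a loop along these lines (a sketch; the output layout mirrors the -m paths in the commands above, and the iteration_30000 checkpoint directory is again an assumption):

scenes=(amsterdam bilbao hollywood pompidou quebec rome)

# Sketch: collect point_cloud_quantised_half.ply sizes per scene and variant.
for scene in "${scenes[@]}"
do
    for variant in high low final
    do
        ply=outputs/BungeeNeRF/$scene/$variant/point_cloud/iteration_30000/point_cloud_quantised_half.ply
        if [ -f "$ply" ]; then
            size_mb=$(awk -v s="$(stat -c %s "$ply")" 'BEGIN { printf "%.2f", s / (1024 * 1024) }')
            echo "$scene ($variant): $size_mb MB"
        else
            echo "$scene ($variant): no output"   # e.g. pompidou, which crashed (CUDA OOM)
        fi
    done
done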

PanagiotisP (Collaborator) commented:

This seems correct. How do the full final results compare to the original? By the way, codebook.pt is not needed, as its information is stored in the .ply. So yes, in the end you only care about the size of point_cloud_quantised_half.ply.

ZhiyeTang (Author) commented:

What do you mean by "compared to the original"? The original file size of point_cloud.ply?

PanagiotisP (Collaborator) commented:

On the one hand, yes, since point_cloud.ply doesn't contain the quantisation, but I'm mainly referring to training with the baseline's arguments, that is, original 3DGS. In your first comment you said the Amsterdam scene was 367 MB, while in the table after that it is 59.05 MB. Was that 367 MB the quantised baseline (default arguments, i.e. original 3DGS)?

ZhiyeTang (Author) commented:

In the first comment, I ran your code without any extra arguments, as a plain python train.py, and the 367 MB refers to the file size of the generated point_cloud_quantised_half.ply. If I understand correctly, it is exactly as you said.
