
training stuck at Renderer #45

Open
YESAndy opened this issue Nov 21, 2024 · 1 comment


YESAndy commented Nov 21, 2024

Hi,

Thank you very much for releasing your work! I was trying to rerun the GS training, but it got stuck for a long time (~30 min) when running

```python
im, radius, _, = Renderer(raster_settings=curr_data['cam'])(**rendervar)
```

in the `get_loss` function in train.py.

I suspected it might be caused by a CUDA installation/configuration problem, so I tried reinstalling the diff-gaussian-rasterization-w-depth package with CUDA_HOME=/path/to/cuda. The problem still persisted.

I checked the source code of the Renderer, which is generated through Python bindings from C++. I have no idea how to debug C++ ;)

Do you have any suggestions about why it gets stuck? Thank you!
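
A minimal, repo-independent sanity check like the one below might help narrow down whether the hang is specific to the rasterizer or affects any CUDA kernel. This is only a sketch using plain PyTorch; nothing here comes from this repository:

```python
import torch

# Plain-PyTorch sanity check, independent of the rasterizer: if even this
# simple kernel hangs, the problem is in the CUDA/driver/torch setup,
# not in the repo's code.
print("CUDA available:", torch.cuda.is_available())

if torch.cuda.is_available():
    x = torch.randn(1024, 1024, device="cuda")
    y = x @ x                   # launches a simple CUDA kernel
    torch.cuda.synchronize()    # forces the kernel to actually execute
    print("matmul ok:", tuple(y.shape))
```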


YESAndy commented Nov 21, 2024

Problem solved. It was caused by a mismatch between the NVIDIA driver and the PyTorch version. :)
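
For anyone else who hits this: one quick way to spot such a mismatch is to compare the CUDA version the installed PyTorch build expects against the driver version the system reports. This is just a sketch; the nvidia-smi query assumes the driver's command-line tool is on PATH:

```python
import subprocess
import torch

# CUDA runtime version this PyTorch build was compiled against.
print("torch:", torch.__version__)
print("torch built with CUDA:", torch.version.cuda)

# Driver version as reported by nvidia-smi; the installed driver must support
# at least the CUDA version above, otherwise kernel launches can hang or fail.
driver = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    capture_output=True, text=True,
).stdout.strip()
print("nvidia driver:", driver)
```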
