Could there be a Large Pretrained Model? #86
Hi, nv-tlabs. This work is excellent, and I would like to know whether your team could release a large public pre-trained model covering as many general 3D object categories as possible (as large as possible). Is it possible?

Comments

May I ask: since the repo mentions that at least 16GB of memory is essential for pretraining, does that correspond to only one kind of object, such as chairs?

Hi @Dandelionym, thanks for your interest in our work! The 16GB of memory is for the default training size, and it can be applied to either one category or multiple categories (the model itself is agnostic to what the dataset is). However, you might need to increase the batch size if the number of categories is large, which would require more memory. Regarding the release of a public pre-trained model, there are many dataset license issues we need to figure out, and training the model also takes resources; we hope to release it but can't promise that.

Hi @SteveJunGao
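The maintainer's point above, that memory grows with batch size, can be illustrated with a back-of-the-envelope estimate. This is a generic sketch, not code from the GET3D repo: the `fixed_gb` and `per_sample_gb` numbers are made-up placeholders for the fixed model/optimizer cost and the per-sample activation cost, not measured figures.

```python
# Rough GPU-memory model for training: a fixed cost (weights, optimizer
# state) plus an activation cost that grows linearly with batch size.
# NOTE: fixed_gb and per_sample_gb are illustrative placeholders, not
# measured numbers from the GET3D codebase.

def estimated_memory_gb(batch_size: int,
                        fixed_gb: float = 6.0,
                        per_sample_gb: float = 0.3) -> float:
    """Return a rough GPU-memory estimate for a given batch size."""
    return fixed_gb + per_sample_gb * batch_size


def max_batch_size(budget_gb: float,
                   fixed_gb: float = 6.0,
                   per_sample_gb: float = 0.3) -> int:
    """Largest batch size that fits in a given memory budget."""
    return max(0, int((budget_gb - fixed_gb) / per_sample_gb))


if __name__ == "__main__":
    # Under these toy numbers, a 16 GB card fits a batch of:
    print(max_batch_size(16.0))
```

The takeaway matches the reply above: if you grow the batch size to cover more categories, the activation term dominates and you need a larger memory budget.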