Describe the bug
When trying to export a model larger than 2 GiB, ONNX throws the following error:
RuntimeError: The serialized model is larger than the 2GiB limit imposed by the protobuf library. Therefore the output file must be a file path, so that the ONNX external data can be written to the same directory. Please specify the output file name.
File <command-4284593085184757>, line 1
----> 1 engine.export(model=model, export_root=TMP, export_type=ExportType.ONNX, transform=transform)
where model is a Padim with image size 512x512 and a wide_resnet50_2 backbone with default parameters (layers and features). Note that TMP is already a path to a directory, not a path to an .onnx file.
Large ONNX models are exported with external data (that is, split across multiple files, rather than everything being included in the .onnx file), as specified here. And, indeed, torch.onnx.export accepts an external_data argument. That argument is not exposed in Anomalib, since the to_onnx() method defined in PyTorch Lightning, which accepts **kwargs passed to torch.onnx.export, is overridden by the Anomalib ExportMixin (used in the base AnomalyModule).
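For reference, this is roughly what the external-data mechanism looks like on the onnx side (a sketch using onnx.external_data_helper; the file names are placeholders, and it assumes a model proto that still fits in memory):

```python
import onnx
from onnx.external_data_helper import convert_model_to_external_data

# Rewrite the model's weights as external data so the .onnx file itself
# stays under the 2 GiB protobuf limit.
model_proto = onnx.load("model.onnx")  # placeholder path
convert_model_to_external_data(
    model_proto,
    all_tensors_to_one_file=True,  # one side file instead of one per tensor
    location="model.onnx.data",    # written next to the .onnx file on save
    size_threshold=1024,           # only externalize tensors larger than 1 KiB
)
onnx.save_model(model_proto, "model_external.onnx")
```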
My guess is that exposing **kwargs to be passed to torch.onnx.export would solve the issue and allow exporting large ONNX models.
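A minimal sketch of what that could look like, written as a standalone helper rather than Anomalib's actual ExportMixin (the export_onnx name and its defaults are hypothetical):

```python
from pathlib import Path

import torch


def export_onnx(
    model: torch.nn.Module,
    export_root: str,
    input_size: tuple[int, ...] = (1, 3, 512, 512),
    **onnx_kwargs,
) -> Path:
    """Export `model` to ONNX, forwarding extra kwargs to torch.onnx.export."""
    root = Path(export_root)
    root.mkdir(parents=True, exist_ok=True)
    onnx_path = root / "model.onnx"
    torch.onnx.export(
        model,
        torch.zeros(input_size),  # dummy input for tracing
        str(onnx_path),
        input_names=["input"],
        output_names=["output"],
        **onnx_kwargs,            # e.g. external_data=True for large models
    )
    return onnx_path
```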
Dataset
Other (please specify in the text field below)
Model
PADiM
Steps to reproduce the behavior
Train a large enough Padim model on any dataset, for example:
image size 512x512
backbone: wide_resnet50_2 with default layers and features
Try to export the model to ONNX with engine.export(model=model, export_root=TMP, export_type=ExportType.ONNX, transform=transform), where TMP is the path to a local directory (see the sketch after this list).
Any other relevant information: training on a custom dataset, but any dataset with image size 512x512 and the specified model parameters should be enough to reproduce the error.
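A hedged end-to-end sketch of these steps, assuming the anomalib v1 API used in the snippet above (the fit call is commented out since any 512x512 dataset will do; /tmp/padim is a placeholder):

```python
from anomalib.deploy import ExportType
from anomalib.engine import Engine
from anomalib.models import Padim

# Padim with a wide backbone: at 512x512 inputs the stored Gaussian
# statistics push the serialized model past the 2 GiB protobuf limit.
model = Padim(backbone="wide_resnet50_2")
engine = Engine()
# engine.fit(model=model, datamodule=...)  # train on any 512x512 dataset
engine.export(model=model, export_root="/tmp/padim", export_type=ExportType.ONNX)
```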
Expected behavior
The methods engine.export() and/or model.to_onnx() should accept extra kwargs to be passed on to torch.onnx.export(), in order to simplify the export process for large models.
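For example, a call along these lines (hypothetical; the external_data kwarg stands in for whatever torch.onnx.export accepts and is not current Anomalib API):

```python
engine.export(
    model=model,
    export_root=TMP,
    export_type=ExportType.ONNX,
    transform=transform,
    external_data=True,  # hypothetical passthrough to torch.onnx.export
)
```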
OS information
OS information:
Screenshots
Pip/GitHub
pip
What version/branch did you use?
No response
Configuration YAML
No configuration used
Logs
Code of Conduct