[WIP: Feb2023] Implement Torch.ONNX #526
-
[Feb2022] A recent model.load issue suggests how the TorchSharp community could coordinate to eventually make loading ONNX feasible in TorchSharp. Any comments or suggestions on how relevant this feature is for the TorchSharp community are welcome.

This discussion could also feed into the plan for Torch integration into ML.NET: would a TorchSharp.ONNX feature be redundant compared to ML.NET's existing ability to load ONNX models for the .NET community? @michaelgsharp: since this integration is currently being planned, I am posting here to gather more feedback for the TorchSharp-to-ML.NET integration team.
-
Discussion on implementing TorchScript-to-ONNX conversion in TorchSharp.
-
Not sure whether it is better to leave a comment here or in the other thread, but this thread's topic definitely fits better. I'm trying to find out how to export models trained with TorchSharp to ONNX. There are a few mentions out there that work on this is in progress, most of them dating back to around mid-2022. Does anyone know the status of TorchSharp export to ONNX?

My question is partly related to ML.NET, which presumably can already load and run ONNX models. As I understand it, there is also other in-progress work to bring TorchSharp functionality into ML.NET, which again looks incomplete so far. What I was thinking of doing in the meantime is using TorchSharp to train my model, save/export it as ONNX, then load and run it in ML.NET. Maybe somebody can suggest how to do that. So far I can train a model with TorchSharp and save it, but that file format is specific to TorchSharp/PyTorch. How do I use it in ML.NET then?
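Editor's note: a minimal sketch of the ML.NET side only, assuming the model has already been exported to an ONNX file (e.g. from PyTorch, since TorchSharp itself cannot export ONNX yet). The file name model.onnx, the input column "input" (float[4]) and the output column "output" (float[3]) are assumptions and must match the names/shapes passed to torch.onnx.export. Requires the Microsoft.ML.OnnxTransformer and Microsoft.ML.OnnxRuntime packages.

using System;
using Microsoft.ML;
using Microsoft.ML.Data;

// Input/output schemas; column names must match the ONNX graph's input/output names (assumed here).
public class OnnxInput
{
    [VectorType(4), ColumnName("input")]
    public float[] Features { get; set; }
}

public class OnnxOutput
{
    [VectorType(3), ColumnName("output")]
    public float[] Scores { get; set; }
}

class Program
{
    static void Main()
    {
        var mlContext = new MLContext();

        // Pipeline that scores the pre-exported ONNX file.
        var pipeline = mlContext.Transforms.ApplyOnnxModel(
            outputColumnNames: new[] { "output" },
            inputColumnNames: new[] { "input" },
            modelFile: "model.onnx");

        // Fit on an empty data view just to materialize the transformer.
        var empty = mlContext.Data.LoadFromEnumerable(Array.Empty<OnnxInput>());
        var transformer = pipeline.Fit(empty);

        // Score a single example and print the raw outputs.
        var engine = mlContext.Model.CreatePredictionEngine<OnnxInput, OnnxOutput>(transformer);
        var result = engine.Predict(new OnnxInput { Features = new[] { 1f, 2f, 3f, 4f } });
        Console.WriteLine(string.Join(", ", result.Scores));
    }
}

Note that this only covers scoring an existing ONNX file in ML.NET; it does not solve the TorchSharp-to-ONNX export gap discussed above.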
-
One of the main motivations of TorchSharp, since @NiklasGustafsson actively took over, has always been to keep deep machine learning work within .NET. That motivation is no longer in doubt, but the library needs further development, including, eventually and hopefully, more PyTorch features such as ONNX support and custom operators. KEEP REQUESTING! The Microsoft team decides based on how strong the demand for a particular request is.
-
Export to ONNX is not currently a priority for the TorchSharp team at Microsoft (i.e. me :-)), as you can tell from the lack of activity on that front. At the moment, we are studying how (and whether) we can accommodate PyTorch 2.0. Adding @luisquintanilla for the ML.NET perspective. Contributions are, however, always welcome.
-
Example: export a PyTorch model to ONNX, then import it in .NET. The C# further below is only a sketch of what an ONNX importer in TorchSharp could look like; no such API exists in TorchSharp today.
import torch
import torch.nn as nn

# Define the neural network
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 10)
        self.fc2 = nn.Linear(10, 3)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        x = self.fc2(x)
        return x

# Create an instance of the neural network
model = Net()

# Export the (untrained) model to ONNX, tracing it with a dummy input;
# explicit input/output names make the graph easier to consume from .NET
dummy_input = torch.randn(1, 4)
torch.onnx.export(model, dummy_input, "model.onnx",
                  input_names=["input"], output_names=["output"])
// NOTE: a hypothetical sketch of what loading ONNX in TorchSharp could look like.
// TorchSharp does not currently provide an importer such as TorchModel.Load for ONNX
// files; see the ONNX Runtime example below for a way that works in .NET today.
using System;
using TorchSharp;   // actual TorchSharp namespace

namespace LoadONNXModel
{
    class Program
    {
        static void Main(string[] args)
        {
            // Load the ONNX model (hypothetical API, does not exist yet)
            var model = TorchModel.Load("model.onnx");

            // Run a sample 1x4 input through the model (hypothetical forward call)
            var input = torch.from_array(new float[,] { { 1f, 2f, 3f, 4f } });
            var output = model.forward(input);

            // Display the output tensor
            Console.WriteLine(output);
        }
    }
}
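In the meantime, a model exported to ONNX from Python as above can be run in .NET with the Microsoft.ML.OnnxRuntime package rather than through TorchSharp. A minimal sketch, assuming the export used input_names=["input"] and the file name model.onnx (both assumptions carried over from the Python snippet); names such as RunOnnxModel are illustrative only.

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

namespace RunOnnxModel
{
    class Program
    {
        static void Main(string[] args)
        {
            // Load the exported ONNX graph with ONNX Runtime (not TorchSharp)
            using var session = new InferenceSession("model.onnx");

            // Build a 1x4 float tensor matching the dummy input used at export time
            var input = new DenseTensor<float>(new[] { 1f, 2f, 3f, 4f }, new[] { 1, 4 });
            var inputs = new List<NamedOnnxValue>
            {
                NamedOnnxValue.CreateFromTensor("input", input)
            };

            // Run inference and print the raw output values
            using var results = session.Run(inputs);
            var output = results.First().AsEnumerable<float>();
            Console.WriteLine(string.Join(", ", output));
        }
    }
}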
-
[Sep2022]
Long-term feature request
  Motivation: impact on deep AI (NLP/computer vision) experience using ML.NET
    - use case 1
    - use case 2
Mid-term feature request