"rainy day" experiment to explore humanoid-related mapping features
biped humanoid metadata can be found in modern formats like VRM as well as in several legacy systems.
this experiment looks across three types of seemingly useful metadata:
bone name mappings (arbitrary => Humanoid)
blendshape mappings (arbitrary => "vrm-like?")
mesh subgroups (giving a name to a set of meshes; VRM-like first-person culling comes to mind, though not only that)
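for illustration, the first kind of mapping (arbitrary bone names => Humanoid) could be sketched as a simple normalize-and-match step. everything below is hypothetical: the pattern table, the canonical names, and the namespace-stripping rule are invented examples, not taken from any spec:

```typescript
// Sketch: normalize arbitrary rig bone names to a canonical humanoid name.
// The pattern table and canonical names below are illustrative only.
const HUMANOID_PATTERNS: Array<[RegExp, string]> = [
  [/hips|pelvis/i, "hips"],
  [/spine/i, "spine"],
  [/(^|[^a-z])head/i, "head"],
  [/left.*upper.*leg|l[_ ]?thigh/i, "leftUpperLeg"],
];

function mapBoneName(raw: string): string | null {
  // Strip common namespace prefixes like "mixamorig:" before matching.
  const name = raw.replace(/^[A-Za-z0-9]+:/, "");
  for (const [pattern, humanoid] of HUMANOID_PATTERNS) {
    if (pattern.test(name)) return humanoid;
  }
  return null; // unmapped bones stay arbitrary
}
```

a real mapping would need a much larger table (and probably per-rig-family presets), but the shape of the data stays the same: arbitrary string in, canonical humanoid name or null out.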
some experimental ROI points:
(or, "what's the point if VRM already does this?")
independently explore/verify notable VRM value propositions at the glTF level.
help provide context to go with discussions around whether or not VRM is relevant.
gather information about potential integration points related to humanoid metadata.
... i.e., discover and document tools that stand to benefit if definitive humanoid metadata is available with an avatar model.
technical strategy
seems like adapting omi-gltf-transform in an experimental branch would provide the quickest framework for testing.
basically, right now this tool does MOZ_* => OMI_* transformations -- the experiment would look at VRM <=> ACME_ transformations.
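as a rough sketch of what one direction of that transform could look like at the plain glTF JSON level: the ACME_humanoid extension name and its layout are hypothetical, invented here for illustration, while the VRMC_vrm humanoid layout follows VRM 1.0 (humanBones is a map of humanoid name => { node }):

```typescript
// Sketch: copy VRM 1.0 humanoid bone bindings into a hypothetical
// ACME_humanoid extension on a plain glTF JSON object.
// "ACME_humanoid" and its layout are invented for this sketch.

interface GltfJson {
  extensions?: Record<string, any>;
  extensionsUsed?: string[];
}

function vrmToAcme(gltf: GltfJson): GltfJson {
  const humanBones = gltf.extensions?.VRMC_vrm?.humanoid?.humanBones;
  if (!humanBones) return gltf; // not a VRM 1.0 file; nothing to do

  // ACME_humanoid (hypothetical): flat map of humanoid name => node index.
  const bones: Record<string, number> = {};
  for (const [name, binding] of Object.entries<any>(humanBones)) {
    bones[name] = binding.node;
  }

  gltf.extensions = { ...gltf.extensions, ACME_humanoid: { bones } };
  gltf.extensionsUsed = [...(gltf.extensionsUsed ?? []), "ACME_humanoid"];
  return gltf;
}
```

the reverse (ACME_ => VRM) direction would walk the same data the other way; doing this inside an actual gltf-transform pipeline would mean wrapping logic like this in the library's transform/extension machinery rather than touching raw JSON.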
... idea spawned from discord #omi-content-portability discussion.