Hello!
Is there a packaged Vim module that can be imported and called directly?
Something like Mamba, for example:

import torch
from mamba_ssm import Mamba

batch, length, dim = 2, 64, 16
x = torch.randn(batch, length, dim).to("cuda")
model = Mamba(
    # This module uses roughly 3 * expand * d_model^2 parameters
    d_model=dim,  # Model dimension d_model
    d_state=16,   # SSM state expansion factor
    d_conv=4,     # Local convolution width
    expand=2,     # Block expansion factor
).to("cuda")
y = model(x)
assert y.shape == x.shape

Thanks!
from timm.models import create_model  # the Vim model names are registered with timm

model = create_model(
    'vim_tiny_patch16_224_bimambav2_final_pool_mean_abs_pos_embed_with_midclstok_div2',
    pretrained=False,
    num_classes=1000,
    drop_rate=0.0,
    drop_path_rate=0.1,
    drop_block_rate=None,
    img_size=224,
)
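A minimal usage sketch for the snippet above, assuming the Vim model definitions have already been imported so timm can resolve the registered name, and assuming the standard 3-channel 224x224 input implied by img_size=224 (batch size and shapes here are assumptions):

import torch

# Continue from the create_model call above: move the model to GPU and
# run a dummy image batch through it.
model = model.to("cuda").eval()
x = torch.randn(2, 3, 224, 224, device="cuda")  # (batch, channels, height, width)
with torch.no_grad():
    logits = model(x)
print(logits.shape)  # expected: torch.Size([2, 1000]) given num_classes=1000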
mahao18cm
Where should this code be called from?
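One possible arrangement, sketched under assumptions: the 'vim_*' model names only become known to timm once the repository's model-definition module has been imported, so the create_model call would live in a script that can import that module first. The module name models_mamba used below is an assumption about this repository's layout, not confirmed here.

import torch
from timm.models import create_model

# Hypothetical: importing the repository's model-definition module (name
# assumed) is what registers the 'vim_*' names with timm, so it must run
# before create_model is called.
import models_mamba  # noqa: F401

model = create_model(
    'vim_tiny_patch16_224_bimambav2_final_pool_mean_abs_pos_embed_with_midclstok_div2',
    pretrained=False,
    num_classes=1000,
    img_size=224,
)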