Error when running with vit_small_reg pre-trained model
RuntimeError: Error(s) in loading state_dict for DinoVisionTransformer:
Missing key(s) in state_dict: "cls_token", "pos_embed", "register_tokens", "mask_token", "patch_embed.proj.weight", "patch_embed.proj.bias", "blocks.0.0.norm1.weight", "blocks.0.0.norm1.bias", "blocks.0.0.attn.qkv.weight", "blocks.0.0.attn.qkv.bias", "blocks.0.0.attn.proj.weight", "blocks.0.0.attn.proj.bias", "blocks.0.0.ls1.gamma", "blocks.0.0.norm2.weight", "blocks.0.0.norm2.bias", "blocks.0.0.mlp.fc1.weight", "blocks.0.0.mlp.fc1.bias", "blocks.0.0.mlp.fc2.weight", "blocks.0.0.mlp.fc2.bias", "blocks.0.0.ls2.gamma", "blocks.0.1.norm1.weight", "blocks.0.1.norm1.bias", "blocks.0.1.attn.qkv.weight", "blocks.0.1.attn.qkv.bias", "blocks.0.1.attn.proj.weight", "blocks.0.1.attn.proj.bias", "blocks.0.1.ls1.gamma", "blocks.0.1.norm2.weight", "blocks.0.1.norm2.bias", "blocks.0.1.mlp.fc1.weight", "blocks.0.1.mlp.fc1.bias", "blocks.0.1.mlp.fc2.weight", "blocks.0.1.mlp.fc2.bias", "blocks.0.1.ls2.gamma", "blocks.0.2.norm1.weight", "blocks.0.2.norm1.bias", "blocks.0.2.attn.qkv.weight", "blocks.0.2.attn.qkv.bias", "blocks.0.2.attn.proj.weight", "blocks.0.2.attn.proj.bias", "blocks.0.2.ls1.gamma", "blocks.0.2.norm2.weight", "blocks.0.2.norm2.bias", "blocks.0.2.mlp.fc1.weight", "blocks.0.2.mlp.fc1.bias", "blocks.0.2.mlp.fc2.weight", "blocks.0.2.mlp.fc2.bias", "blocks.0.2.ls2.gamma", "blocks.0.3.norm1.weight", "blocks.0.3.norm1.bias", "blocks.0.3.attn.qkv.weight", "blocks.0.3.attn.qkv.bias", "blocks.0.3.attn.proj.weight", "blocks.0.3.attn.proj.bias", "blocks.0.3.ls1.gamma", "blocks.0.3.norm2.weight", "blocks.0.3.norm2.bias", "blocks.0.3.mlp.fc1.weight", "blocks.0.3.mlp.fc1.bias", "blocks.0.3.mlp.fc2.weight", "blocks.0.3.mlp.fc2.bias", "blocks.0.3.ls2.gamma", "blocks.0.4.norm1.weight", "blocks.0.4.norm1.bias", "blocks.0.4.attn.qkv.weight", "blocks.0.4.attn.qkv.bias", "blocks.0.4.attn.proj.weight", "blocks.0.4.attn.proj.bias", "blocks.0.4.ls1.gamma", "blocks.0.4.norm2.weight", "blocks.0.4.norm2.bias", "blocks.0.4.mlp.fc1.weight", "blocks.0.4.mlp.fc1.bias", "blocks.0.4.mlp.fc2.weight", "blocks.0.4.mlp.fc2.bias", "blocks.0.4.ls2.gamma", "blocks.0.5.norm1.weight", "blocks.0.5.norm1.bias", "blocks.0.5.attn.qkv.weight", "blocks.0.5.attn.qkv.bias", "blocks.0.5.attn.proj.weight", "blocks.0.5.attn.proj.bias", "blocks.0.5.ls1.gamma", "blocks.0.5.norm2.weight", "blocks.0.5.norm2.bias", "blocks.0.5.mlp.fc1.weight", "blocks.0.5.mlp.fc1.bias", "blocks.0.5.mlp.fc2.weight", "blocks.0.5.mlp.fc2.bias", "blocks.0.5.ls2.gamma", "blocks.0.6.norm1.weight", "blocks.0.6.norm1.bias", "blocks.0.6.attn.qkv.weight", "blocks.0.6.attn.qkv.bias", "blocks.0.6.attn.proj.weight", "blocks.0.6.attn.proj.bias", "blocks.0.6.ls1.gamma", "blocks.0.6.norm2.weight", "blocks.0.6.norm2.bias", "blocks.0.6.mlp.fc1.weight", "blocks.0.6.mlp.fc1.bias", "blocks.0.6.mlp.fc2.weight", "blocks.0.6.mlp.fc2.bias", "blocks.0.6.ls2.gamma", "blocks.0.7.norm1.weight", "blocks.0.7.norm1.bias", "blocks.0.7.attn.qkv.weight", "blocks.0.7.attn.qkv.bias", "blocks.0.7.attn.proj.weight", "blocks.0.7.attn.proj.bias", "blocks.0.7.ls1.gamma", "blocks.0.7.norm2.weight", "blocks.0.7.norm2.bias", "blocks.0.7.mlp.fc1.weight", "blocks.0.7.mlp.fc1.bias", "blocks.0.7.mlp.fc2.weight", "blocks.0.7.mlp.fc2.bias", "blocks.0.7.ls2.gamma", "blocks.0.8.norm1.weight", "blocks.0.8.norm1.bias", "blocks.0.8.attn.qkv.weight", "blocks.0.8.attn.qkv.bias", "blocks.0.8.attn.proj.weight", "blocks.0.8.attn.proj.bias", "blocks.0.8.ls1.gamma", "blocks.0.8.norm2.weight", "blocks.0.8.norm2.bias", "blocks.0.8.mlp.fc1.weight", "blocks.0.8.mlp.fc1.bias", "blocks.0.8.mlp.fc2.weight", "blocks.0.8.mlp.fc2.bias", 
"blocks.0.8.ls2.gamma", "blocks.0.9.norm1.weight", "blocks.0.9.norm1.bias", "blocks.0.9.attn.qkv.weight", "blocks.0.9.attn.qkv.bias", "blocks.0.9.attn.proj.weight", "blocks.0.9.attn.proj.bias", "blocks.0.9.ls1.gamma", "blocks.0.9.norm2.weight", "blocks.0.9.norm2.bias", "blocks.0.9.mlp.fc1.weight", "blocks.0.9.mlp.fc1.bias", "blocks.0.9.mlp.fc2.weight", "blocks.0.9.mlp.fc2.bias", "blocks.0.9.ls2.gamma", "blocks.0.10.norm1.weight", "blocks.0.10.norm1.bias", "blocks.0.10.attn.qkv.weight", "blocks.0.10.attn.qkv.bias", "blocks.0.10.attn.proj.weight", "blocks.0.10.attn.proj.bias", "blocks.0.10.ls1.gamma", "blocks.0.10.norm2.weight", "blocks.0.10.norm2.bias", "blocks.0.10.mlp.fc1.weight", "blocks.0.10.mlp.fc1.bias", "blocks.0.10.mlp.fc2.weight", "blocks.0.10.mlp.fc2.bias", "blocks.0.10.ls2.gamma", "blocks.0.11.norm1.weight", "blocks.0.11.norm1.bias", "blocks.0.11.attn.qkv.weight", "blocks.0.11.attn.qkv.bias", "blocks.0.11.attn.proj.weight", "blocks.0.11.attn.proj.bias", "blocks.0.11.ls1.gamma", "blocks.0.11.norm2.weight", "blocks.0.11.norm2.bias", "blocks.0.11.mlp.fc1.weight", "blocks.0.11.mlp.fc1.bias", "blocks.0.11.mlp.fc2.weight", "blocks.0.11.mlp.fc2.bias", "blocks.0.11.ls2.gamma", "norm.weight", "norm.bias".
Unexpected key(s) in state_dict: "model_state_dict".
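Note that the `Unexpected key(s) in state_dict: "model_state_dict"` line indicates the checkpoint file is a wrapper dict whose only top-level key is `model_state_dict` (as produced by e.g. `torch.save({"model_state_dict": model.state_dict(), ...}, path)`), rather than a raw state dict, so the inner dict has to be extracted before calling `load_state_dict`. Below is a minimal sketch of that workaround; the checkpoint path is a placeholder, and it assumes the `dinov2_vits14_reg` torch.hub entry point matches the architecture the weights were trained for:

```python
import torch

# Build the ViT-S/14 with registers backbone via torch.hub
# (assumed here to match the architecture of the vit_small_reg checkpoint).
model = torch.hub.load("facebookresearch/dinov2", "dinov2_vits14_reg", pretrained=False)

# "checkpoint.pth" is a placeholder path for the local pre-trained weights.
checkpoint = torch.load("checkpoint.pth", map_location="cpu")

# The traceback shows the file's only top-level key is "model_state_dict",
# so unwrap it before loading; fall back to the dict itself if it is
# already a raw state dict.
state_dict = checkpoint.get("model_state_dict", checkpoint)

model.load_state_dict(state_dict)
```

If the key names still differ after unwrapping (for instance the chunked `blocks.0.N.*` layout the model expects in the error above versus flat `blocks.N.*` names in the file), the keys may additionally need remapping before `load_state_dict` succeeds.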