TypeError: 'NoneType' object is not callable
final text_encoder_type: bert-base-uncased
Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertModel: ['cls.predictions.bias', 'cls.seq_relationship.bias', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.decoder.weight', 'cls.seq_relationship.weight', 'cls.predictions.transform.dense.weight']
This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Model loaded from /root/.cache/huggingface/hub/models--camenduru--GroundingDINO/snapshots/3d3869e41435a5dc9620b6b9bd37d2abb071e1c1/groundingdino_swint_ogc.pth
=> _IncompatibleKeys(missing_keys=[], unexpected_keys=['label_enc.weight'])
/usr/local/lib/python3.9/dist-packages/transformers/modeling_utils.py:830: FutureWarning: The device argument is deprecated and will be removed in v5 of Transformers.
warnings.warn(
/usr/local/lib/python3.9/dist-packages/torch/utils/checkpoint.py:31: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
warnings.warn("None of the inputs have requires_grad=True. Gradients will be None")
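The `_IncompatibleKeys(missing_keys=[], unexpected_keys=['label_enc.weight'])` line above is what PyTorch's `Module.load_state_dict(..., strict=False)` returns when the checkpoint and the model's parameter names don't match exactly. A minimal sketch of that key comparison, using a hypothetical helper and plain Python sets (no torch required, key names other than `label_enc.weight` are made up for illustration):

```python
# Sketch of the missing/unexpected-keys report that
# torch.nn.Module.load_state_dict(state_dict, strict=False) produces.
def diff_state_dicts(model_keys, checkpoint_keys):
    # Keys the model expects but the checkpoint lacks -> missing_keys
    missing = sorted(set(model_keys) - set(checkpoint_keys))
    # Keys the checkpoint carries but the model has no parameter for
    # -> unexpected_keys
    unexpected = sorted(set(checkpoint_keys) - set(model_keys))
    return missing, unexpected

# Hypothetical parameter names; 'label_enc.weight' is the one from the log.
model_keys = ["backbone.weight", "head.weight"]
ckpt_keys = ["backbone.weight", "head.weight", "label_enc.weight"]

missing, unexpected = diff_state_dicts(model_keys, ckpt_keys)
# missing == [], unexpected == ['label_enc.weight'], matching the log:
# an empty missing_keys list means every model parameter was loaded;
# the extra checkpoint tensor is simply ignored, so this is benign.
```

With `missing_keys` empty, the model loaded completely; the stray `label_enc.weight` is an extra training-time tensor in the checkpoint that the inference model does not use.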