First, let me thank you very much for this excellent resource!
I have a question:
When using the latest git HEAD, I run tdi.tl.get_pretrained_tcr_embedding(...) with the following bert_config:
tdi.model.modeling_bert.get_human_config(bert_type='small', vocab_size=36)
and the checkpoint file human_bert_pseudosequence.tcr_v2.ckpt, I receive the following warning:
Warning: model.bert.embeddings.position_ids not found in the model. Ignoring model.bert.embeddings.position_ids in the provided state dict.
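For context, warnings of this form typically come from a lenient state-dict loader that copies only the checkpoint entries the model actually registers and skips the rest. The sketch below is purely illustrative of that mechanism (the function name and logic are hypothetical, not tdi's actual implementation):

```python
# Hypothetical sketch of lenient state-dict loading that emits this style of
# warning. Not tdi's actual code; names and messages are illustrative only.

def load_state_dict_lenient(model_state, checkpoint_state):
    """Copy checkpoint entries into model_state, skipping unknown keys."""
    skipped = []
    for key, value in checkpoint_state.items():
        if key in model_state:
            model_state[key] = value
        else:
            # Keys such as 'model.bert.embeddings.position_ids' land here when
            # the current model no longer registers them (for example, a buffer
            # that is recomputed at runtime instead of being stored).
            print(f"Warning: {key} not found in the model. "
                  f"Ignoring {key} in the provided state dict.")
            skipped.append(key)
    return skipped

model = {"embeddings.word_embeddings.weight": 0}
ckpt = {"embeddings.word_embeddings.weight": 1,
        "embeddings.position_ids": 2}
skipped = load_state_dict_lenient(model, ckpt)
# Only the unknown key is skipped; the shared weight is still loaded.
```

Under this reading, the skipped key would not affect the loaded weights, but I would like to confirm that.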
Is this warning something I should be concerned about? Will the X_tcr values generated still be comparable to those in human_tcr_reference_v2.h5ad?
Thank you!