Is the TCR embeddings generation code still compatible with the Zenodo model weights? #1

@nodrogluap

First, let me thank you very much for this excellent resource!

I have a question:

When using the latest git HEAD, I run tdi.tl.get_pretrained_tcr_embedding(...) with the following bert_config:

tdi.model.modeling_bert.get_human_config(bert_type='small', vocab_size=36)

and using the checkpoint file human_bert_pseudosequence.tcr_v2.ckpt, I receive the following warning:

Warning: model.bert.embeddings.position_ids not found in the model. Ignoring model.bert.embeddings.position_ids in the provided state dict.
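For context on why a warning like this can appear: in BERT-style models, `embeddings.position_ids` is typically a deterministic buffer (just an arange over the maximum sequence length), not learned weights, and recent versions of Hugging Face `transformers` no longer register it as a persistent part of the model. A loader that silently drops such unexpected checkpoint keys might look roughly like this (a hypothetical sketch with plain dicts standing in for state dicts, not the library's actual loading code):

```python
def filter_state_dict(ckpt_state, model_keys):
    """Keep only checkpoint entries the current model expects;
    report the ones that were dropped (hypothetical helper)."""
    ignored = sorted(k for k in ckpt_state if k not in model_keys)
    kept = {k: v for k, v in ckpt_state.items() if k in model_keys}
    return kept, ignored

# Toy stand-ins: the checkpoint carries a position_ids buffer that the
# newer model class no longer registers as a parameter/buffer.
ckpt = {
    "model.bert.embeddings.position_ids": "arange_buffer",
    "model.bert.embeddings.word_embeddings.weight": "learned_weights",
}
model_keys = {"model.bert.embeddings.word_embeddings.weight"}

kept, ignored = filter_state_dict(ckpt, model_keys)
# `ignored` would contain the position_ids key named in the warning.
```

Since the dropped buffer is reconstructed deterministically by the model itself, skipping it should not change the learned weights that are loaded.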

Is this something with which I should be concerned? Will the X_tcr values generated still be comparable to those in human_tcr_reference_v2.h5ad?

Thank you!
