PMC:7782580 / 33213-33446
Annotations
LitCovid-sentences
{"project":"LitCovid-sentences","denotations":[{"id":"T255","span":{"begin":0,"end":232},"obj":"Sentence"}],"namespaces":[{"prefix":"_base","uri":"http://pubannotation.org/ontology/tao.owl#"}],"text":"We adopted a knowledge distillation method in the training phase; a small model (called a student network) was trained to mimic the ensemble of multiple models (called teacher networks) to obtain a small model with high performance."}
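The annotated sentence describes knowledge distillation: a student network is trained to match the predictions of an ensemble of teacher networks. A minimal sketch of that idea, assuming temperature-softened softmax targets and a KL-divergence objective (all names, values, and hyperparameters here are illustrative, not taken from the paper):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-softened softmax; higher T gives softer probabilities.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distill(teacher_logits, student_logits, T=2.0, lr=1.0, steps=500):
    # The ensemble target is the average of the teachers' softened distributions.
    teacher_probs = [softmax(t, T) for t in teacher_logits]
    n = len(student_logits)
    target = [sum(p[i] for p in teacher_probs) / len(teacher_probs) for i in range(n)]
    for _ in range(steps):
        q = softmax(student_logits, T)
        # Gradient of KL(target || q) w.r.t. the student logits is (q - target) / T;
        # plain gradient descent moves the student toward the ensemble target.
        student_logits = [z - lr * (q[i] - target[i]) / T
                          for i, z in enumerate(student_logits)]
    return student_logits, target

# Hypothetical teacher logits for a 3-class problem; the student starts at zero.
teachers = [[2.0, 0.5, -1.0], [1.5, 1.0, -0.5]]
student, target = distill(teachers, [0.0, 0.0, 0.0])
print(softmax(student, 2.0))  # close to the averaged teacher distribution
```

In practice the student is a full neural network trained on real inputs, but the same loss shape applies: the small model inherits the ensemble's behavior at a fraction of its size.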