PMC:7782580 / 33213-33446 JSONTXT


Id Subject Object Predicate Lexical cue
T255 0-233 Sentence denotes We adopted a knowledge distillation method in the training phase; a small model (called a student network) was trained to mimic the ensemble of multiple models (called teacher networks) to obtain a small model with high performance.
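The annotated sentence describes knowledge distillation: a student network is trained to match the softened predictions of an ensemble of teacher networks. A minimal sketch of that idea (not the paper's actual implementation; the temperature value, helper names, and use of NumPy are illustrative assumptions) is:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature "softens"
    # the distribution, exposing the teachers' relative confidences.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def ensemble_soft_targets(teacher_logits_list, temperature):
    # Average the teachers' softened predictions to form one target
    # distribution for the student to mimic.
    probs = [softmax(l, temperature) for l in teacher_logits_list]
    return np.mean(probs, axis=0)

def distillation_loss(student_logits, soft_targets, temperature):
    # Cross-entropy between the ensemble's soft targets and the
    # student's temperature-softened output; minimizing this trains
    # the small student to imitate the large teacher ensemble.
    student_probs = softmax(student_logits, temperature)
    return -np.sum(soft_targets * np.log(student_probs + 1e-12),
                   axis=-1).mean()

# Toy usage: three teachers, one batch of two 4-class examples.
rng = np.random.default_rng(0)
teachers = [rng.normal(size=(2, 4)) for _ in range(3)]
targets = ensemble_soft_targets(teachers, temperature=2.0)
student = rng.normal(size=(2, 4))
loss = distillation_loss(student, targets, temperature=2.0)
```

The soft targets are valid probability distributions (each row sums to 1), and the loss is a non-negative scalar that the student's optimizer would minimize.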