
PMC:7782580 / 33213-33446

Annotations

LitCovid-sentences

Id: T255
Subject: 0-233
Object: Sentence
Predicate: denotes
Lexical cue: We adopted a knowledge distillation method in the training phase; a small model (called a student network) was trained to mimic the ensemble of multiple models (called teacher networks) to obtain a small model with high performance.
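
The annotated sentence describes ensemble knowledge distillation: a single student network is trained to reproduce the averaged output of several teacher networks while still fitting the ground-truth labels. The following is a minimal sketch of one such training step, assuming PyTorch; the function name distillation_step, the temperature, and the loss weighting alpha are illustrative assumptions, not the paper's actual configuration.

    # Minimal sketch of ensemble knowledge distillation (assumed PyTorch setup).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def distillation_step(student, teachers, x, y, optimizer,
                          temperature=4.0, alpha=0.5):
        """One training step: the student mimics the softened, averaged
        output of the teacher ensemble and also fits the hard labels."""
        student.train()
        with torch.no_grad():
            # Average the teachers' logits to form the ensemble target
            # (teachers are assumed to be pretrained and in eval mode).
            teacher_logits = torch.stack([t(x) for t in teachers]).mean(dim=0)
            soft_targets = F.softmax(teacher_logits / temperature, dim=1)

        student_logits = student(x)

        # KL divergence between softened student and teacher distributions,
        # scaled by T^2 as in standard knowledge distillation.
        distill_loss = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            soft_targets,
            reduction="batchmean",
        ) * (temperature ** 2)

        # Standard cross-entropy against the ground-truth labels.
        hard_loss = F.cross_entropy(student_logits, y)

        loss = alpha * distill_loss + (1 - alpha) * hard_loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

The weighting alpha trades off mimicking the teacher ensemble against fitting the hard labels; the temperature softens both distributions so the student can learn from the relative probabilities the ensemble assigns to incorrect classes.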