
PMC:7796058 / 78355-78578

Annotations

LitCovid-sentences

Id    Subject  Object    Predicate  Lexical cue
T622  0-67     Sentence  denotes    1 Rectified Linear Unit (ReLU) is a non-linear activation function.
T623  68-132   Sentence  denotes    Max pooling layer applies a max pooling operation to its inputs.
T624  133-223  Sentence  denotes    2 Fully Connected (FC) layers follow a stack of convolutional layers with different depth.
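The three annotated sentences describe standard CNN building blocks: ReLU activations, max pooling, and Fully Connected (FC) layers placed after a stack of convolutional layers. A minimal sketch follows, assuming PyTorch and 32x32 RGB inputs; the annotated paper specifies no framework or architecture, so TinyConvNet, its layer sizes, and num_classes are purely illustrative.

```python
import torch
import torch.nn as nn

class TinyConvNet(nn.Module):
    """Illustrative CNN: conv stack of increasing depth, then FC layers.
    All sizes are assumptions for the sketch, not from the source."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Stack of convolutional layers with different depth (channel counts 16 -> 32).
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),                    # ReLU: non-linear activation, max(0, x)
            nn.MaxPool2d(kernel_size=2),  # Max pooling: max over each 2x2 window of its inputs
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2),
        )
        # Fully Connected (FC) layers follow the convolutional stack.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 64),  # 32 channels at 8x8 after two 2x2 poolings of a 32x32 input
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage: a batch of four 32x32 RGB images.
model = TinyConvNet()
logits = model(torch.randn(4, 3, 32, 32))
print(logits.shape)  # torch.Size([4, 10])
```

The sketch simply makes the sentences' ordering concrete: each convolution is followed by a ReLU non-linearity and a max pooling step, and only after the convolutional stack do the FC layers map the flattened features to class scores.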