PMC:7796058 / 78355-78578
Annotations
| Id | Subject | Object | Predicate | Lexical cue |
|---|---|---|---|---|
| T622 | 0-67 | Sentence | denotes | 1 Rectified Linear Unit (ReLU) is a non-linear activation function. |
| T623 | 68-132 | Sentence | denotes | Max pooling layer applies a max pooling operation to its inputs. |
| T624 | 133-223 | Sentence | denotes | 2 Fully Connected (FC) layers follow a stack of convolutional layers with different depth. |
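
The three annotated sentences describe standard convolutional-network building blocks: ReLU activations, max pooling layers, and Fully Connected (FC) layers placed after a stack of convolutional layers. Below is a minimal sketch, assuming PyTorch (`torch.nn`); the layer counts and channel/feature sizes are illustrative assumptions, not taken from the source article:

```python
import torch
import torch.nn as nn

# Illustrative sketch (assumed sizes): convolutional layers of different
# depths, each followed by a ReLU activation and a max pooling layer,
# with Fully Connected (FC) layers at the end of the stack.
model = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1),
    nn.ReLU(),                      # non-linear activation: max(0, x) element-wise
    nn.MaxPool2d(kernel_size=2),    # max over each 2x2 window; halves H and W
    nn.Conv2d(64, 128, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=2),
    nn.Flatten(),
    nn.Linear(128 * 8 * 8, 256),    # FC layers follow the convolutional stack
    nn.ReLU(),
    nn.Linear(256, 10),
)

x = torch.randn(1, 3, 32, 32)       # one 32x32 RGB image (assumed input size)
print(model(x).shape)               # torch.Size([1, 10])
```

With a 32x32 input, each max pooling layer halves the spatial size (32 to 16 to 8), which is why the first FC layer expects 128 * 8 * 8 flattened features.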