PMC:7796369 / 17223-17441
Annotations
Id | Subject | Object | Predicate | Lexical cue |
---|---|---|---|---|
T142 | 0-111 | Sentence | denotes | Figure 4 Each layer in our network model consists of a fully connected tensor with a Relu activation function. |
T143 | 112-218 | Sentence | denotes | The first four all utilize a 20% dropout layer for regularization, while the final feature layer does not. |
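The two annotated sentences describe the network: a stack of fully connected layers with ReLU activations, where the first four layers apply 20% dropout and the final feature layer does not. A minimal NumPy sketch of that pattern is below; the layer widths are hypothetical, since the source does not state them.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_relu(x, w, b):
    # Fully connected layer followed by a ReLU activation.
    return np.maximum(0.0, x @ w + b)

def dropout(x, rate, rng, training=True):
    # Inverted dropout: zero a fraction `rate` of units, rescale the rest.
    if not training:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

# Hypothetical layer widths (input 64, feature output 32) -- not from the source.
sizes = [64, 128, 128, 128, 128, 32]
weights = [rng.normal(0.0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x, training=True):
    # First four layers use 20% dropout; the final feature layer does not.
    for i, (w, b) in enumerate(zip(weights, biases)):
        x = dense_relu(x, w, b)
        if i < 4:
            x = dropout(x, 0.2, rng, training)
    return x

features = forward(rng.normal(size=(8, 64)), training=False)
```

At inference time (`training=False`) dropout is a no-op, which is why inverted dropout rescales activations during training instead.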