Id | Subject | Object | Predicate | Lexical cue
T415 | 0-16 | Sentence | denotes | Regressor block:
T416 | 17-145 | Sentence | denotes | The regressor block consisted of multiple linear layers, a convolution layer, a BN layer, and a ReLU layer, as shown in Fig. 6d.
T417 | 146-288 | Sentence | denotes | A skip-connection architecture was adopted to retain the features and increase the ability of the block to represent non-linear relationships.
T418 | 289-419 | Sentence | denotes | The convolution block in the skip-connection structure was a convolution layer with multiple 1 × 1 convolution kernels.
T419 | 420-575 | Sentence | denotes | The number of convolution kernels was set equal to the output size of the second linear layer to keep the vector dimensions consistent.
T420 | 576-677 | Sentence | denotes | The input and output sizes of each linear layer were adjustable so that the block could be adapted to actual cases.
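Below is a minimal sketch of a regressor block of this kind, assuming a PyTorch implementation. The class name `RegressorBlock`, the specific layer sizes, and the exact ordering of convolution, BN, and ReLU are illustrative assumptions; the text above only specifies the layer types, the adjustable linear sizes, and the 1 × 1 convolution in the skip connection whose kernel count matches the output size of the second linear layer.

```python
# Hypothetical sketch of the described regressor block (PyTorch).
# Layer ordering and sizes are assumptions, not taken from the source.
import torch
import torch.nn as nn


class RegressorBlock(nn.Module):
    """Linear layers + Conv + BN + ReLU with a 1 x 1 convolution skip connection."""

    def __init__(self, in_features: int, hidden_features: int, out_features: int):
        super().__init__()
        # Main branch: two linear layers (sizes adjustable), then convolution and BN.
        self.fc1 = nn.Linear(in_features, hidden_features)
        self.fc2 = nn.Linear(hidden_features, out_features)
        self.conv = nn.Conv1d(out_features, out_features, kernel_size=1)
        self.bn = nn.BatchNorm1d(out_features)
        self.relu = nn.ReLU()
        # Skip branch: a 1 x 1 convolution whose number of kernels equals the
        # output size of the second linear layer, so both branches have the
        # same vector dimension and can be summed.
        self.skip = nn.Conv1d(in_features, out_features, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features)
        main = self.fc2(self.relu(self.fc1(x)))        # (batch, out_features)
        main = self.bn(self.conv(main.unsqueeze(-1)))  # treat features as channels
        residual = self.skip(x.unsqueeze(-1))          # 1 x 1 conv matches dimensions
        return self.relu(main + residual).squeeze(-1)  # (batch, out_features)


# Example usage with illustrative sizes
block = RegressorBlock(in_features=128, hidden_features=64, out_features=32)
y = block(torch.randn(8, 128))  # y.shape == (8, 32)
```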