PMC:7782580 / 50410-53253
Annnotations
{"target":"https://pubannotation.org/docs/sourcedb/PMC/sourceid/7782580","sourcedb":"PMC","sourceid":"7782580","source_url":"https://www.ncbi.nlm.nih.gov/pmc/7782580","text":"ResBlock-A: As discussed in ref. 57, the residual block is a CNN-based block that allows the CNN models to reuse features, thus accelerating the training speed of the models. In this study, we developed a residual block (ResBlock-A) that utilized a skip-connection for retaining features in different layers in the forward propagation. This block (Fig. 6a) consisted of a multiple-input multiple-output structure with two branches (an upper branch and a bottom branch), where input 1 and input 2 have the same size, but the values may be different. In contrast, output 1 and output 2 had the same size, but output 1 did not have a ReLu layer. The upper branch consisted of a max-pooling layer (Max-Pooling), a convolution layer (Conv 1 × 1), and a batch norm layer (BN). The Max-Pooling had a kernel size of 3 × 3 and a stride of 2 to downsample the input 1 for retaining the features and ensuring the same size as the output layer before the element-wise add operation was conducted in the bottom branch. The Conv 1 × 1 consisted of multiple 1 × 1 convolution kernels with the same number as that in the second convolution layer in the bottom branch to adjust the number of channels. The BN used a regulation function to ensure the input in each layer of the model followed a normal distribution with a mean of 0 and a variance of 1. The bottom branch consisted of two convolution layers, two BN layers, and two ReLu layers. The first convolution layer in the bottom branch consisted of multiple 3 × 3 convolution kernels with a stride of 2 and a padding of 1 to reduce the size of the feature maps when local features were obtained. The second convolution layer in the bottom branch consisted of multiple 3 × 3 convolution kernels with a stride of 1 and a padding of 1. The ReLu function was used as the activation function to ensure a non-linear relationship between the different layers. The output of the upper branch and the output of the bottom branch after the second BN were fused using an element-wise add operation. The fused result was output 1, and the fused result after the ReLu layer was output 2.\nFig. 6 The four units of the proposed framework.\na ResBlock-A architecture, containing two convolution layers with 3 × 3 kernels, one convolution layer with a 1 × 1 kernel, three batch normalization layers, two ReLu layers, and one max-pooling layer with a 3 × 3 kernel. b ResBlock-B architecture; the basic unit is the same as the ResBlock-A, except for output 1. c The Control Gate Block has a synaptic-based frontend architecture that controls the direction of the feature map flow and the overall optimization direction of the framework. d The Regressor architecture is a skip-connection architecture containing one convolution layer with 3 × 3 kernels, one batch normalization layer, one ReLu layer, and three linear layers.","divisions":[{"label":"label","span":{"begin":2114,"end":2120}},{"label":"title","span":{"begin":2121,"end":2162}}],"tracks":[{"project":"LitCovid-sentences","denotations":[{"id":"T376","span":{"begin":12,"end":32},"obj":"Sentence"},{"id":"T377","span":{"begin":33,"end":174},"obj":"Sentence"},{"id":"T378","span":{"begin":175,"end":335},"obj":"Sentence"},{"id":"T379","span":{"begin":336,"end":548},"obj":"Sentence"},{"id":"T380","span":{"begin":549,"end":642},"obj":"Sentence"},{"id":"T381","span":{"begin":643,"end":770},"obj":"Sentence"},{"id":"T382","span":{"begin":771,"end":1005},"obj":"Sentence"},{"id":"T383","span":{"begin":1006,"end":1184},"obj":"Sentence"},{"id":"T384","span":{"begin":1185,"end":1334},"obj":"Sentence"},{"id":"T385","span":{"begin":1335,"end":1425},"obj":"Sentence"},{"id":"T386","span":{"begin":1426,"end":1634},"obj":"Sentence"},{"id":"T387","span":{"begin":1635,"end":1771},"obj":"Sentence"},{"id":"T388","span":{"begin":1772,"end":1891},"obj":"Sentence"},{"id":"T389","span":{"begin":1892,"end":2026},"obj":"Sentence"},{"id":"T390","span":{"begin":2027,"end":2113},"obj":"Sentence"},{"id":"T391","span":{"begin":2114,"end":2162},"obj":"Sentence"},{"id":"T392","span":{"begin":2163,"end":2843},"obj":"Sentence"}],"namespaces":[{"prefix":"_base","uri":"http://pubannotation.org/ontology/tao.owl#"}],"attributes":[{"subj":"T376","pred":"source","obj":"LitCovid-sentences"},{"subj":"T377","pred":"source","obj":"LitCovid-sentences"},{"subj":"T378","pred":"source","obj":"LitCovid-sentences"},{"subj":"T379","pred":"source","obj":"LitCovid-sentences"},{"subj":"T380","pred":"source","obj":"LitCovid-sentences"},{"subj":"T381","pred":"source","obj":"LitCovid-sentences"},{"subj":"T382","pred":"source","obj":"LitCovid-sentences"},{"subj":"T383","pred":"source","obj":"LitCovid-sentences"},{"subj":"T384","pred":"source","obj":"LitCovid-sentences"},{"subj":"T385","pred":"source","obj":"LitCovid-sentences"},{"subj":"T386","pred":"source","obj":"LitCovid-sentences"},{"subj":"T387","pred":"source","obj":"LitCovid-sentences"},{"subj":"T388","pred":"source","obj":"LitCovid-sentences"},{"subj":"T389","pred":"source","obj":"LitCovid-sentences"},{"subj":"T390","pred":"source","obj":"LitCovid-sentences"},{"subj":"T391","pred":"source","obj":"LitCovid-sentences"},{"subj":"T392","pred":"source","obj":"LitCovid-sentences"}]}],"config":{"attribute types":[{"pred":"source","value type":"selection","values":[{"id":"LitCovid-sentences","color":"#ec93ca","default":true}]}]}}