PMC:7782580 / 49852-59677
Annnotations
{"target":"https://pubannotation.org/docs/sourcedb/PMC/sourceid/7782580","sourcedb":"PMC","sourceid":"7782580","source_url":"https://www.ncbi.nlm.nih.gov/pmc/7782580","text":"Model architecture and training\nIn this study, we proposed a modular CNNCF to identify the COVID-19 cases in the medical images and a CNNRF to determine the relationships between the lesion areas in the medical images and the five clinical indicators of COVID-19. Both proposed frameworks consisted of two units (ResBlock-A and ResBlock-B). The CNNCF and CNNRF had unique units, namely the control gate block and regressor block, respectively. Both frameworks were implemented using two NVIDIA GTX 1080TI graphics cards and the open-source PyTorch framework.ResBlock-A: As discussed in ref. 57, the residual block is a CNN-based block that allows the CNN models to reuse features, thus accelerating the training speed of the models. In this study, we developed a residual block (ResBlock-A) that utilized a skip-connection for retaining features in different layers in the forward propagation. This block (Fig. 6a) consisted of a multiple-input multiple-output structure with two branches (an upper branch and a bottom branch), where input 1 and input 2 have the same size, but the values may be different. In contrast, output 1 and output 2 had the same size, but output 1 did not have a ReLu layer. The upper branch consisted of a max-pooling layer (Max-Pooling), a convolution layer (Conv 1 × 1), and a batch norm layer (BN). The Max-Pooling had a kernel size of 3 × 3 and a stride of 2 to downsample the input 1 for retaining the features and ensuring the same size as the output layer before the element-wise add operation was conducted in the bottom branch. The Conv 1 × 1 consisted of multiple 1 × 1 convolution kernels with the same number as that in the second convolution layer in the bottom branch to adjust the number of channels. The BN used a regulation function to ensure the input in each layer of the model followed a normal distribution with a mean of 0 and a variance of 1. The bottom branch consisted of two convolution layers, two BN layers, and two ReLu layers. The first convolution layer in the bottom branch consisted of multiple 3 × 3 convolution kernels with a stride of 2 and a padding of 1 to reduce the size of the feature maps when local features were obtained. The second convolution layer in the bottom branch consisted of multiple 3 × 3 convolution kernels with a stride of 1 and a padding of 1. The ReLu function was used as the activation function to ensure a non-linear relationship between the different layers. The output of the upper branch and the output of the bottom branch after the second BN were fused using an element-wise add operation. The fused result was output 1, and the fused result after the ReLu layer was output 2.\nFig. 6 The four units of the proposed framework.\na ResBlock-A architecture, containing two convolution layers with 3 × 3 kernels, one convolution layer with a 1 × 1 kernel, three batch normalization layers, two ReLu layers, and one max-pooling layer with a 3 × 3 kernel. b ResBlock-B architecture; the basic unit is the same as the ResBlock-A, except for output 1. c The Control Gate Block has a synaptic-based frontend architecture that controls the direction of the feature map flow and the overall optimization direction of the framework. d The Regressor architecture is a skip-connection architecture containing one convolution layer with 3 × 3 kernels, one batch normalization layer, one ReLu layer, and three linear layers.\nResBlock-B: The ResBlock-B (Fig. 6b) was a multiple-input single-output block that was similar to the ResBlock-A, except that there was no output 1. The value of the stride and padding in each layer of the ResBlock-A and ResBlock-B could be adjusted using hyper-parameters based on the requirements.\nControl Gate Block: As shown in Fig. 6c, the Control Gate Block was a multiple-input single-output block consisting of a predictor module, a counter module, and a synapses module to control the optimization direction while controlling the information flow in the framework. The pipeline of the predictor module is shown in Supplementary Fig. 19a, where the Input S1 is the output of the ResBlock-B. The Input S1 was then flattened to a one-dimensional feature vector as the input of the linear layer. The output of the linear layer was converted to a probability of each category using the softmax function. A sensitivity calculator used the Vpred and Vtrue as inputs to calculate the TP, TN, FP, and false-negative (FN) rates to calculate the sensitivity. The sensitivity calculation was followed by a step function to control the output of the predictor. The ths was a threshold value; if the calculated sensitivity was greater or equal to ths, the step function output 1; otherwise, the output was 0. The counter module was a conditional counter, as shown in Supplementary Fig. 19b. If the input n was zero, the counter was cleared and set to zero. Otherwise, the counter increased by 1. The output of the counter was num. The synapses block mimicked the synaptic structure, and the input variable num was similar to a neurotransmitter, as shown in Supplementary Fig. 19c. The input num was the input parameter of the step function. The ths was a threshold value; if the input num was greater or equal to ths, the step function output 1; otherwise, it output 0. An element-wise multiplication was performed between the input S1 and the output of the synapses block. The multiplied result was passed on to a discriminator. If the sum of each element in the result was not zero, the Input S1 was passed on to the next layer. Otherwise, the input S1 information was not passed on.\nRegressor block: The regressor block consisted of multiple linear layers, a convolution layer, a BN layer, and a ReLu layer, as shown in Fig. 6d. A skip-connection architecture was adopted to retain the features and increase the ability of the block to represent non-linear relationships. The convolution block in the skip-connection structure was a convolution layer with multiple numbers of 1 × 1 convolution kernels. The number of the convolution kernels was the same as that of the output size of the second linear layer to ensure the consistency of the vector dimension. The input size and output size of each linear layer were adjustable to be applicable to actual cases.\nBased on the four blocks, two frameworks were designed for the classification task and regression task, respectively.Classification framework: The CNNCF consisted of stage I and stage II, as shown in Fig. 3a. Stage I was duplicated Q times in the framework (in this study, Q = 1). It consisted of multiple ResBlock-A with a number of M (in this study, M = 2), one ResBlock-B, and one Control Gate Block. Stage II consisted of multiple ResBlock-A with a number of N (in this study, N = 2) and one ResBlock-B. The weighted cross-entropy loss function was used and was minimized using the SGD optimizer with a learning rate of a1 (in this study, a1 = 0.01). A warm-up strategy58 was used in the initialization of the learning rate for a smooth training start, and a reduction factor of b1 (in this study, b1 = 0.1) was used to reduce the learning rate after every c1 (in this study, c1 = 10) training epochs. The model was trained for d1 (in this study, d1 = 40) epochs, and the model parameters saved in the last epoch was used in the test phase.\nRegression framework: The CNNRF (Fig. 3b) consisted of two parts (stage II and the regressor). The inputs to the regression framework were the images of the lesion areas, and the output was the corresponding vector with five dimensions, representing the five clinical indicators (all clinical indicators were normalized to a range of 0–1). The stage II structure was the same as that in the classification framework, except for some parameters. The loss function was the MSE loss function, which was minimized using the SGD optimizer with a learning rate of a2 (in this study, a2 = 0.01). A warm-up strategy was used in the initialization of the learning rate for a smooth training start, and a reduction factor of b2 (in this study, b2 = 0.1) was used to reduce the learning rate after every c2 (in this study, c2 = 50) training epochs. The framework was trained for d2 (in this study, d2 = 200) epochs, and the model parameters saved in the last epoch were used in the test phase.\nThe workflow of the classification framework. The workflow of the classification framework was demonstrated in Fig. 3c. The preprocessed images are sent to the first convolution block to expand the channels and processed as the input for the CNNCF. Given the input Fi with a size of M × N × 64, the stage I output feature maps F′i with a size of M/8 × N/8 × 256 in the default configuration. As we introduced above, the Control Gate Block controls the optimization direction while controlling the information flow in the framework. If the Control Gate Block is open, the feature maps F′i are passed on to stage II. Given the input F′i, the stage II output the feature maps F″i with a size of M/64 × N/64 × 512 which is defined as follows:1 Fi′=S1(Fi)Fi″=S2(Fi′)⊗CGB(Fi′),where S1 denotes the stage I block, S2 denotes the stage II block, and CGB is the Control Gate Block. ⊗ is the element-wise multiplication operation. Stage II is Followed by a global average pooling layer (GAP) and a fully connect layer (FC layer) with a softmax function to generate the final predictions. Given F″i as input, the GAP is adopted to generate a vector Vf with a size of 1 × 1 × 512. Given Vf as input, the FC layer with the softmax function outputs a vector Vc with a size of 1 × 1 × C.2 Vf=GAPFi′Vc=SMaxFCVf,where GAP is the global average pooling layer, the FC is the fully connect layer, SMax is the softmax function, Vf is the feature vector generated by the GAP, Vc is the prediction vector, and C is the number of case types used in this study.","divisions":[{"label":"title","span":{"begin":0,"end":31}},{"label":"p","span":{"begin":32,"end":6260}},{"label":"p","span":{"begin":558,"end":3401}},{"label":"figure","span":{"begin":2672,"end":3401}},{"label":"label","span":{"begin":2672,"end":2678}},{"label":"caption","span":{"begin":2679,"end":3401}},{"label":"title","span":{"begin":2679,"end":2720}},{"label":"p","span":{"begin":2721,"end":3401}},{"label":"p","span":{"begin":3402,"end":3701}},{"label":"p","span":{"begin":3702,"end":5582}},{"label":"p","span":{"begin":5583,"end":6260}},{"label":"p","span":{"begin":6261,"end":8288}},{"label":"p","span":{"begin":6378,"end":7305}},{"label":"p","span":{"begin":7306,"end":8288}},{"label":"label","span":{"begin":9027,"end":9028}},{"label":"label","span":{"begin":9561,"end":9562}}],"tracks":[{"project":"LitCovid-PubTator","denotations":[{"id":"350","span":{"begin":4563,"end":4566},"obj":"Chemical"},{"id":"351","span":{"begin":4644,"end":4647},"obj":"Chemical"},{"id":"352","span":{"begin":5142,"end":5145},"obj":"Chemical"},{"id":"353","span":{"begin":5210,"end":5213},"obj":"Chemical"},{"id":"355","span":{"begin":5680,"end":5682},"obj":"Chemical"},{"id":"358","span":{"begin":91,"end":99},"obj":"Disease"},{"id":"359","span":{"begin":254,"end":262},"obj":"Disease"},{"id":"361","span":{"begin":6782,"end":6809},"obj":"Disease"},{"id":"363","span":{"begin":9131,"end":9134},"obj":"Gene"}],"attributes":[{"id":"A350","pred":"tao:has_database_id","subj":"350","obj":"MESH:D013910"},{"id":"A351","pred":"tao:has_database_id","subj":"351","obj":"MESH:D013910"},{"id":"A352","pred":"tao:has_database_id","subj":"352","obj":"MESH:D013910"},{"id":"A353","pred":"tao:has_database_id","subj":"353","obj":"MESH:D013910"},{"id":"A358","pred":"tao:has_database_id","subj":"358","obj":"MESH:C000657245"},{"id":"A359","pred":"tao:has_database_id","subj":"359","obj":"MESH:C000657245"},{"id":"A361","pred":"tao:has_database_id","subj":"361","obj":"MESH:C537866"},{"id":"A363","pred":"tao:has_database_id","subj":"363","obj":"Gene:93659"},{"subj":"350","pred":"source","obj":"LitCovid-PubTator"},{"subj":"351","pred":"source","obj":"LitCovid-PubTator"},{"subj":"352","pred":"source","obj":"LitCovid-PubTator"},{"subj":"353","pred":"source","obj":"LitCovid-PubTator"},{"subj":"355","pred":"source","obj":"LitCovid-PubTator"},{"subj":"358","pred":"source","obj":"LitCovid-PubTator"},{"subj":"359","pred":"source","obj":"LitCovid-PubTator"},{"subj":"361","pred":"source","obj":"LitCovid-PubTator"},{"subj":"363","pred":"source","obj":"LitCovid-PubTator"}],"namespaces":[{"prefix":"Tax","uri":"https://www.ncbi.nlm.nih.gov/taxonomy/"},{"prefix":"MESH","uri":"https://id.nlm.nih.gov/mesh/"},{"prefix":"Gene","uri":"https://www.ncbi.nlm.nih.gov/gene/"},{"prefix":"CVCL","uri":"https://web.expasy.org/cellosaurus/CVCL_"}]},{"project":"LitCovid-sentences","denotations":[{"id":"T371","span":{"begin":0,"end":31},"obj":"Sentence"},{"id":"T372","span":{"begin":32,"end":263},"obj":"Sentence"},{"id":"T373","span":{"begin":264,"end":340},"obj":"Sentence"},{"id":"T374","span":{"begin":341,"end":443},"obj":"Sentence"},{"id":"T375","span":{"begin":444,"end":569},"obj":"Sentence"},{"id":"T376","span":{"begin":570,"end":590},"obj":"Sentence"},{"id":"T377","span":{"begin":591,"end":732},"obj":"Sentence"},{"id":"T378","span":{"begin":733,"end":893},"obj":"Sentence"},{"id":"T379","span":{"begin":894,"end":1106},"obj":"Sentence"},{"id":"T380","span":{"begin":1107,"end":1200},"obj":"Sentence"},{"id":"T381","span":{"begin":1201,"end":1328},"obj":"Sentence"},{"id":"T382","span":{"begin":1329,"end":1563},"obj":"Sentence"},{"id":"T383","span":{"begin":1564,"end":1742},"obj":"Sentence"},{"id":"T384","span":{"begin":1743,"end":1892},"obj":"Sentence"},{"id":"T385","span":{"begin":1893,"end":1983},"obj":"Sentence"},{"id":"T386","span":{"begin":1984,"end":2192},"obj":"Sentence"},{"id":"T387","span":{"begin":2193,"end":2329},"obj":"Sentence"},{"id":"T388","span":{"begin":2330,"end":2449},"obj":"Sentence"},{"id":"T389","span":{"begin":2450,"end":2584},"obj":"Sentence"},{"id":"T390","span":{"begin":2585,"end":2671},"obj":"Sentence"},{"id":"T391","span":{"begin":2672,"end":2720},"obj":"Sentence"},{"id":"T392","span":{"begin":2721,"end":3401},"obj":"Sentence"},{"id":"T393","span":{"begin":3402,"end":3413},"obj":"Sentence"},{"id":"T394","span":{"begin":3414,"end":3550},"obj":"Sentence"},{"id":"T395","span":{"begin":3551,"end":3701},"obj":"Sentence"},{"id":"T396","span":{"begin":3702,"end":3721},"obj":"Sentence"},{"id":"T397","span":{"begin":3722,"end":3975},"obj":"Sentence"},{"id":"T398","span":{"begin":3976,"end":4100},"obj":"Sentence"},{"id":"T399","span":{"begin":4101,"end":4202},"obj":"Sentence"},{"id":"T400","span":{"begin":4203,"end":4309},"obj":"Sentence"},{"id":"T401","span":{"begin":4310,"end":4458},"obj":"Sentence"},{"id":"T402","span":{"begin":4459,"end":4558},"obj":"Sentence"},{"id":"T403","span":{"begin":4559,"end":4705},"obj":"Sentence"},{"id":"T404","span":{"begin":4706,"end":4787},"obj":"Sentence"},{"id":"T405","span":{"begin":4788,"end":4853},"obj":"Sentence"},{"id":"T406","span":{"begin":4854,"end":4892},"obj":"Sentence"},{"id":"T407","span":{"begin":4893,"end":4927},"obj":"Sentence"},{"id":"T408","span":{"begin":4928,"end":5077},"obj":"Sentence"},{"id":"T409","span":{"begin":5078,"end":5137},"obj":"Sentence"},{"id":"T410","span":{"begin":5138,"end":5266},"obj":"Sentence"},{"id":"T411","span":{"begin":5267,"end":5370},"obj":"Sentence"},{"id":"T412","span":{"begin":5371,"end":5426},"obj":"Sentence"},{"id":"T413","span":{"begin":5427,"end":5527},"obj":"Sentence"},{"id":"T414","span":{"begin":5528,"end":5582},"obj":"Sentence"},{"id":"T415","span":{"begin":5583,"end":5599},"obj":"Sentence"},{"id":"T416","span":{"begin":5600,"end":5728},"obj":"Sentence"},{"id":"T417","span":{"begin":5729,"end":5871},"obj":"Sentence"},{"id":"T418","span":{"begin":5872,"end":6002},"obj":"Sentence"},{"id":"T419","span":{"begin":6003,"end":6158},"obj":"Sentence"},{"id":"T420","span":{"begin":6159,"end":6260},"obj":"Sentence"},{"id":"T421","span":{"begin":6261,"end":6403},"obj":"Sentence"},{"id":"T422","span":{"begin":6404,"end":6469},"obj":"Sentence"},{"id":"T423","span":{"begin":6470,"end":6541},"obj":"Sentence"},{"id":"T424","span":{"begin":6542,"end":6664},"obj":"Sentence"},{"id":"T425","span":{"begin":6665,"end":6768},"obj":"Sentence"},{"id":"T426","span":{"begin":6769,"end":6915},"obj":"Sentence"},{"id":"T427","span":{"begin":6916,"end":7166},"obj":"Sentence"},{"id":"T428","span":{"begin":7167,"end":7305},"obj":"Sentence"},{"id":"T429","span":{"begin":7306,"end":7327},"obj":"Sentence"},{"id":"T430","span":{"begin":7328,"end":7400},"obj":"Sentence"},{"id":"T431","span":{"begin":7401,"end":7645},"obj":"Sentence"},{"id":"T432","span":{"begin":7646,"end":7750},"obj":"Sentence"},{"id":"T433","span":{"begin":7751,"end":7894},"obj":"Sentence"},{"id":"T434","span":{"begin":7895,"end":8143},"obj":"Sentence"},{"id":"T435","span":{"begin":8144,"end":8288},"obj":"Sentence"},{"id":"T436","span":{"begin":8289,"end":8334},"obj":"Sentence"},{"id":"T437","span":{"begin":8335,"end":8408},"obj":"Sentence"},{"id":"T438","span":{"begin":8409,"end":8537},"obj":"Sentence"},{"id":"T439","span":{"begin":8538,"end":8680},"obj":"Sentence"},{"id":"T440","span":{"begin":8681,"end":8820},"obj":"Sentence"},{"id":"T441","span":{"begin":8821,"end":8903},"obj":"Sentence"},{"id":"T442","span":{"begin":8904,"end":9209},"obj":"Sentence"},{"id":"T443","span":{"begin":9210,"end":9366},"obj":"Sentence"},{"id":"T444","span":{"begin":9367,"end":9457},"obj":"Sentence"},{"id":"T445","span":{"begin":9458,"end":9825},"obj":"Sentence"}],"namespaces":[{"prefix":"_base","uri":"http://pubannotation.org/ontology/tao.owl#"}],"attributes":[{"subj":"T371","pred":"source","obj":"LitCovid-sentences"},{"subj":"T372","pred":"source","obj":"LitCovid-sentences"},{"subj":"T373","pred":"source","obj":"LitCovid-sentences"},{"subj":"T374","pred":"source","obj":"LitCovid-sentences"},{"subj":"T375","pred":"source","obj":"LitCovid-sentences"},{"subj":"T376","pred":"source","obj":"LitCovid-sentences"},{"subj":"T377","pred":"source","obj":"LitCovid-sentences"},{"subj":"T378","pred":"source","obj":"LitCovid-sentences"},{"subj":"T379","pred":"source","obj":"LitCovid-sentences"},{"subj":"T380","pred":"source","obj":"LitCovid-sentences"},{"subj":"T381","pred":"source","obj":"LitCovid-sentences"},{"subj":"T382","pred":"source","obj":"LitCovid-sentences"},{"subj":"T383","pred":"source","obj":"LitCovid-sentences"},{"subj":"T384","pred":"source","obj":"LitCovid-sentences"},{"subj":"T385","pred":"source","obj":"LitCovid-sentences"},{"subj":"T386","pred":"source","obj":"LitCovid-sentences"},{"subj":"T387","pred":"source","obj":"LitCovid-sentences"},{"subj":"T388","pred":"source","obj":"LitCovid-sentences"},{"subj":"T389","pred":"source","obj":"LitCovid-sentences"},{"subj":"T390","pred":"source","obj":"LitCovid-sentences"},{"subj":"T391","pred":"source","obj":"LitCovid-sentences"},{"subj":"T392","pred":"source","obj":"LitCovid-sentences"},{"subj":"T393","pred":"source","obj":"LitCovid-sentences"},{"subj":"T394","pred":"source","obj":"LitCovid-sentences"},{"subj":"T395","pred":"source","obj":"LitCovid-sentences"},{"subj":"T396","pred":"source","obj":"LitCovid-sentences"},{"subj":"T397","pred":"source","obj":"LitCovid-sentences"},{"subj":"T398","pred":"source","obj":"LitCovid-sentences"},{"subj":"T399","pred":"source","obj":"LitCovid-sentences"},{"subj":"T400","pred":"source","obj":"LitCovid-sentences"},{"subj":"T401","pred":"source","obj":"LitCovid-sentences"},{"subj":"T402","pred":"source","obj":"LitCovid-sentences"},{"subj":"T403","pred":"source","obj":"LitCovid-sentences"},{"subj":"T404","pred":"source","obj":"LitCovid-sentences"},{"subj":"T405","pred":"source","obj":"LitCovid-sentences"},{"subj":"T406","pred":"source","obj":"LitCovid-sentences"},{"subj":"T407","pred":"source","obj":"LitCovid-sentences"},{"subj":"T408","pred":"source","obj":"LitCovid-sentences"},{"subj":"T409","pred":"source","obj":"LitCovid-sentences"},{"subj":"T410","pred":"source","obj":"LitCovid-sentences"},{"subj":"T411","pred":"source","obj":"LitCovid-sentences"},{"subj":"T412","pred":"source","obj":"LitCovid-sentences"},{"subj":"T413","pred":"source","obj":"LitCovid-sentences"},{"subj":"T414","pred":"source","obj":"LitCovid-sentences"},{"subj":"T415","pred":"source","obj":"LitCovid-sentences"},{"subj":"T416","pred":"source","obj":"LitCovid-sentences"},{"subj":"T417","pred":"source","obj":"LitCovid-sentences"},{"subj":"T418","pred":"source","obj":"LitCovid-sentences"},{"subj":"T419","pred":"source","obj":"LitCovid-sentences"},{"subj":"T420","pred":"source","obj":"LitCovid-sentences"},{"subj":"T421","pred":"source","obj":"LitCovid-sentences"},{"subj":"T422","pred":"source","obj":"LitCovid-sentences"},{"subj":"T423","pred":"source","obj":"LitCovid-sentences"},{"subj":"T424","pred":"source","obj":"LitCovid-sentences"},{"subj":"T425","pred":"source","obj":"LitCovid-sentences"},{"subj":"T426","pred":"source","obj":"LitCovid-sentences"},{"subj":"T427","pred":"source","obj":"LitCovid-sentences"},{"subj":"T428","pred":"source","obj":"LitCovid-sentences"},{"subj":"T429","pred":"source","obj":"LitCovid-sentences"},{"subj":"T430","pred":"source","obj":"LitCovid-sentences"},{"subj":"T431","pred":"source","obj":"LitCovid-sentences"},{"subj":"T432","pred":"source","obj":"LitCovid-sentences"},{"subj":"T433","pred":"source","obj":"LitCovid-sentences"},{"subj":"T434","pred":"source","obj":"LitCovid-sentences"},{"subj":"T435","pred":"source","obj":"LitCovid-sentences"},{"subj":"T436","pred":"source","obj":"LitCovid-sentences"},{"subj":"T437","pred":"source","obj":"LitCovid-sentences"},{"subj":"T438","pred":"source","obj":"LitCovid-sentences"},{"subj":"T439","pred":"source","obj":"LitCovid-sentences"},{"subj":"T440","pred":"source","obj":"LitCovid-sentences"},{"subj":"T441","pred":"source","obj":"LitCovid-sentences"},{"subj":"T442","pred":"source","obj":"LitCovid-sentences"},{"subj":"T443","pred":"source","obj":"LitCovid-sentences"},{"subj":"T444","pred":"source","obj":"LitCovid-sentences"},{"subj":"T445","pred":"source","obj":"LitCovid-sentences"}]}],"config":{"attribute types":[{"pred":"source","value type":"selection","values":[{"id":"LitCovid-PubTator","color":"#c3ec93","default":true},{"id":"LitCovid-sentences","color":"#ec93dd"}]}]}}