PMC:7796058 / 78355-78578

Annotations

    LitCovid-sentences

    {"project":"LitCovid-sentences","denotations":[{"id":"T622","span":{"begin":0,"end":67},"obj":"Sentence"},{"id":"T623","span":{"begin":68,"end":132},"obj":"Sentence"},{"id":"T624","span":{"begin":133,"end":223},"obj":"Sentence"}],"namespaces":[{"prefix":"_base","uri":"http://pubannotation.org/ontology/tao.owl#"}],"text":"1 Rectified Linear Unit (ReLU) is a non-linear activation function. Max pooling layer applies a max pooling operation to its inputs. 2 Fully Connected (FC) layers follow a stack of convolutional layers with different depth."}