PMC:7796058 / 78355-78578 JSONTXT


{"target":"https://pubannotation.org/docs/sourcedb/PMC/sourceid/7796058","sourcedb":"PMC","sourceid":"7796058","source_url":"https://www.ncbi.nlm.nih.gov/pmc/7796058","text":"1 Rectified Linear Unit (ReLU) is a non-linear activation function. Max pooling layer applies a max pooling operation to its inputs. 2 Fully Connected (FC) layers follow a stack of convolutional layers with different depth.","tracks":[{"project":"LitCovid-sentences","denotations":[{"id":"T622","span":{"begin":0,"end":67},"obj":"Sentence"},{"id":"T623","span":{"begin":68,"end":132},"obj":"Sentence"},{"id":"T624","span":{"begin":133,"end":223},"obj":"Sentence"}],"namespaces":[{"prefix":"_base","uri":"http://pubannotation.org/ontology/tao.owl#"}],"attributes":[{"subj":"T622","pred":"source","obj":"LitCovid-sentences"},{"subj":"T623","pred":"source","obj":"LitCovid-sentences"},{"subj":"T624","pred":"source","obj":"LitCovid-sentences"}]}],"config":{"attribute types":[{"pred":"source","value type":"selection","values":[{"id":"LitCovid-sentences","color":"#c993ec","default":true}]}]}}