3. Results
We utilized an 80–20 train-test split to train a 5-layer feed-forward neural network, which is shown in Figure 4. We trained the network using the Adam optimizer with a learning rate of 0.001, minimizing the mean absolute error, in batches of 256. Training was stopped after 100 epochs, with the test and validation error displayed in Figure 5.
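A minimal sketch of this training setup, assuming a Keras-style implementation, is shown below; the layer widths and the input/output dimensions are illustrative placeholders, as they are not specified in the text.

```python
# Sketch of the training setup above, assuming a Keras-style API.
# Specified in the text: 5-layer feed-forward network, 80-20 split,
# Adam optimizer (lr = 0.001), mean absolute error loss, batch size 256,
# 100 epochs. Layer widths and data dimensions are placeholders.
import numpy as np
import tensorflow as tf
from sklearn.model_selection import train_test_split

N_INPUTS, N_OUTPUTS = 8, 4  # hypothetical input/output dimensions

# Placeholder data standing in for the simulation parameter/metric pairs.
X = np.random.rand(10_000, N_INPUTS).astype("float32")
y = np.random.rand(10_000, N_OUTPUTS).astype("float32")

# 80-20 train-test split.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

# 5-layer feed-forward network; hidden widths are illustrative only.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(N_INPUTS,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(N_OUTPUTS),
])

# Adam optimizer with learning rate 0.001, minimizing mean absolute error.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
              loss="mae")

# Train for 100 epochs in batches of 256, tracking held-out error each epoch.
history = model.fit(X_train, y_train,
                    validation_data=(X_test, y_test),
                    epochs=100, batch_size=256)
```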
Our final network is 10 Mb in size, which is small enough to be deployed even on the smallest of mobile devices and is able to infer results instantly. More precisely, by entering the parameters (within reasonable ranges) listed in Table 1, our model is able to predict quantities such as the number of cars and passengers passing through the vaccination center, average wait times throughout the day, and overall completion metrics. An example of this can be seen in Figure 6.
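Once trained, a single prediction is a plain forward pass. The illustrative call below reuses the hypothetical `model` and dimensions from the sketch above, with the input vector standing in for the parameters of Table 1.

```python
# Reuses `model` and N_INPUTS from the training sketch above.
import numpy as np

# One hypothetical parameter setting (e.g. arrival rates, staffing levels,
# service times), encoded as a single input vector within reasonable ranges.
params = np.random.rand(1, N_INPUTS).astype("float32")

# A single forward pass returns all predicted metrics (cars and passengers
# processed, average wait times, completion figures) at once.
prediction = model.predict(params, verbose=0)
print(prediction.shape)  # (1, N_OUTPUTS)
```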
Our neural network model is capable of producing predictions orders of magnitude faster than the full simulation model. More precisely, predictions were computed in 0.027 s on average, with minimum and maximum times of 0.025 and 0.039 s, respectively, across 1000 random input variations. The distribution of the results can be seen in Figure 7.
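A latency measurement of this kind can be sketched as follows; it again reuses the hypothetical `model` from the training sketch, and the 0.025–0.039 s figures above are the measured results, not outputs of this snippet.

```python
# Times 1000 single predictions on random input variations, reusing `model`
# and N_INPUTS from the training sketch above.
import time
import numpy as np

latencies = []
for _ in range(1000):
    x = np.random.rand(1, N_INPUTS).astype("float32")  # random input variation
    start = time.perf_counter()
    model.predict(x, verbose=0)
    latencies.append(time.perf_counter() - start)

print(f"mean={np.mean(latencies):.3f} s  "
      f"min={np.min(latencies):.3f} s  max={np.max(latencies):.3f} s")
```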
Notice that the time scale we achieve here is on the order of milliseconds, whereas previously minutes were required to produce the same results. We achieve an improvement of over 3000× in speed, largely because the network sidesteps the computational cost of simulating the entire event. Furthermore, we do all of this locally, with no need for cloud computation.