Chapter 9 Example 2: MNIST handwritten digits

Last update: Thu Oct 22 16:46:28 2020 -0500 (54a46ea04)

9.1 Code in R

Source: https://github.com/yunjey/pytorch-tutorial/blob/master/tutorials/01-basics/logistic_regression/main.py

9.1.4 Training

#> Epoch [2/5], Step [100/600], Loss: 2.202640 
#> Epoch [2/5], Step [200/600], Loss: 2.131556 
#> Epoch [2/5], Step [300/600], Loss: 2.009567 
#> Epoch [2/5], Step [400/600], Loss: 1.909900 
#> Epoch [2/5], Step [500/600], Loss: 1.807800 
#> Epoch [2/5], Step [600/600], Loss: 1.763934 
#> Epoch [3/5], Step [100/600], Loss: 1.748977 
#> Epoch [3/5], Step [200/600], Loss: 1.719241 
#> Epoch [3/5], Step [300/600], Loss: 1.575805 
#> Epoch [3/5], Step [400/600], Loss: 1.533629 
#> Epoch [3/5], Step [500/600], Loss: 1.441434 
#> Epoch [3/5], Step [600/600], Loss: 1.422432 
#> Epoch [4/5], Step [100/600], Loss: 1.457393 
#> Epoch [4/5], Step [200/600], Loss: 1.446077 
#> Epoch [4/5], Step [300/600], Loss: 1.299167 
#> Epoch [4/5], Step [400/600], Loss: 1.294534 
#> Epoch [4/5], Step [500/600], Loss: 1.208139 
#> Epoch [4/5], Step [600/600], Loss: 1.201451 
#> Epoch [5/5], Step [100/600], Loss: 1.263761 
#> Epoch [5/5], Step [200/600], Loss: 1.262581 
#> Epoch [5/5], Step [300/600], Loss: 1.115774 
#> Epoch [5/5], Step [400/600], Loss: 1.135691 
#> Epoch [5/5], Step [500/600], Loss: 1.052254 
#> Epoch [5/5], Step [600/600], Loss: 1.051521 
#> Epoch [6/5], Step [100/600], Loss: 1.129794 
#> Epoch [6/5], Step [200/600], Loss: 1.133942 
#> Epoch [6/5], Step [300/600], Loss: 0.988441 
#> Epoch [6/5], Step [400/600], Loss: 1.024993 
#> Epoch [6/5], Step [500/600], Loss: 0.942753 
#> Epoch [6/5], Step [600/600], Loss: 0.944552
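Two things are worth noting about this log. First, the loss falls steadily from about 2.2 (close to ln 10 ≈ 2.30, the loss of a uniform 10-class guess) toward 1.0, which is what we expect from softmax regression on MNIST in its first few epochs. Second, the printed epoch runs from [2/5] to [6/5]: the print statement in the original loop appears to report the epoch index off by one relative to `num_epochs = 5`.

The shape of the loop that produces output like this can be sketched without rTorch or MNIST at all. The following is a minimal, self-contained sketch in plain NumPy — softmax regression trained by mini-batch gradient descent on synthetic data. The dataset, hyperparameters (`lr`, `num_epochs`, `batch_size`), and variable names are stand-ins, not the book's actual rTorch code; only the loop structure and the cross-entropy objective mirror the example above.

```python
import numpy as np

# Stand-in data: 600 random "images" with 784 features and 10 classes.
rng = np.random.default_rng(0)
n_classes, n_features = 10, 784
X = rng.normal(size=(600, n_features))
y = rng.integers(0, n_classes, size=600)

# Softmax-regression parameters, initialized to zero.
W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
lr, num_epochs, batch_size = 0.1, 5, 100   # illustrative values

def cross_entropy(xb, yb, W, b):
    """Mean cross-entropy loss and class probabilities for a batch."""
    logits = xb @ W + b
    logits -= logits.max(axis=1, keepdims=True)        # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    loss = -np.log(probs[np.arange(len(yb)), yb]).mean()
    return loss, probs

losses = []
for epoch in range(num_epochs):
    for start in range(0, len(X), batch_size):
        xb, yb = X[start:start + batch_size], y[start:start + batch_size]
        loss, probs = cross_entropy(xb, yb, W, b)
        # Gradient of mean cross-entropy w.r.t. the logits is (probs - onehot)/n;
        # backpropagate it to W and b by hand.
        grad = probs
        grad[np.arange(len(yb)), yb] -= 1.0
        grad /= len(yb)
        W -= lr * (xb.T @ grad)
        b -= lr * grad.sum(axis=0)
        losses.append(loss)
    print(f"Epoch [{epoch + 1}/{num_epochs}], Loss: {loss:.6f}")
```

With zero initialization the very first batch loss is exactly ln 10, and each epoch of gradient steps drives the training loss down, giving the same monotone-decreasing pattern the rTorch log shows. The real example differs only in scale: MNIST batches, an autograd-computed gradient, and an optimizer object in place of the manual update.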