I got loss = nan ..... #8
Comments
I modified base_lr to 0.0001 and then training gradually converged.
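For anyone hitting the same nan, a minimal sketch of that change in solver.prototxt could look like the following. Only base_lr comes from the comment above and lr_policy from the training log in this issue; gamma, stepvalue, and max_iter are placeholders, not this repo's actual settings.

```
# solver.prototxt (sketch) -- only the fields relevant to the divergence are shown
base_lr: 0.0001          # lowered from the 0.001 in the log, which diverged to nan
lr_policy: "multistep"   # matches "Learning Rate Policy: multistep" in the log
gamma: 0.1               # placeholder decay factor
stepvalue: 80000         # placeholder step
stepvalue: 100000        # placeholder step
max_iter: 120000         # placeholder, pick to match your schedule
```

Lowering the learning rate is the usual first thing to try when mbox_loss explodes within the first few iterations, as it does in the log here.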
How do I add multibux_focal_loss to the code? There is no .cu file.
@dubiwei the multibux_focal_loss does not need a .cu file.
After 120k iterations, I got:
I modified max_iter to 100k but still get a lower mAP: 0.517319
Solving VGG_VOC0712_SSD_300x300_train
I1112 16:14:25.373922 12274 solver.cpp:295] Learning Rate Policy: multistep
I1112 16:14:26.191864 12274 solver.cpp:243] Iteration 0, loss = 393.183
I1112 16:14:26.191922 12274 solver.cpp:259] Train net output #0: mbox_loss = 463.676 (* 1 = 463.676 loss)
I1112 16:14:26.191949 12274 sgd_solver.cpp:138] Iteration 0, lr = 0.001
I1112 16:14:37.323004 12274 solver.cpp:243] Iteration 10, loss = 1.70709e+06
I1112 16:14:37.323065 12274 solver.cpp:259] Train net output #0: mbox_loss = 1.00702e+07 (* 1 = 1.00702e+07 loss)
I1112 16:14:37.323082 12274 sgd_solver.cpp:138] Iteration 10, lr = 0.001
I1112 16:14:48.218691 12274 solver.cpp:243] Iteration 20, loss = nan
I1112 16:14:48.218749 12274 solver.cpp:259] Train net output #0: mbox_loss = nan (* 1 = nan loss)
I1112 16:14:48.218765 12274 sgd_solver.cpp:138] Iteration 20, lr = 0.001
I1112 16:14:59.353508 12274 solver.cpp:243] Iteration 30, loss = nan
I1112 16:14:59.353610 12274 solver.cpp:259] Train net output #0: mbox_loss = nan (* 1 = nan loss)
I1112 16:14:59.353627 12274 sgd_solver.cpp:138] Iteration 30, lr = 0.001
I only put the scripts into /src and /include, then modified the train.prototxt, but the loss goes to nan as shown in the log above.
There must be something wrong.
Should I make some other changes?