Replies: 22 comments
-
@mli @piiswrong maybe worth fixing before the new release. Or at least explicitly warning about this.
-
This issue is closed due to lack of activity in the last 90 days. Feel free to ping me to reopen if this is still an active issue. Thanks!
-
I think this is the appropriate behaviour. If you want different seeds, you should seed yourself using the time or the system's random pool.
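For anyone looking for the concrete incantation, a minimal sketch of that suggestion (seeding MXNet from the clock or from the OS entropy pool):

```python
import os
import time

import mxnet as mx

# Option 1: seed from the current time, so each run differs
mx.random.seed(int(time.time()))

# Option 2: seed from the OS entropy pool (kept within int32 range)
mx.random.seed(int.from_bytes(os.urandom(4), "little") % (1 << 31))

print(mx.nd.random.uniform(shape=(4,)))  # varies between runs
```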
-
I strongly disagree. Fixing the random seed to zero is not the default behavior in any package I've come across, and it goes against the purpose of a "random" library. By default, any outputs that are expected to exhibit randomness should not be deterministic. When debugging or ensuring reproducibility, it is useful to be able to set the random seed, but otherwise it should be set as you say: based on time or system state. At the least, this should raise a warning. It's very difficult to discover this behavior on your own when other commonly used libraries like numpy seed themselves from a source of true randomness by default, in contrast to mxnet's deterministic default.
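As a point of comparison (a small sketch, nothing mxnet-specific), NumPy's generator is seeded from OS entropy when no seed is given, so unseeded draws differ between runs:

```python
import numpy as np

np.random.seed()          # no argument: reseed from the OS entropy pool
print(np.random.rand(3))  # different values on every run

np.random.seed(0)         # explicit seed: reproducible
print(np.random.rand(3))  # same values on every run
```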
-
I prefer the fixed random seed by default. It is similar to the random seed setting in C++. Users know what the random seed is and can reproduce the experiment with the seed.
-
I suggest we fix it in the numpy compatibility project. @reminisce @haojin2
-
As I've tested, PyTorch behaves like numpy.random.
-
I consider the following solution:
-
What about going the JAX/scikit-learn way, where an explicit random seed is required to be fed?
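For reference, the JAX side of that pattern: every sampling call takes an explicit PRNG key, and keys are split rather than mutated (a small illustrative snippet, not mxnet API):

```python
import jax.random as jrandom

key = jrandom.PRNGKey(0)          # explicit seed, no hidden global state
key, subkey = jrandom.split(key)  # derive a fresh key for each use
x = jrandom.uniform(subkey, shape=(4,))
print(x)
```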
-
@haojin2 can you comment on whether this will be fixed as part of the numpy compatibility effort?
-
I think it would be better to warn users if they do not feed a seed. I was training a model that would not converge until I fed the seed manually. It took me a long time to check my code...
-
@leezu Have we all reached a consensus on the direction to go? If there's not even a clear goal yet, then I cannot confirm that this will be changed in the 1.6.0 release.
-
Would the consensus already be reached, based on the meta-consensus on numpy compatibility (#14253)?
-
@leezu Having an explicit random seed absolutely helps researchers with reproducibility: JAX does it, and scikit-learn allows you to provide a RandomState object. I haven't thought deeply about this yet, however. I think @szhengac and @sxjscience are much more familiar with this.
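For the scikit-learn half of that comparison, a small snippet showing the RandomState convention (illustrative, not mxnet API):

```python
from sklearn.utils import check_random_state

# scikit-learn estimators accept random_state=None, an int, or a RandomState
rs = check_random_state(42)    # int -> a np.random.RandomState seeded with 42
print(rs.uniform(size=3))      # reproducible across runs

rs = check_random_state(None)  # None -> the global np.random generator
print(rs.uniform(size=3))      # non-deterministic unless np.random was seeded
```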
-
@junrushao1994 I agree about the advantages. The problem is that it doesn't follow the numpy API. Would it make sense to add it as part of npx.random?
-
I suggest fixing the seeding issue in 1.6.1. Currently, MXNet is not consistent even if we fix the seed: see #16605, #16532, and #16705.
Also, we should try to accelerate the performance of our default random number generator.
-
Which of the above options would you recommend? Also, changing the default seeding behavior in a 1.6.1 point release would break our backwards compatibility promise. We may need to make this change in a proper release.
-
We can fix the dropout consistency issue in 1.6.1. In 1.7, we can support configuration files in MXNet and create some common config files, e.g., a configuration that guarantees a consistent random number generator.
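Purely to illustrate that idea (no such config mechanism exists in MXNet at the time of writing; the settings bundled below are assumptions about what such a profile might contain), an application-side sketch:

```python
# Hypothetical "reproducible" profile: collect the knobs that affect
# determinism in one place and apply them before any computation runs.
import os

# Assumed knob: ask MXNet to prefer deterministic kernels where available.
os.environ["MXNET_ENFORCE_DETERMINISM"] = "1"

import mxnet as mx

mx.random.seed(42)  # fixed seed on all devices
```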
-
FYI: JAX also requires an explicit RNG key for all operators with randomness. I think it's better to resolve the behavior difference between DeepNumpy and NumPy through some kind of global random seed configuration.
-
Marking as a bug, as the behaviour is not numpy compatible.
-
I saw a different result on CPU and GPU using the same seed value. Is this expected behavior, or am I missing something here?

Script:

import mxnet as mx

mx.random.seed(99)
t_cpu = mx.nd.random.uniform(shape=(4,), ctx=mx.cpu())
print("CPU: ", t_cpu)

mx.random.seed(99)
t_gpu = mx.nd.random.uniform(shape=(4,), ctx=mx.gpu(0))
print("GPU: ", t_gpu)

Output:

CPU:
[0.9509902  0.71019113 0.7611274  0.08560332]
<NDArray 4 @cpu(0)>
GPU:
[0.29560053 0.07938761 0.29997164 0.4891801 ]
<NDArray 4 @gpu(0)>
-
This is expected. Setting the seed on a device only guarantees reproducible results on the same device, and does not guarantee the same result on different devices.
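If matching values across devices matters, one workaround sketch (assuming a CUDA-enabled build) is to draw on a single device and copy the result over:

```python
import mxnet as mx

mx.random.seed(99)
t = mx.nd.random.uniform(shape=(4,), ctx=mx.cpu())  # sample on CPU only
t_gpu = t.copyto(mx.gpu(0))  # identical values, now resident on the GPU
print(t, t_gpu)
```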
-
mxnet seems to use a fixed seed for its random number generator. This is not good. To make sure research results are valid, experiments need to be repeated with different random initializations. To the best of my knowledge, it is common practice for the seed of the random number generator to be chosen randomly at startup by default; at least, that is the behaviour of numpy.
As mxnet does not follow the expected behaviour, researchers may wrongly assume that their results are statistically significant, given that they'd expect a random seed.
Compare what happens when running the following two files repeatedly: the mxnet random number generator uses the same seed every time.
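The two files referenced above were not preserved in this thread; a minimal reconstruction of the kind of comparison described (hypothetical file name) might be:

```python
# compare_rng.py (hypothetical): run this twice and compare the output.
import mxnet as mx
import numpy as np

print("mxnet:", mx.nd.random.uniform(shape=(3,)).asnumpy())
print("numpy:", np.random.uniform(size=3))
```

Across repeated runs, the numpy line changes while the mxnet line repeats, because mxnet's default seed is fixed.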