Model makes different inferences in different envs

#32 opened by ayseozgun

Hello,

I am using the model for my question-answering use case. For the same context and question, the model generates different answers in different environments, even though the parameters and library versions are the same.
What could be the reason?
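One thing I tried in order to rule out sampling randomness is forcing greedy decoding and fixing the seed in both environments. Below is a minimal sketch of that check, assuming a generative transformers model; the checkpoint name and prompt are placeholders, not the actual model I am using.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

# Fix Python/NumPy/PyTorch seeds so any remaining sampling is reproducible.
set_seed(42)

# Placeholder checkpoint; substitute the model actually being compared.
model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

prompt = "Context: ...\nQuestion: ...\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # do_sample=False forces greedy decoding, removing sampling randomness.
    outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Even with greedy decoding and a fixed seed, the outputs still differ between the two environments.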

Thanks in advance,
