In training mode, some layers behave differently: dropout randomly zeroes activations, batch norm uses batch statistics instead of running statistics, etc. Hence, getting different outputs is normal. You can read this post I found.
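As a quick illustration, here is a minimal PyTorch sketch (the toy model and sizes are made up for the example) showing that a module containing dropout gives varying outputs in `train()` mode but deterministic outputs in `eval()` mode:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy model: the Dropout layer makes its behavior mode-dependent.
model = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5))
x = torch.ones(1, 4)

model.train()                 # dropout is active
out_a = model(x)
out_b = model(x)
# out_a and out_b typically differ: a fresh random mask is drawn per forward pass

model.eval()                  # dropout becomes a no-op
out_c = model(x)
out_d = model(x)
print(torch.equal(out_c, out_d))  # True: identical outputs in eval mode
```

So before comparing outputs (e.g. against a reference implementation or an exported model), call `model.eval()` to disable these mode-dependent behaviors.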