'Conv2d' object has no attribute 'weight' when debugging on Visual Studio 2017

Dear all.

When I run my code in debug mode on Visual Studio 2017, I get the following error message.

torch.nn.modules.module.ModuleAttributeError: 'Conv2d' object has no attribute 'weight'

This error does not occur when the code is run in release mode in Visual Studio 2017.
It also does not occur when the code is run from the command prompt.

module.py
Line 778

    raise ModuleAttributeError("'{}' object has no attribute '{}'".format(
        type(self).__name__, name))

My environment is as follows.

Windows 10.
Visual Studio 2017 Professional.

nvcc -V

nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2019 NVIDIA Corporation
Built on Fri_Feb__8_19:08:26_Pacific_Standard_Time_2019
Cuda compilation tools, release 10.1, V10.1.105

C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\include

#define CUDNN_MAJOR 7
#define CUDNN_MINOR 6
#define CUDNN_PATCHLEVEL 4

python -V

Python 3.7.1

python -c "import torch;print(torch.__version__);"

1.7.1+cu101

python -c "import torch;print(torch.cuda.is_available())"

True

python -c "import torch;print(torch.cuda.get_device_name());"

GeForce GTX 1080 Ti

python -c "import torch;print(torch.cuda.get_device_capability());"

(6, 1)

Call stack:

    Module.__getattr__ line 779 Python symbols have been loaded.
    Module.__getattr__ line 779 Python symbols have been loaded.
    Module.register_parameter line 315 Python symbols have been loaded.
    Module.__setattr__ line 796 Python symbols have been loaded.
    _ConvNd.__init__ line 78 Python symbols have been loaded.
    Conv2d.__init__ line 412 Python symbols have been loaded.

Watch window:

    self.__dict__   {'_backward_hooks': OrderedDict([]), '_buffers': OrderedDict([]), '_forward_hooks': OrderedDict([]), '_forward_pre_hooks': OrderedDict([]), '_load_state_dict_pre_hooks': OrderedDict([]), '_modules': OrderedDict([]), '_non_persistent_buffers_set': {}, '_parameters': OrderedDict([]), '_reversed_padding_r...ted_twice': (1, 1, 1, 1), '_state_dict_hooks': OrderedDict([]), 'dilation': (1, 1), 'groups': 1, 'in_channels': 3, 'kernel_size': (3, 3), ...}   dict
    items()
      ['training']                      True             bool
      ['_parameters']                   OrderedDict([])  OrderedDict
      ['_buffers']                      OrderedDict([])  OrderedDict
      ['_non_persistent_buffers_set']   {}               set
      ['_backward_hooks']               OrderedDict([])  OrderedDict
      ['_forward_hooks']                OrderedDict([])  OrderedDict
      ['_forward_pre_hooks']            OrderedDict([])  OrderedDict
      ['_state_dict_hooks']             OrderedDict([])  OrderedDict
      ['_load_state_dict_pre_hooks']    OrderedDict([])  OrderedDict
      ['_modules']                      OrderedDict([])  OrderedDict
      ['in_channels']                   3                int
      ['out_channels']                  64               int
      ['kernel_size']                   (3, 3)           tuple
      ['stride']                        (1, 1)           tuple
      ['padding']                       (1, 1)           tuple
      ['dilation']                      (1, 1)           tuple
      ['transposed']                    False            bool
      ['output_padding']                (0, 0)           tuple
      ['groups']                        1                int
      ['padding_mode']                  'zeros'          str

Note that at this point '_parameters' is still empty and 'weight' does not appear in self.__dict__.

I investigated further.

conv.py
Conv2d.__init__
Line 412

    super(Conv2d, self).__init__(
        in_channels, out_channels, kernel_size, stride, padding, dilation,
        False, _pair(0), groups, bias, padding_mode)

conv.py
_ConvNd.__init__
Line 78

        self.weight = Parameter(torch.Tensor(
            out_channels, in_channels // groups, *kernel_size))

module.py
Module.__setattr__
Line 796

        self.register_parameter(name, value)

module.py
Module.register_parameter
Line 315

    elif hasattr(self, name) and name not in self._parameters:
        raise KeyError("attribute '{}' already exists".format(name))

module.py
Module.__getattr__
Line 779

    raise ModuleAttributeError("'{}' object has no attribute '{}'".format(
        type(self).__name__, name))

So the chain is: Module.register_parameter calls hasattr(self, name). Because 'weight' has not been registered yet, the attribute lookup falls through to Module.__getattr__, which raises ModuleAttributeError. That is where the debugger stops.
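The interaction can be reproduced without PyTorch. Below is a minimal sketch of the pattern; MiniModule is an illustrative stand-in, not PyTorch's actual implementation:

```python
from collections import OrderedDict

class MiniModule:
    """Simplified stand-in for torch.nn.Module's attribute machinery."""
    def __init__(self):
        # Parameters live in a side dict, not directly in __dict__.
        object.__setattr__(self, "_parameters", OrderedDict())

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails,
        # like Module.__getattr__ at module.py line 779.
        params = self.__dict__["_parameters"]
        if name in params:
            return params[name]
        raise AttributeError("'{}' object has no attribute '{}'".format(
            type(self).__name__, name))

    def register_parameter(self, name, value):
        # Mirrors the hasattr guard at module.py line 315.
        if hasattr(self, name) and name not in self._parameters:
            raise KeyError("attribute '{}' already exists".format(name))
        self._parameters[name] = value

m = MiniModule()
# hasattr triggers __getattr__, which raises AttributeError internally;
# hasattr swallows it and returns False, so registration succeeds.
m.register_parameter("weight", object())
print(m.weight is m._parameters["weight"])  # True
```

Running this outside a debugger completes without any visible error, which matches the release-mode and command-prompt behaviour above; the AttributeError is raised and caught entirely inside the hasattr call.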

How can I fix it?

I found that I can continue debugging past the error when it occurs.
I think hasattr is catching the exception.