Why does my training data build show multiple lines in the output?

I’m building a training data set. While it’s running, the progress output shows a new line for every refresh instead of a single bar updating in place.

Why won’t it hold to one line and show the progress as it goes? Did I code something wrong? Below is my training class:

import os

import numpy as np
from PIL import Image
from tqdm import tqdm

class roof_dataset():
    # raw strings, otherwise \t and \r in the Windows paths are read as escape characters
    claims = r'D:\CIS inspection images 0318\train\roof\claims'
    no_claims = r'D:\CIS inspection images 0318\train\roof\no_claims'
    LABELS = {claims: 1, no_claims: 0}
    training_data = []

    claim_count = 0
    no_claim_count = 0

    def make_training_data(self):
        for label in self.LABELS:
            print(label)
            for f in tqdm(os.listdir(label)):
                if "JPG" in f:
                    try:
                        path = os.path.join(label, f)
                        img = Image.open(path)
                        pic = train_transform(img)  # train_transform is defined elsewhere
                        # one-hot encode the label: np.eye(2)[1] -> [0., 1.]
                        self.training_data.append([pic, np.eye(2)[self.LABELS[label]]])

                        if label == self.claims:
                            self.claim_count += 1
                        elif label == self.no_claims:
                            self.no_claim_count += 1

                    except Exception as e:
                        pass
                        # print(label, f, str(e))

        np.random.shuffle(self.training_data)
        np.save(r"D:\CIS inspection images 0318\self_build\training_data.npy", self.training_data)
        print('claims:', self.claim_count)
        print('no_claims:', self.no_claim_count)
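For completeness, here is a hypothetical driver for the class above (assuming train_transform is a torchvision transform defined earlier in the script):

dataset = roof_dataset()
dataset.make_training_data()
print(len(dataset.training_data))  # sanity check: total images loaded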

That’s about tqdm, not PyTorch, and it’s probably because you are printing label.
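If you do need to print from inside a tqdm loop, tqdm.write prints the message above the bar instead of breaking it. A minimal sketch (the labels and the sleep are just placeholders for the real per-image work):

import time
from tqdm import tqdm

for label in ["claims", "no_claims"]:  # placeholder labels
    tqdm.write(label)                  # goes above the bar without breaking it
    for _ in tqdm(range(100)):
        time.sleep(0.01)               # stand-in for the per-image work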

I took the print(label) out and it’s still doing it. If it’s a tqdm issue, I’ll close this out here. Thanks for pointing that out.

I also use tqdm and sometimes I run into that issue, but I couldn’t figure out why.

Instead of

from tqdm import tqdm

I put:

from tqdm import tqdm_notebook as tqdm
Now it works!

What’s the difference between the two?

I have no idea. I didn’t have time to look into it.

tqdm_notebook was written for Jupyter notebooks, if I’m not mistaken, since the vanilla tqdm might have some kind of weird behavior (like the one mentioned in the thread).
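If it helps: the text-mode tqdm redraws its bar with a carriage return, which many Jupyter front ends render as a fresh line, so you get one line per refresh; tqdm_notebook draws an ipywidgets-based bar that updates in place. Recent tqdm versions can choose the right one automatically; a minimal sketch:

from tqdm.auto import tqdm  # widget bar in notebooks, text bar elsewhere

for _ in tqdm(range(100)):
    pass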