Too many open files when using MNIST dataloader

I want to perform some post-processing on each of the MNIST digits, but I get OSError: [Errno 24] Too many open files. I am using pytorch==0.4.1 and have tried this with both Python 2.7 and 3.6.

I have also tried the other suggestions mentioned in this, i.e. I have rebooted my Ubuntu server and tried raising the limit with ulimit, but I get ulimit: value exceeds hard limit whenever I try to set anything above 4096, even though running ulimit on its own reports unlimited. Is there any other solution for this?
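
For reference, the limits can also be inspected from Python with the standard resource module; this is just a diagnostic sketch. (Note that ulimit with no arguments reports the maximum file size, which is why it prints unlimited; the relevant value here is the open-files limit.)

import resource

# Per-process limit on open file descriptors, returned as (soft, hard).
# An unprivileged process can raise the soft limit only up to the hard
# limit, which is why ulimit refuses values above 4096 here.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print('soft open-file limit:', soft)
print('hard open-file limit:', hard)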

My code to get the train and the test loaders:

import torch
from torch.utils.data import DataLoader
from torchvision.utils import save_image
import torchvision.datasets as dset
import torchvision.transforms as transforms

root = './data'  # placeholder path; directory the MNIST files are downloaded to
trans = transforms.ToTensor()

# Train and test sets share the same transform; download once for both.
train_set = dset.MNIST(root=root, train=True, transform=trans, download=True)
test_set = dset.MNIST(root=root, train=False, transform=trans)

# One image per batch, loaded by 4 worker processes.
train_loader = DataLoader(dataset=train_set, batch_size=1, shuffle=True, num_workers=4)
test_loader = DataLoader(dataset=test_set, batch_size=1, shuffle=True, num_workers=4)
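
For completeness, a minimal loop of the kind I run over the loaders looks roughly like this; the output path is a placeholder and the actual post-processing is omitted:

import os

# Hypothetical minimal loop: write every digit back out to disk.
# The real post-processing would happen in place of save_image.
os.makedirs('out', exist_ok=True)
for i, (img, label) in enumerate(train_loader):
    save_image(img, os.path.join('out', 'digit_{}.png'.format(i)))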

I was able to work around this by increasing the hard limit, following the directions mentioned here.
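
In case it helps others, the soft limit can also be raised up to the existing hard limit from inside the script itself; a minimal sketch:

import resource

# Bump this process's soft open-file limit up to the configured hard limit.
# Going beyond the hard limit still requires changing it at the OS level.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))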