Data transformation (augmentation) for time series dataset

Is there a tutorial or sample code for transforming time series data with the PyTorch library?
The time series data I want to transform is composed of a series of float numbers.
I have already read the tutorial below on transformations for image data, but it does not work for my target data.
https://pytorch.org/tutorials/beginner/data_loading_tutorial.html

FYI, here is one sample of my data.

[0.55696202 0.56188262 0.56672545 0.57091988 0.57389749 0.57708458
0.58002406 0.58012048 0.58117079 0.58409229 0.58957055 0.59602237
0.59838107 0.5982587 0.59565217 0.58845208 0.58221681 0.57783161
0.5783424 0.58207147 0.58547008 0.5893958 0.5931677 0.58961681
0.58892307 0.5872627 0.58231707 0.57949029 0.57573928 0.57400722
0.57159976 0.57262403 0.56972586 0.56785714 0.56632956 0.56658739
0.56803327 0.56845238 0.56718192 0.56777645 0.56398104 0.56238912
0.56257379 0.56375442 0.56375442 0.56501182 0.56357185 0.56298048
0.56449704 0.56423919 0.56342183 0.56316411 0.56098998 0.56191037
0.56183745 0.56426886 0.5640118 0.56342183 0.56124852 0.56242638
0.56408742 0.56779159 0.56812796 0.56998813 0.5679669 0.56906729
0.57008245 0.57210123 0.57025279 0.57058823 0.57 0.57
0.5717647 0.57420494 0.57606132 0.57463127 0.57184923 0.57151265
0.57352941 0.57504414 0.57209847 0.56431054 0.55926352 0.55441008
0.55282695 0.55345912 0.55453501 0.55715935 0.56557377 0.56915833
0.57075471 0.57243816 0.56941176 0.56547269 0.56470588 0.56327251
0.56367924 0.56426886 0.56216853 0.55941176 0.55771496 0.55785124
0.55818074 0.55995274 0.55903187 0.55752212 0.5573286 0.55798816
0.55950266 0.56101895 0.55891059 0.55699941 0.55791962 0.55917159
0.56235154 0.56327985 0.5620178 0.56075874 0.5595732 0.5620915
0.5646218 0.56521739 0.56335514 0.56302021 0.55833333 0.55303933
0.53738317 0.54792899 0.55535714 0.56495828 0.57637231 0.59665871
0.61793826 0.66218585 0.70470588 0.747181 0.76656891 0.79171597
0.80227001 0.77625298 0.71761194 0.6323185 0.57294743 0.53832442
0.52304964 0.52411764 0.52511682 0.52774566 0.53585771 0.54264453
0.54509132 0.54540262 0.54228571 0.54264453 0.54388984 0.54446357
0.54623779 0.5454023 0.5427424 0.5420023 0.54561301 0.55202822
0.55845697 0.56138259 0.56328734 0.56438026 0.56445783 0.56608328
0.56763285 0.56823671 0.56407185 0.53872633 0.52434247 0.52171492
0.53744997 0.56610577 0.5681544 0.56884058 0.56719128 0.55267804
0.52835485 0.51472192 0.50892374 0.52188365 0.52516778 0.50977198
0.49709762 0.48962655 0.49141965 0.50132767 0.52444444 0.56391875
0.57350272 0.57350272 0.5718599 0.57315598 0.57272178 0.57203134
0.57349397 0.57659831 0.57850241 0.57859734 0.57878788 0.58050847
0.58222491 0.58439201 0.5834341 0.58308157 0.58464329 0.58695652
0.58997584 0.59479103 0.59564164 0.59793814 0.60048573 0.60303951
0.60363636 0.60727272 0.60885385 0.60872198 0.61040532 0.61165048
0.61505768 0.61664641 0.61907655 0.62127659 0.62423873 0.62439024
0.62296072 0.60069848 0.57856341 0.55454057 0.52832132 0.50419339
0.5041769 0.50467289 0.50295566 0.50708353 0.50610649 0.50415242
0.50415242 0.50415242 0.50170982 0.4987787 0.49731314 0.49340498
0.49438202 0.49242794 0.48949682 0.48803126 0.48510014 0.48216903
0.48119199 0.47923791 0.47826087 0.47679531 0.47484123 0.47435271
0.47386419 0.47435271 0.47337567 0.47191011 0.47044455 0.46946751
0.47044455 0.47142159 0.47044455 0.47044455 0.46946751 0.46946751
0.47044455 0.47142159 0.47142159 0.47044455 0.47044455 0.47044455
0.47093307 0.47142159 0.47191011 0.47142159 0.47093307 0.46995603
0.47093307 0.47239863 0.47142159 0.47239863 0.47191011 0.47142159
0.47239863 0.47386419 0.47239863 0.47337567 0.47386419 0.47386419
0.47484123 0.47484123 0.47435271 0.47435271 0.47288715 0.47337567
0.47386419 0.47581827 0.47532975 0.47484123 0.47288715 0.47386419
0.47435271 0.47532975 0.47581827 0.47435271 0.47337567 0.47337567
0.47435271 0.47484123 0.47386419 0.47239863 0.47191011 0.47191011
0.47288715 0.47337567 0.47435271 0.47239863 0.47239863 0.47044455
0.47142159 0.47239863 0.47093307 0.47044455 0.46849047 0.46849047
0.46897899 0.47093307 0.46995603 0.46849047 0.46800195 0.46702491
0.46800195 0.46849047 0.46800195 0.46702491 0.46604787 0.46507083
0.46653639 0.46800195 0.46702491 0.46653639 0.46458231 0.46409379
0.46604787 0.46702491 0.46653639 0.46604787 0.46507083 0.46360527
0.46409379 0.46604787 0.46604787 0.46555935 0.46555935 0.46409379
0.46555935 0.46604787 0.46507083 0.46409379 0.46311675 0.46409379
0.46458231 0.46702491 0.46555935 0.46409379 0.46262823 0.46262823
0.46311675 0.46360527 0.46458231 0.46458231 0.46311675 0.46262823
0.46409379 0.46458231 0.46507083 0.46507083 0.46507083 0.46604787
0.46800195 0.46995603 0.47093307 0.47093307 0.47044455 0.47044455
0.47142159 0.47337567 0.47337567 0.47337567 0.47288715 0.47239863
0.47386419 0.47484123 0.47337567 0.47337567 0.47191011 0.47191011
0.47191011 0.47044455 0.46897899 0.46702491 0.46946751 0.46995603
0.47142159 0.47142159 0.46995603 0.46702491 0.46555935 0.46409379
0.46507083 0.46409379 0.46311675 0.46165119 0.46116267 0.46116267
0.46018564 0.46067416 0.4592086 0.45725452 0.456766 0.45774304
0.45823156 0.45969712 0.46018564 0.4592086 0.45969712 0.4592086
0.46018564 0.46116267 0.4592086 0.45969712 0.46018564 0.45823156
0.456766 0.45139228 0.4469956 0.44601856 0.44992672 0.45725452
0.46849047 0.48803126 0.52076209 0.55788959 0.59208598 0.62139716
0.64484611 0.65901319 0.65608207 0.629702 0.57694186 0.51441133
0.45823156 0.43234001 0.42794333 0.43136297 0.43624817 0.44553004
0.456766 0.46262823 0.46165119 0.46018564 0.4592086 0.45872008
0.45969712 0.46116267 0.46018564 0.45725452 0.45530044 0.45481192
0.45578896 0.45530044 0.45578896 0.45530044 0.45334636 0.4543234
0.45481192 0.45530044 0.45578896 0.45530044 0.4543234 0.4543234
0.45481192 0.45530044 0.45578896 0.45481192 0.45285784 0.45481192
0.45530044 0.456766 0.45627748 0.45627748 0.45627748 0.45578896
0.45725452 0.45770171 0.45770171 0.45739471 0.45837414 0.45837414
0.45961821 0.46199117 0.46131244 0.46293569 0.46316306 0.46456693
0.46768623 0.47012345 0.47032641 0.47181009 0.47274529 0.47402276
0.47548291 0.4764735 0.47864945 0.47992067 0.48039702 0.48358209
0.48729447 0.49027431 0.4935 0.49424136 0.49449449 0.49624436
0.50075112 0.5037594 0.50576441 0.50601202 0.50803213 0.50979407
0.5141129 0.51616161 0.51844366 0.51868687 0.51997976 0.5209914
0.52327935 0.52660922 0.52714358 0.52663622 0.52669039 0.52803262
0.5292919 0.53010204 0.53043478 0.53046595 0.52685422 0.52636968
0.52432155 0.52276215 0.52071611 0.51922091 0.51853759 0.51649484
0.51468315 0.51446281 0.51293996 0.51086956 0.50908147 0.50778816
0.50622406 0.50570539 0.50571132 0.50363825 0.502079 0.50260688
0.50365726 0.50444793 0.50419287 0.50340849 0.50288411 0.50235972
0.50314795 0.50393701 0.50393701 0.50499212 0.50368033 0.50448076
0.50580781 0.50872554 0.5095339 0.50901378 0.50955414 0.51008492
0.51143009 0.51115834 0.51302498 0.51144225 0.51251998 0.51523249
0.51768488 0.51907576 0.51993534 0.52021563 0.5210356 0.52049622
0.52159827 0.52293578 0.52213823 0.52272727 0.52218614 0.52251763
0.52637303 0.52889858 0.53118162 0.52954048 0.53041096 0.52957283
0.5312843 0.53241758 0.5321252 0.5330033 0.53395914 0.53314917
0.53692393 0.53777777 0.53927576 0.53962053 0.53773057 0.53773057
0.53889199 0.54040404 0.54183043 0.54258319 0.54473386 0.54545454
0.54820308 0.55071633 0.55459272 0.55860058 0.56601539 0.58302808
0.59074074 0.5870098 0.58511287 0.58399511 0.58099878 0.57722592
0.57713248 0.579043 0.58302808 0.58692971 0.58823529 0.58627087
0.58605799 0.58703703 0.58103975 0.57489387 0.56953642 0.56630824
0.56387403 0.56264775 0.55941176 0.55373831 0.54797688 0.54216867
0.54207212 0.54223744 0.54311822 0.54057143 0.5409273 0.54216867
0.54388984 0.54566341 0.54597701 0.54388984 0.54372842 0.54909936
0.55673133 0.5620178 0.56317044 0.5674865 0.56859205 0.5686747
0.56970428 0.57559198 0.58220859 0.58980733 0.59324155 0.59760705
0.60380952 0.6049461 0.6063492 0.60684844 0.60542929 0.59295861
0.58755338 0.58555825 0.58363636 0.58712811 0.58734023 0.59132559
0.59436274 0.60049474 0.5974106 0.59277403 0.59229828 0.59643734
0.6 0.60485376 0.60714285 0.60287679 0.60288582 0.59528243
0.58277744 0.59334565 0.59641532 0.60186916 0.5864799 0.56847697
0.56158785 0.56386109 0.53733857 0.51853871 0.51433207 0.52631579
0.51748634 0.52262693 0.52914798 0.55152225 0.56873479 0.55926146
0.55973715 0.53837342 0.52455357 0.5155631 0.53753581 0.55667655
0.56354916 0.56918429 0.57221206 0.57998764 0.58709273 0.58515283
0.59059561 0.5799508 0.57238442 0.56694813 0.57090687 0.56632344
0.55988024 0.55741626 0.54875148 0.54709058 0.5482509 0.55737705
0.57125382 0.58461538 0.59022329 0.60959715 0.66364734 0.71743119
0.73621813 0.77046263 0.77984857 0.7936879 0.78329439 0.75680473
0.68969555 0.62551683 0.57510472 0.54380664 0.53719512 0.53719512
0.52492668 0.53964497 0.56002401 0.57272727 0.57875458 0.58492896
0.59222082 0.60408684 0.6051282 0.60656793 0.60841424 0.61357702
0.61548556 0.62217795 0.61914191 0.61741424 0.61220472 0.60531432
0.61031331 0.6192994 0.62078094 0.6095176 0.61060209 0.61326329
0.61604207 0.61508196 0.61523309 0.61377049 0.61180327 0.61031331
0.61096605 0.61201828 0.59681528 0.58893777 0.59188846 0.60872395
0.61031331 0.6153342 0.61442623 0.61402359 0.61362148 0.61725955
0.6061776 0.60025461 0.60760309 0.59406565 0.56347305 0.53987378
0.52881925 0.52672605 0.54316752 0.57821059 0.61808718 0.6132989
0.61282051 0.61424903 0.61331626 0.60632911 0.60100376 0.6
0.59962756 0.59938271 0.59620098 0.59365466 0.59499083 0.599019
0.60171568 0.60024375 0.60329067 0.60414129 0.60882894 0.61642989
0.6209029 0.625 0.630625 0.62623762 0.62300123 0.6207317
0.61729141 0.6179302 0.61607678 0.6160287 0.61263408 0.61488095
0.6168947 0.62149253 0.62786489 0.63960639 0.64596273 0.65269086
0.65106117 0.65435191 0.65432873 0.65658475 0.65203761 0.64853033
0.64490049 0.64588528 0.65277777 0.65167827 0.65054105 0.64907819
0.64240506 0.6317757 0.63299874 0.63417721 0.62421581 0.620625
0.61860174 0.61706102 0.60990712 0.60529556 0.59597806 0.59306569
0.59499083 0.60247678 0.6127204 0.60957178 0.6113924 0.61023373
0.60741206 0.60678392 0.60960202 0.60691824 0.5988771 0.5855143
0.57628128 0.56623681 0.56025492 0.5552359 0.55154639 0.54940034]

What kind of transformation would you like to apply?
Normalizing the time series data should be similar to normalizing images.
If you need data augmentation (adding noise, flipping, etc.), you could implement these augmentations quite easily yourself. Let me know if you get stuck somewhere or would like to discuss a specific (complicated) transformation.
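For example, a minimal sketch of what noise injection and flipping could look like for a 1D NumPy array (just an illustration, nothing library-specific):

import numpy as np


def add_gaussian_noise(series, sigma=0.01):
    # Jitter: add zero-mean Gaussian noise with standard deviation sigma
    return series + np.random.normal(loc=0.0, scale=sigma, size=series.shape)


def flip(series):
    # Reverse the series along the time axis
    return series[::-1].copy()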

Can torchvision.transforms from the torchvision package be used for numerical data as well?

Some can be used on tensors (e.g. Normalize), while most of the transformations are written for image data.
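For a 1D signal you could reshape it to the (C, H, W) layout that Normalize expects, or simply normalize it manually. A rough sketch (the mean/std values here are just placeholders):

import torch
from torchvision import transforms

x = torch.rand(360)  # placeholder series of floats

# View the series as a single-channel, single-row "image" so Normalize accepts it
norm = transforms.Normalize(mean=[0.5], std=[0.2])
x_norm = norm(x.view(1, 1, -1)).view(-1)

# Or just normalize by hand with the tensor's own statistics
x_manual = (x - x.mean()) / x.std()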

Jittering code

import numpy as np


class Jittering(object):
    """Add zero-mean Gaussian noise to the 'data' field of a sample."""

    def __init__(self, sigma):
        assert isinstance(sigma, (float, tuple))
        self.sigma = sigma

    def __call__(self, sample):
        data, label = sample['data'], sample['label']
        if isinstance(self.sigma, float):
            # Draw noise with the same shape as the data and add it element-wise
            noise = np.random.normal(loc=0, scale=self.sigma, size=data.shape)
            data = data + noise
        return {'data': data, 'label': label}

Create Dataset instance

from torchvision import transforms

dataset = Dataset(data_csv='data/train_data.csv',
                  label_csv='data/train_label.csv',
                  transform=transforms.Compose([Jittering(0.3)]))
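For context, a minimal sketch of a custom Dataset with this constructor signature (the CSV loading and the way the transform is applied are assumptions for illustration only):

import numpy as np
import pandas as pd
from torch.utils.data import Dataset as TorchDataset


class TimeSeriesDataset(TorchDataset):
    # Hypothetical Dataset matching the constructor arguments used above:
    # CSV paths for data and labels plus an optional transform.
    def __init__(self, data_csv, label_csv, transform=None):
        self.data = pd.read_csv(data_csv).values.astype(np.float32)
        self.labels = pd.read_csv(label_csv).values.astype(np.float32)
        self.transform = transform

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        sample = {'data': self.data[idx], 'label': self.labels[idx]}
        if self.transform is not None:
            sample = self.transform(sample)
        return sample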

Is this a common approach?

If I use such an augmentation via transforms.Compose, will both the augmented data and the original data be used during the training phase? I am really wondering about this.

Your transformation looks alright!
If you pass it to your Dataset, it will be applied to every sample on the fly, so in the usual case only the transformed sample is used for training (not both the original and the augmented one).
Of course it depends on how you've implemented __getitem__ in your custom Dataset class, i.e. you could apply the transformation only randomly. Alternatively, you could use torchvision.transforms.RandomApply to apply your transformation randomly, for example as in the sketch below.
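A sketch, assuming the sample-dict format your Jittering transform uses:

import torchvision.transforms as transforms

# Apply the Jittering transform from above with probability p; RandomApply
# simply calls the wrapped transform(s), so the dict sample is passed through.
transform = transforms.RandomApply([Jittering(0.3)], p=0.5)

# Or decide inside the Dataset's __getitem__ (using the standard `random` module):
#
#     if self.transform is not None and random.random() < 0.5:
#         sample = self.transform(sample)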

I really appreciate your help!