How to convert [0,1] range data to [-1, 1]

I have data that is minmax scaled, i.e. in the range [0, 1].
For some purpose I need this data to be standard scaled, which I understood to mean in the range [-1, 1].
But I do not have access to the data before it was minmax scaled.

Can I simply do the operation:

x = (x - mean) / std

where mean = mean of the data in the range [0, 1]
and std = standard deviation of the data in the range [0, 1]

I mean, if I apply this operation to all the data points now in the range [0, 1],
will that be equivalent to standard scaling the original data (before it was minmax scaled)?
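In code, the operation I am asking about would be something like this (a small numpy sketch with made-up example values; `x_minmax` stands for my already minmax-scaled data):

```python
import numpy as np

# x_minmax: data already scaled to [0, 1] (example values)
x_minmax = np.array([0.0, 0.25, 0.5, 0.75, 1.0])

# the operation in question: standardize the minmax-scaled data
mean = x_minmax.mean()
std = x_minmax.std()
x_standardized = (x_minmax - mean) / std

print(x_standardized.mean())  # ~0
print(x_standardized.std())   # ~1
```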

It seems I misunderstood these concepts.
I did a bit more study; could you verify the following:

1. Standard Scaling

x = (x - mean) / std

makes the data have zero mean and unit variance (and hence unit standard deviation)

2. Minmax scaling

rescales the data to lie between a chosen min and max

3. Z-score normalization

same as standard scaling

4. [0, 1] scaling

same as minmax scaling, but with the target min and max set to 0 and 1 respectively
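As a quick sanity check on points 1-4, here is a small numpy sketch with arbitrary example data:

```python
import numpy as np

x = np.array([3.0, 7.0, 1.0, 9.0, 5.0])  # arbitrary original data

# points 1 & 3: standard scaling / z-score normalization
z = (x - x.mean()) / x.std()
print(z.mean(), z.std())  # ~0.0, ~1.0

# points 2 & 4: minmax scaling to [0, 1]
m = (x - x.min()) / (x.max() - x.min())
print(m.min(), m.max())   # 0.0, 1.0
```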

And it turns out that standard scaling the minmax-scaled data is equivalent to standard scaling the original data.


Yes, you are right.
In fact, it is common practice to first scale data to [0, 1] and then compute the mean and std to obtain the z-score. Since minmax scaling is just a positive linear (affine) transform, standardizing afterwards gives the same result as standardizing the original data.
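You can check that equivalence numerically (a numpy sketch with arbitrary example data):

```python
import numpy as np

x = np.array([10.0, 20.0, 35.0, 50.0, 80.0])  # "original" data
x_mm = (x - x.min()) / (x.max() - x.min())    # minmax scaled to [0, 1]

# standardize the original data and the minmax-scaled data
z_original = (x - x.mean()) / x.std()
z_minmax = (x_mm - x_mm.mean()) / x_mm.std()

# the two results match, because the affine transform cancels out
print(np.allclose(z_original, z_minmax))  # True
```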

But one thing I would like to mention is that applying the z-score does not necessarily convert your data to [-1, 1]. Please see the post below:
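For instance, with data containing a relatively large value, some standardized values fall outside [-1, 1] (a small numpy illustration with made-up data):

```python
import numpy as np

# minmax-scaled data with one value far from the rest
x = np.array([0.1, 0.2, 0.2, 0.3, 1.0])
z = (x - x.mean()) / x.std()

print(z)              # the largest value's z-score exceeds 1
print(z.max() > 1.0)  # True
```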