I am working on an image which I have converted into its bits, giving a 1*2000001 array of 'char' data type. I need to use interp() to interpolate it by a factor of 3, which should make it 1*6000003 (interp returns an output 3 times the length of its input). I converted it into 'double' data type as follows:
interp() accepts only 'double' inputs, but I need to do my further processing on the bits as 'char'. After converting back, all the values appear corrupted, meaning the bits are now useless. I am confused about what the exact problem is. In short: I have a 1*2000001 'char' array which I need to interpolate by a factor of 3, but the conversion from 'char' to 'double' and back again seems to corrupt the data bits, making them useless.
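For reference, here is a minimal sketch of the round trip being described. The variable names and the use of double()/char() for the conversions are my assumptions about the original code; interp() is from the Signal Processing Toolbox:

```matlab
% Stand-in for the real bit stream: a 1x2000001 char array of '0'/'1'
bits = char('0' + randi([0 1], 1, 2000001));

x = double(bits);      % char -> double: each char becomes its character
                       % code (48 for '0', 49 for '1'), not the bit value

y = interp(x, 3);      % lowpass FIR interpolation by a factor of 3;
                       % output is 1x6000003 and is NOT integer-valued
                       % (the filter produces values between the samples)

out = char(round(y));  % double -> char: rounding is needed before the
                       % conversion, otherwise intermediate values map
                       % to arbitrary character codes
```

Note that this sketch only reproduces the pipeline as described in the question; whether rounding back to char is meaningful depends on what the "bits" are supposed to represent after interpolation.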