I'm generating an image using Quartz 2D and I want to use it as an OpenGL texture.
The tricky part is that I want to use as few bits per pixel as possible, so I'm creating the CGContext as follows:
int bitsPerComponent = 5;
int bytesPerPixel = 2;
int width = 1024;
int height = 1024;
void* imageData = malloc(width * height * bytesPerPixel);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(imageData, width, height, bitsPerComponent, width * bytesPerPixel, colorSpace, kCGImageAlphaNoneSkipFirst);
// draw things into the context, release memory, etc. (sketched below)
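For completeness, a minimal sketch of what that last comment elides; the red fill is just placeholder content, not part of my actual drawing code:

// Placeholder drawing: fill the whole bitmap with opaque red.
CGContextSetRGBFillColor(context, 1.0f, 0.0f, 0.0f, 1.0f);
CGContextFillRect(context, CGRectMake(0, 0, width, height));
// Release the Quartz objects; imageData itself stays valid until free().
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);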
As stated in the documentation here, this is the only supported RGB pixel format for CGBitmapContextCreate that uses 16 bits per pixel.
So now I want to upload this imageData, whose layout is "1 bit skipped - 5 bits red - 5 bits green - 5 bits blue", into an OpenGL texture. So I should do something like this:
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
// The type parameter below assumes an RGBA5551 component order.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_SHORT_5_5_5_1, imageData);
That won't work, because in this call I've specified the pixel format as 5 red - 5 green - 5 blue - 1 alpha. That is wrong, but it appears that there is no format that would match the Core Graphics output. There are some other options like GL_UNSIGNED_SHORT_1_5_5_5_REV, but those won't work on the iPhone.
I need to somehow use this imageData as a texture, but I really don't want to swap the bits around manually, because that seems terribly inefficient.
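For reference, this is roughly what the manual conversion I'd like to avoid would look like, assuming the pixels can be treated as 16-bit values in host byte order (on a little-endian iPhone this depends on the bitmap's byte-order flags):

// Convert each pixel in place from "1 skipped - 5R - 5G - 5B" (Core Graphics)
// to "5R - 5G - 5B - 1 alpha" (GL_UNSIGNED_SHORT_5_5_5_1): shift the color
// bits up by one and force the alpha bit to 1.
uint16_t* pixels = (uint16_t*)imageData;
for (int i = 0; i < width * height; i++) {
    pixels[i] = (uint16_t)((pixels[i] << 1) | 1);
}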