Someone posted a sad comment the other day mourning the loss of Flash content on this blog. Sorry, I follow what I’m into. Sometimes that’s Flash, sometimes (very briefly) it was Silverlight. I think I even got into Python here for a while. The iPhone stuff may be a diversion, or it may be my future path. Time will tell.
Anyway, for an app I’m working on, I needed to take a screenshot of OpenGL ES rendered content. I assumed there was some built-in function for that, but much searching led me to the conclusion that it’s a roll-your-own kind of thing. So, after a couple of days of piecing together three or four semi-working solutions from various forums and mailing lists, plus some hacking about, I came up with a solution that actually works.
The first and last steps are easy.
First step, you read the GL data into a raw byte array with glReadPixels. Simple enough.
Last step, you save a UIImage to the Photo Album with UIImageWriteToSavedPhotosAlbum.
The tough part is getting that byte array into a UIImage. My first attempt was [UIImage imageWithData:data]. The problem is that that method expects data in one of UIImage’s supported file formats (PNG, JPEG, and so on), whereas glReadPixels gives you raw pixel data.
Digging around some more, I found [UIImage imageWithCGImage:imageRef]. You can get a CGImageRef with CGImageCreate.
CGImageCreate requires a CGDataProviderRef. And you can create one of those with CGDataProviderCreateWithData, using the results from glReadPixels! Finally, a path from one end to the other.
glReadPixels -> CGDataProviderCreateWithData -> CGImageCreate -> [UIImage imageWithCGImage:] -> UIImageWriteToSavedPhotosAlbum
Yay!
But wait. One more snag. OpenGL uses standard Cartesian coordinates. In other words, +Y is up, -Y is down, while image data is stored top row first. So the byte array you get from glReadPixels (and thus your final image) will be upside down. A row-by-row copy into a second buffer fixes that up. Here are the final methods, meant to live in a UIView backed by a CAEAGLLayer (like the EAGLView class in the OpenGL ES template project).
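Conceptually, the flip is just copying pixel rows in reverse order into a second buffer. Here’s a minimal plain-C sketch of that one step (flip_vertical is my own illustrative name, not an API; assumes 4 bytes per pixel):

```c
#include <stdlib.h>
#include <string.h>

/* Flip a raw RGBA buffer top-to-bottom by copying whole rows in
   reverse order. width and height are in pixels; 4 bytes per pixel. */
static void flip_vertical(const unsigned char *src, unsigned char *dst,
                          int width, int height) {
    int rowBytes = width * 4;
    for (int y = 0; y < height; y++) {
        /* row y of the source becomes row (height - 1 - y) of the dest */
        memcpy(dst + (height - 1 - y) * rowBytes,
               src + y * rowBytes,
               rowBytes);
    }
}
```

Copying whole rows with memcpy does the same thing as a byte-by-byte loop, just in one call per row.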
[c]// release callback for the CGDataProvider below. Core Graphics calls
// this when it's done with the pixel data, so the flipped buffer gets
// freed at the right time instead of leaking (or crashing if freed early).
static void releaseScreenshotData(void *info, const void *data, size_t size) {
    free((void *)data);
}

- (UIImage *)glToUIImage {
    NSInteger myDataLength = 320 * 480 * 4;

    // allocate array and read pixels into it.
    GLubyte *buffer = (GLubyte *)malloc(myDataLength);
    glReadPixels(0, 0, 320, 480, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    // gl renders "upside down" so swap top to bottom into new array.
    GLubyte *buffer2 = (GLubyte *)malloc(myDataLength);
    for (int y = 0; y < 480; y++) {
        for (int x = 0; x < 320 * 4; x++) {
            buffer2[(479 - y) * 320 * 4 + x] = buffer[y * 320 * 4 + x];
        }
    }
    free(buffer); // done with the unflipped copy

    // make data provider with data. buffer2 is freed by the callback.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, releaseScreenshotData);

    // prep the ingredients
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * 320;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // make the cgimage
    CGImageRef imageRef = CGImageCreate(320, 480, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);

    // then make the uiimage from that
    UIImage *myImage = [UIImage imageWithCGImage:imageRef];

    // the UIImage retains what it needs, so release our references
    CGImageRelease(imageRef);
    CGColorSpaceRelease(colorSpaceRef);
    CGDataProviderRelease(provider);

    return myImage;
}

- (void)captureToPhotoAlbum {
    UIImage *image = [self glToUIImage];
    UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
}[/c]

I’m pretty proud of myself for figuring this all out. And it works great. But for the love of God, if there’s an easier way, please let me know. It really, really, really seems like there should be. And if you pros see anything horrendous in there, memory-wise or otherwise, let me know. One thing I did learn about malloc: memory you malloc is not freed when it goes out of scope, so both buffers need an explicit free. You can’t free buffer2 right after handing it to the data provider, though (that crashes, since the provider still references those bytes); that’s what the release callback at the top is for.
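On the malloc question: no, malloc’d memory is never freed automatically when the pointer goes out of scope; it lives until an explicit free. A tiny plain-C demo (make_buffer is a made-up name for illustration):

```c
#include <stdlib.h>
#include <string.h>

/* The pointer p goes out of scope when this function returns, but the
   allocation it points to does not. The caller gets a still-valid
   pointer and is responsible for free()ing it. */
static char *make_buffer(void) {
    char *p = (char *)malloc(6);
    strcpy(p, "hello");
    return p;
}
```

So skipping the frees doesn’t clean itself up; it just leaks a buffer on every screenshot.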