Source: http://cocoaswirl.com/2009/05/20/iphone-opengl-speed-tip-turn-off-thumb-instructions/

 

Want to boost your iPhone OpenGL app’s framerate with one checkbox?  It’s easy; turn off Thumb instructions.

What are Thumb instructions?

The iPhone uses an ARM1176JZF-S processor, and Thumb instructions are 16-bit encodings of common 32-bit ARM instructions.  By default, your Xcode project compiles with Thumb instructions enabled.

Why use Thumb instructions?

On embedded systems like the iPhone (or any system, really, but here especially), you have to think about the space your app uses.  Smaller instructions mean smaller code in memory and on disk.  That’s a good thing!  However, there’s a trade-off: performance.

According to Apple, the cost comes from floating-point operations: on the iPhone’s ARMv6 core, Thumb code can’t use the floating-point (VFP) instructions directly, so float-heavy code like OpenGL math pays the penalty.  Ripping the GLfloats out of your app isn’t the way to go, so let’s learn a better way.

How do I turn off Thumb instructions?

Here’s what to do in Xcode:

  1. Open your project
  2. Choose Project -> Edit Project Settings
  3. In the Project Info window, choose the Build tab
  4. In the search box, type “thumb”
  5. You should see a “Compile for Thumb” setting.  Uncheck it.
  6. Clean and rebuild your project.

That’s it!  If you don’t see the setting, make sure the Active SDK is set to Device; the setting isn’t applicable to the Simulator.
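If you prefer editing build settings directly, this checkbox should correspond to the GCC_THUMB_SUPPORT build setting (my assumption for the Xcode of this era; double-check the name in your version), so the same change can be made in an .xcconfig file or in per-configuration build settings:

// Device configurations only; the setting has no effect on the Simulator
GCC_THUMB_SUPPORT = NO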

What kind of frame rate boost will I see?

I saw framerate improvements ranging from roughly 20% to 50%, depending on the app.  Hopefully you will see even bigger ones!



Source: http://blog.naver.com/PostView.nhn?blogId=eclove33&logNo=50092622625
Posted by 오늘마감
[iPhone App Development] OpenGL ES for iPhone : Part 3 with Accelerometer control

OpenGL ES for iPhone : Part 3 with Accelerometer control

In this Part 3, we will add accelerometer control to move the ellipse object that we created in Part 2 of the tutorial.



1) UIAccelerometerDelegate
We need to add the UIAccelerometerDelegate protocol to EAGLView and implement the accelerometer:didAccelerate: method, as shown below:


@interface EAGLView : UIView <UIAccelerometerDelegate>

- (void)accelerometer:(UIAccelerometer*)accelerometer didAccelerate:(UIAcceleration*)acceleration;


We need to configure and start the accelerometer in the setupView method

[[UIAccelerometer sharedAccelerometer] setUpdateInterval:(1.0 / kAccelerometerFrequency)];
[[UIAccelerometer sharedAccelerometer] setDelegate:self];


2) Accelerometer values
Inside the accelerometer:didAccelerate: method, we apply a low-pass filter to the accelerometer values. This is a simple exponential smoothing: each new reading contributes only kFilteringFactor (10%) to the filtered value, which keeps the slowly changing gravity component and damps out hand jitter. The filter code is taken from Apple's GLGravity sample code.

//Use a basic low-pass filter in the accelerometer values
accel[0] = acceleration.x * kFilteringFactor + accel[0] * (1.0 - kFilteringFactor);
accel[1] = acceleration.y * kFilteringFactor + accel[1] * (1.0 - kFilteringFactor);
accel[2] = acceleration.z * kFilteringFactor + accel[2] * (1.0 - kFilteringFactor);


The meaning of accelerometer values:

acceleration.x = Roll. It corresponds to roll, or rotation around the axis that runs from your home button to your earpiece. Values vary from 1.0 (rolled all the way to the right) to -1.0 (rolled all the way to the left).

acceleration.y = Pitch. Place your iPhone on the table and mentally draw a horizontal line about half-way down the screen. That's the axis around which the Y value rotates. Values go from 1.0 (the headphone jack straight down) to -1.0 (the headphone jack straight up).

acceleration.z = Face up/face down. It refers to whether your iPhone is face up (-1.0) or face down (1.0). When the iPhone is placed on its side, either the side with the volume controls and ringer switch or the side directly opposite, the Z value equates to 0.0.

3) Movement of the ellipse is controlled through the variables moveX and moveY, and the ellipse position is updated according to the acceleration.x (that is, accel[0]) and acceleration.y (accel[1]) values passed in by the accelerometer after the low-pass filter. The larger the absolute value of acceleration.x or acceleration.y, the greater the magnitude of moveX or moveY, and therefore the faster the ellipse moves in that direction. Since the object should not move beyond the screen, ellipseData.pos.x and ellipseData.pos.y are clamped to the screen boundaries.

 ellipseData.pos.x += moveX;
 if (accel[0] > -0.1 && accel[0] < 0.1 ) {
   moveX = 0.0f;
 }
 else {
  moveX = 10.0f * accel[0];
 }

 ellipseData.pos.y += moveY;
 if (accel[1] > -0.1 && accel[1] < 0.1 ) {
   moveY = 0.0f;
 }
 else {
   moveY = -10.0f * accel[1];
 }


4) Conditional compilation code for the iPhone Simulator and on-screen debug info
As the iPhone Simulator has no accelerometer, the code that changes the ellipse position on the Simulator is wrapped in this compiler directive, so that the ellipse keeps moving there as well.
  #if TARGET_IPHONE_SIMULATOR 

Moreover, we have added a UILabel so that we can read the accelerometer values while debugging the program on an actual device. This UILabel can be disabled using this define directive:
  #undef DEBUGSCREEN

5) The full source is below. Just create a new project from Xcode's OpenGL ES Application template, replace the contents of EAGLView.h and EAGLView.m with the code below, and Build & Go. The accelerometer control can only be tested on an actual device.



EAGLView.h

// EAGLView.h
// OpenGL ES Tutorial - Part 3 by javacom


// To enable debug NSLog output, add DEBUGON to GCC_PREPROCESSOR_DEFINITIONS in the project settings for the Debug build only, and use DEBUGLOG() in place of NSLog()
#ifdef DEBUGON
#define DEBUGLOG if (DEBUGON) NSLog
#else
#define DEBUGLOG
#endif

#define DEBUGSCREEN

#import <UIKit/UIKit.h>
#import <OpenGLES/EAGL.h>
#import <OpenGLES/ES1/gl.h>
#import <OpenGLES/ES1/glext.h>

typedef struct
{
BOOL rotstop; // stop self rotation
BOOL touchInside; // finger tap inside of the object ?
BOOL scalestart; // start to scale the object?
CGPoint pos; // position of the object on the screen
CGPoint startTouchPosition; // Start Touch Position
CGPoint currentTouchPosition; // Current Touch Position
GLfloat pinchDistance; // distance between two fingers pinch
GLfloat pinchDistanceShown; // distance that have shown on screen
GLfloat scale; // OpenGL scale factor of the object
GLfloat rotation; // OpenGL rotation factor of the object
GLfloat rotspeed; // control rotation speed of the object
} ObjectData;

/*
This class wraps the CAEAGLLayer from CoreAnimation into a convenient UIView subclass.
The view content is basically an EAGL surface you render your OpenGL scene into.
Note that setting the view non-opaque will only work if the EAGL surface has an alpha channel.
*/
@interface EAGLView : UIView {

@private
/* The pixel dimensions of the backbuffer */
GLint backingWidth;
GLint backingHeight;

EAGLContext *context;

/* OpenGL names for the renderbuffer and framebuffers used to render to this view */
GLuint viewRenderbuffer, viewFramebuffer;

/* OpenGL name for the depth buffer that is attached to viewFramebuffer, if it exists (0 if it does not exist) */
GLuint depthRenderbuffer;

NSTimer *animationTimer;
NSTimeInterval animationInterval;

@public
ObjectData squareData;
ObjectData ellipseData;
GLfloat ellipseVertices[720];
CGFloat initialDistance;
UIAccelerationValue accel[3];
GLfloat moveX, moveY;
#ifdef DEBUGSCREEN
UILabel *textView;
#endif
}

@property NSTimeInterval animationInterval;

@property (nonatomic) ObjectData squareData;
@property (nonatomic) ObjectData ellipseData;
@property CGFloat initialDistance;
#ifdef DEBUGSCREEN
@property (nonatomic, assign) UILabel *textView;
#endif

- (void)startAnimation;
- (void)stopAnimation;
- (void)drawView;
- (void)setupView;

@end


EAGLView.m

// EAGLView.m
// OpenGL ES Tutorial - Part 3 by javacom
//
#import <QuartzCore/QuartzCore.h>
#import <OpenGLES/EAGLDrawable.h>

#import "EAGLView.h"

#include <math.h>

// Macros
#define degreesToRadians(__ANGLE__) (M_PI * (__ANGLE__) / 180.0)
#define radiansToDegrees(__ANGLE__) (180.0 * (__ANGLE__) / M_PI)

CGFloat distanceBetweenPoints (CGPoint first, CGPoint second) {
CGFloat deltaX = second.x - first.x;
CGFloat deltaY = second.y - first.y;
return sqrt(deltaX*deltaX + deltaY*deltaY );
};

CGFloat angleBetweenPoints(CGPoint first, CGPoint second) {
// atan((top - bottom)/(right - left))
CGFloat rads = atan((second.y - first.y) / (first.x - second.x));
return radiansToDegrees(rads);
}

CGFloat angleBetweenLines(CGPoint line1Start, CGPoint line1End, CGPoint line2Start, CGPoint line2End) {

CGFloat a = line1End.x - line1Start.x;
CGFloat b = line1End.y - line1Start.y;
CGFloat c = line2End.x - line2Start.x;
CGFloat d = line2End.y - line2Start.y;

CGFloat rads = acos(((a*c) + (b*d)) / ((sqrt(a*a + b*b)) * (sqrt(c*c + d*d))));

return radiansToDegrees(rads);
}

#define USE_DEPTH_BUFFER 0

// CONSTANTS
#define kMinimumTouchLength 30
#define kMaximumScale 7.0f
#define kMinimumPinchDelta 15
#define kAccelerometerFrequency 100.0 // Hz
#define kFilteringFactor 0.1


// A class extension to declare private methods
@interface EAGLView ()

@property (nonatomic, retain) EAGLContext *context;
@property (nonatomic, assign) NSTimer *animationTimer;

- (BOOL) createFramebuffer;
- (void) destroyFramebuffer;

@end


@implementation EAGLView

@synthesize context;
@synthesize animationTimer;
@synthesize animationInterval;
@synthesize squareData;
@synthesize ellipseData;
@synthesize initialDistance;
#ifdef DEBUGSCREEN
@synthesize textView;
#endif

// You must implement this method
+ (Class)layerClass {
return [CAEAGLLayer class];
}


//The GL view is stored in the nib file. When it's unarchived it's sent -initWithCoder:
- (id)initWithCoder:(NSCoder*)coder {

if ((self = [super initWithCoder:coder])) {

// Get the layer
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;

eaglLayer.opaque = YES;
eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking, kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];

context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];

if (!context || ![EAGLContext setCurrentContext:context]) {
[self release];
return nil;
}

animationInterval = 1.0 / 60.0;
[self setupView];
}
return self;
}

// The four methods touchesBegan, touchesMoved, touchesEnded, and touchesCancelled notify the view about touches and gestures

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
/*
NSUInteger numTaps = [[touches anyObject] tapCount]; // number of taps
NSUInteger numTouches = [touches count]; // number of touches
*/
UITouch *touch = [[touches allObjects] objectAtIndex:0];

DEBUGLOG(@"TouchBegan event counts = %d ",[[event touchesForView:self] count]);
DEBUGLOG(@"TouchBegan tounches counts = %d ",[touches count]);
if ([touches count]== 2) {
NSArray *twoTouches = [touches allObjects];
UITouch *first = [twoTouches objectAtIndex:0];
UITouch *second = [twoTouches objectAtIndex:1];
initialDistance = distanceBetweenPoints([first locationInView:self], [second locationInView:self]);
squareData.rotstop = YES;
squareData.touchInside = NO;
}
else if ([touches count]==[[event touchesForView:self] count] && [[event touchesForView:self] count] == 1) {
squareData.startTouchPosition = [touch locationInView:self];
if (distanceBetweenPoints([touch locationInView:self], squareData.pos) <= kMinimumTouchLength * squareData.scale) {
DEBUGLOG(@"Square Touch at %.2f, %.2f ",squareData.pos.x,squareData.pos.y);
squareData.rotstop = YES;
squareData.touchInside = YES;
}
}

}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [[touches allObjects] objectAtIndex:0];
squareData.currentTouchPosition = [touch locationInView:self];
if ([touches count]== 2) {
NSArray *twoTouches = [touches allObjects];
UITouch *first = [twoTouches objectAtIndex:0];
UITouch *second = [twoTouches objectAtIndex:1];

// Calculate the distance between the two fingers (touches) to determine the pinch distance
CGFloat currentDistance = distanceBetweenPoints([first locationInView:self], [second locationInView:self]);

squareData.rotstop = YES;
squareData.touchInside = NO;

if (initialDistance == 0.0f)
initialDistance = currentDistance;
if (currentDistance - initialDistance > kMinimumPinchDelta) {
squareData.pinchDistance = currentDistance - initialDistance;
squareData.scalestart = YES;
DEBUGLOG(@"Outward Pinch %.2f", squareData.pinchDistance);
}
else if (initialDistance - currentDistance > kMinimumPinchDelta) {
squareData.pinchDistance = currentDistance - initialDistance;
squareData.scalestart = YES;
DEBUGLOG(@"Inward Pinch %.2f", squareData.pinchDistance);
}
}
else if ([touches count]==[[event touchesForView:self] count] && [[event touchesForView:self] count] == 1) {
if (squareData.touchInside) {
// Only move the square to new position when touchBegan is inside the square
squareData.pos.x = [touch locationInView:self].x;
squareData.pos.y = [touch locationInView:self].y;
DEBUGLOG(@"Square Move to %.2f, %.2f ",squareData.pos.x,squareData.pos.y);
squareData.rotstop = YES;
}
}
}


- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
if ([touches count] == [[event touchesForView:self] count]) {
initialDistance = squareData.pinchDistanceShown = squareData.pinchDistance = 0.0f;
squareData.rotstop = squareData.touchInside = squareData.scalestart = NO;
DEBUGLOG(@"touchesEnded, all fingers up");
}
else {
initialDistance = squareData.pinchDistanceShown = squareData.pinchDistance = 0.0f;
squareData.scalestart = NO;
DEBUGLOG(@"touchesEnded");
}
}


- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
initialDistance = squareData.pinchDistanceShown = squareData.pinchDistance = 0.0f;
squareData.rotstop = squareData.touchInside = squareData.scalestart = NO;
DEBUGLOG(@"touchesCancelled");
}

- (void)setupView { // new method for initialisation of variables and states

// Enable Multi Touch of the view
self.multipleTouchEnabled = YES;

//Configure and start accelerometer
[[UIAccelerometer sharedAccelerometer] setUpdateInterval:(1.0 / kAccelerometerFrequency)];
[[UIAccelerometer sharedAccelerometer] setDelegate:self];
#if TARGET_IPHONE_SIMULATOR
moveX = 2.0f;
moveY = 3.0f;
#else
moveX = 0.0f;
moveY = 0.0f;
#endif

#ifdef DEBUGSCREEN
UIColor *bgColor = [[UIColor alloc] initWithWhite:1.0f alpha:0.0f];
textView = [[UILabel alloc] initWithFrame:CGRectMake(10.0f, 350.0f, 300.0f, 96.0f)];
textView.text = [NSString stringWithFormat:@"-Accelerometer Data-"];
textView.textAlignment = UITextAlignmentLeft;
[textView setNumberOfLines:4];
textView.backgroundColor = bgColor;
[bgColor release]; // the label retains its own reference to the color, so release the one alloc'd above
textView.font = [UIFont fontWithName:@"Arial" size:18];
[self addSubview:textView];
[self bringSubviewToFront:textView];
#endif


// Initialise square data
squareData.rotation = squareData.pinchDistance = squareData.pinchDistanceShown = 0.0f;
ellipseData.rotation = 0.0f;
squareData.scale = 1.0f;
squareData.rotstop = squareData.touchInside = squareData.scalestart = NO;
squareData.pos.x = 160.0f;
squareData.pos.y = 240.0f;
squareData.pinchDistance = 0.0f;
squareData.rotspeed = 1.0f;

// Initialise ellipse data
ellipseData.rotation = 0.0f;
ellipseData.rotstop = ellipseData.touchInside = ellipseData.scalestart = NO;
ellipseData.pos.x = 160.0f;
ellipseData.pos.y = 100.0f;
ellipseData.rotspeed = -4.0f;

// calculate the vertices of ellipse
const GLfloat xradius = 35.0f;
const GLfloat yradius = 25.0f;
for (int i = 0; i < 720; i+=2) {
ellipseVertices[i] = (cos(degreesToRadians(i)) * xradius) + 0.0f;
ellipseVertices[i+1] = (sin(degreesToRadians(i)) * yradius) + 0.0f;
// DEBUGLOG(@"ellipseVertices[v%d] %.1f, %.1f",i, ellipseVertices[i], ellipseVertices[i+1]);
}

// setup the projection matrix
glMatrixMode(GL_PROJECTION);
glLoadIdentity();

// Setup Orthographic Projection for the 320 x 480 of the iPhone screen
glOrthof(0.0f, 320.0f, 480.0f, 0.0f, -1.0f, 1.0f);
glMatrixMode(GL_MODELVIEW);

}

- (void)drawView {

// Define the square vertices
const GLfloat squareVertices[] = {
-20.0f, -20.0f,
20.0f, -20.0f,
-20.0f, 20.0f,
20.0f, 20.0f,
};

// Define the colors of the square vertices
const GLubyte squareColors[] = {
255, 255, 0, 255,
0, 255, 255, 255,
0, 0, 0, 0,
255, 0, 255, 255,
};


// Define the colors of the ellipse vertices
const GLubyte ellipseColors[] = {
233, 85, 85, 255,
233, 85, 85, 255,
233, 85, 85, 255,
233, 85, 85, 255,
233, 85, 85, 255,
};


[EAGLContext setCurrentContext:context];
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
glViewport(0, 0, backingWidth, backingHeight);

// Clear background color
glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);

// draw the square
glLoadIdentity();
glTranslatef(squareData.pos.x, squareData.pos.y, 0.0f);
glRotatef(squareData.rotation, 0.0f, 0.0f, 1.0f);
glScalef(squareData.scale, squareData.scale, 1.0f);
glVertexPointer(2, GL_FLOAT, 0, squareVertices);
glColorPointer(4, GL_UNSIGNED_BYTE, 0, squareColors);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

// draw the ellipse
glLoadIdentity();
glTranslatef(ellipseData.pos.x, ellipseData.pos.y, 0.0f);
glRotatef(ellipseData.rotation, 0.0f, 0.0f, 1.0f);
glVertexPointer(2, GL_FLOAT, 0, ellipseVertices);
glColorPointer(4, GL_UNSIGNED_BYTE, 0, ellipseColors);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glDrawArrays(GL_TRIANGLE_FAN, 0, 360); // the ellipse has 360 vertices

// control the square rotation
if (!squareData.rotstop) {
squareData.rotation += squareData.rotspeed;
if(squareData.rotation > 360.0f)
squareData.rotation -= 360.0f;
else if(squareData.rotation < -360.0f)
squareData.rotation += 360.0f;
}

// control the ellipse rotation
if (!ellipseData.rotstop) {
ellipseData.rotation += ellipseData.rotspeed;
if(ellipseData.rotation > 360.0f)
ellipseData.rotation -= 360.0f;
else if(ellipseData.rotation < -360.0f)
ellipseData.rotation += 360.0f;
}

// control the square scaling
if (squareData.scalestart && squareData.scale <= kMaximumScale) {
GLfloat pinchDelta = squareData.pinchDistance - squareData.pinchDistanceShown;
if (squareData.pinchDistance != 0.0f) {
squareData.scale += pinchDelta/30;
squareData.pinchDistanceShown = squareData.pinchDistance;
if (squareData.scale >= kMaximumScale) {
squareData.scale = kMaximumScale;
squareData.pinchDistanceShown = squareData.pinchDistance = 0.0f;
squareData.scalestart = NO;
} else if (squareData.scale <= 1.0f) {
squareData.scale = 1.0f;
squareData.pinchDistanceShown = squareData.pinchDistance = 0.0f;
squareData.scalestart = NO;
}
DEBUGLOG(@"scale is %.2f",squareData.scale);
}
}

// control the ellipse movement
#if TARGET_IPHONE_SIMULATOR
ellipseData.pos.x += moveX;
if (ellipseData.pos.x >= 290.f) {
moveX = -2.0f;
}
else if (ellipseData.pos.x <= 30.f) {
moveX = 2.0f;
}

ellipseData.pos.y += moveY;
if (ellipseData.pos.y >= 450.f) {
moveY = -1.5f;
}
else if (ellipseData.pos.y <= 55.f) {
moveY = 3.5f;
}
#else
ellipseData.pos.x += moveX;
if (accel[0] > -0.1 && accel[0] < 0.1 ) {
moveX = 0.0f;
}
else {
moveX = 10.0f * accel[0];
}

ellipseData.pos.y += moveY;
if (accel[1] > -0.1 && accel[1] < 0.1 ) {
moveY = 0.0f;
}
else {
moveY = -10.0f * accel[1];
}
#endif
if (ellipseData.pos.x >= 290.f) {
ellipseData.pos.x = 290.0f;
}
else if (ellipseData.pos.x <= 30.f) {
ellipseData.pos.x = 30.0f;
}
if (ellipseData.pos.y >= 450.f) {
ellipseData.pos.y = 450.0f;
}
else if (ellipseData.pos.y <= 55.f) {
ellipseData.pos.y = 55.0f;
}


glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];
}

- (void)accelerometer:(UIAccelerometer*)accelerometer didAccelerate:(UIAcceleration*)acceleration
{
/*
The meaning of acceleration values for firmware 2.x
acceleration.x = Roll. It corresponds to roll, or rotation around the axis that runs from your home button to your earpiece.
Values vary from 1.0 (rolled all the way to the right) to -1.0 (rolled all the way to the left).

acceleration.y = Pitch. Place your iPhone on the table and mentally draw a horizontal line about half-way down the screen.
That's the axis around which the Y value rotates.
Values go from 1.0 (the headphone jack straight down) to -1.0 (the headphone jack straight up).

acceleration.z = Face up/face down.
It refers to whether your iPhone is face up (-1.0) or face down (1.0).
When placed on it side, either the side with the volume controls and ringer switch, or the side directly opposite
, the Z value equates to 0.0.
*/

//Use a basic low-pass filter in the accelerometer values
accel[0] = acceleration.x * kFilteringFactor + accel[0] * (1.0 - kFilteringFactor);
accel[1] = acceleration.y * kFilteringFactor + accel[1] * (1.0 - kFilteringFactor);
accel[2] = acceleration.z * kFilteringFactor + accel[2] * (1.0 - kFilteringFactor);

#ifdef DEBUGSCREEN
textView.text = [NSString stringWithFormat:
@"X (roll, %4.1f%%): %f\nY (pitch %4.1f%%): %f\nZ (%4.1f%%) : %f",
100.0 - (accel[0] + 1.0) * 50.0, accel[0],
100.0 - (accel[1] + 1.0) * 50.0, accel[1],
100.0 - (accel[2] + 1.0) * 50.0, accel[2]
];
#endif
}

- (void)layoutSubviews {
[EAGLContext setCurrentContext:context];
[self destroyFramebuffer];
[self createFramebuffer];
[self drawView];
}


- (BOOL)createFramebuffer {

glGenFramebuffersOES(1, &viewFramebuffer);
glGenRenderbuffersOES(1, &viewRenderbuffer);

glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer*)self.layer];
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);

glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

if (USE_DEPTH_BUFFER) {
glGenRenderbuffersOES(1, &depthRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
}

if(glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES) {
DEBUGLOG(@"failed to make complete framebuffer object %x", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
return NO;
}

return YES;
}


- (void)destroyFramebuffer {

glDeleteFramebuffersOES(1, &viewFramebuffer);
viewFramebuffer = 0;
glDeleteRenderbuffersOES(1, &viewRenderbuffer);
viewRenderbuffer = 0;

if(depthRenderbuffer) {
glDeleteRenderbuffersOES(1, &depthRenderbuffer);
depthRenderbuffer = 0;
}
}


- (void)startAnimation {
self.animationTimer = [NSTimer scheduledTimerWithTimeInterval:animationInterval target:self selector:@selector(drawView) userInfo:nil repeats:YES];
}


- (void)stopAnimation {
self.animationTimer = nil;
}


- (void)setAnimationTimer:(NSTimer *)newTimer {
[animationTimer invalidate];
animationTimer = newTimer;
}


- (void)setAnimationInterval:(NSTimeInterval)interval {

animationInterval = interval;
if (animationTimer) {
[self stopAnimation];
[self startAnimation];
}
}


- (void)dealloc {

[self stopAnimation];

if ([EAGLContext currentContext] == context) {
[EAGLContext setCurrentContext:nil];
}

[context release];
[super dealloc];
}

@end

Source: http://iphonesdkdev.blogspot.com/2009/04/opengl-es-for-iphone-part-3-with.html
Posted by 오늘마감
[cocos2D] Coordinates when a project contains both OpenGL ES and the iPhone framework
Inside the renderScene function, where the OpenGL ES calls (glViewport and the other gl-prefixed functions) are gathered, the x,y coordinate 0,0 for the functions called there follows the mathematical convention: the origin is at the bottom-left. The same is true even for the sprite functions called inside this function.

- (void)renderScene {
    
// If OpenGL has not yet been initialised then go and initialise it
if(!glInitialised) {
[self initOpenGL];
}
    
// Set the current EAGLContext and bind to the framebuffer. This will direct all OGL commands to the
// framebuffer and the associated renderbuffer attachment which is where our scene will be rendered
[EAGLContext setCurrentContext:context];
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    
// Define the viewport. Changing the settings for the viewport can allow you to scale the viewport
// as well as the dimensions etc and so I'm setting it for each frame in case we want to change it
glViewport(0, 0, screenBounds.size.width , screenBounds.size.height);

// Clear the screen. If we are going to draw a background image then this clear is not necessary
// as drawing the background image will destroy the previous image
glClear(GL_COLOR_BUFFER_BIT);

// Setup how the images are to be blended when rendered. This could be changed at different points during your
// render process if you wanted to apply different effects
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);


[font1 drawStringAt:CGPointMake(messageX, 100) text:message];
//[anim renderAtPoint:CGPointMake(70, 100)];
//[animThunder renderAtPoint:CGPointMake(180, 150)];
[anim renderAtPoint:CGPointMake(70, 5)];
[animThunder renderAtPoint:CGPointMake(2, 180)];



// Bind to the renderbuffer and then present this image to the current context
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}

The iPhone (UIKit) API uses a different coordinate system: for the functions called in the method below, the 0,0 origin is at the top-left of the screen and y increases downward.


-(void)BallMoving{




float x = fireBall.center.x;
float y = fireBall.center.y;

fireBall.center = CGPointMake(fireBall.center.x + pos.x, fireBall.center.y + pos.y);
// only reverse the direction
if (fireBall.center.x > 320 || fireBall.center.x < 0){

pos.x = -pos.x;
// squash (deform) the ball


....
}
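To convert a point between the two conventions described above, a small helper like this can be used (my own sketch, not from the original post; viewHeight is the height of the view in points):

// Convert a UIKit point (origin at the top-left, y growing downward) into the
// OpenGL-style coordinate used inside renderScene (origin at the bottom-left, y growing upward)
static inline CGPoint UIPointToGLPoint(CGPoint p, CGFloat viewHeight) {
    return CGPointMake(p.x, viewHeight - p.y);
}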

Posted by 오늘마감
OpenGL ES Programming Guide for iPhone

Ch1. OpenGL ES on the iPhone

OpenGL ES is a client of Core Animation.

Your application creates a UIView subclass with a special Core Animation layer, a CAEAGLLayer.

A CAEAGLLayer object is aware of OpenGL ES and can be used to create rendering targets that act as part of Core Animation.

When your application finishes rendering a frame, you present the contents of the CAEAGLLayer object, where they will be composited with the data from other views.


* categories of functions

- Reading the current state of an OpenGL ES context.

- Changing state variables in an OpenGL ES context.

- Creating, modifying or destroying OpenGL ES objects.

- Submitting geometry to be rendered (rasterized to a framebuffer).


* objects

- texture: image

- buffer: set of memory (vertex data)

- shader

- renderbuffer: normally as a part of a framebuffer

- framebuffer: ultimate destination of the graphics pipeline


* common behavior of objects

- generate an OID: it simply allocates a reference to an object

- bound to OpenGL ES context: The first time you bind to an object identifier, OpenGL ES allocates memory and initializes that object.

- modify the state

- used for rendering

- deleted


On the iPhone, OpenGL ES objects are managed by a sharegroup object.


Two or more rendering contexts can be configured to use the same sharegroup.


Apple does not provide a platform interface for creating framebuffer objects. Instead, all framebuffer objects are created using the OES_framebuffer_object extension.


* framebuffer creating procedure

1. Generate and bind a framebuffer object.

2. Generate, bind and configure an image.

3. Attach the image to the framebuffer.

4. Repeat steps 2 and 3 for other images.

5. Test the framebuffer for completeness.


All implementations of OpenGL ES require some platform-specific code to create a rendering context and to use it to draw to the screen.

EAGL: Embedded Apple OpenGL Extension for MacOS X


* EAGLContextclass

- defines rendering context

- target of OpenGL ES commands

- presents images to Core Animation for display


Every EAGLContext object contains a reference to an EAGLSharegroup object (texture, buffer, framebuffer, shader, ...).


EAGLDrawable protocol: the object can be used to allocate storage for a renderbuffer that can later be presented to the user. Implemented only by the CAEAGLLayer class.
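In practice this is the call that allocates storage for a renderbuffer from the layer. A minimal sketch, assuming a UIView whose layer is a CAEAGLLayer, a current EAGLContext named context, and a renderbuffer name colorRenderbuffer (as in the tutorial code earlier in this post):

glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
// EAGL allocates the renderbuffer storage from the CAEAGLLayer, i.e. the EAGLDrawable
[context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer *)self.layer];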


** OpenGL ES on the iPhone

= OpenGL ES 1.1 (fixed function graphics pipeline)

+ OpenGL ES 2.0 (shader pipeline)


** If your application fails to test the capabilities of OpenGL ES at runtime, it may crash or fail to run.


- v1.1: good baseline behavior for a 3D graphics pipeline (available on all iPhone and iPod touch models)

- v2.0: more flexible; custom vertex and fragment operations can be implemented directly. It is not a superset of 1.1.


EAGLContext* myContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES<1|2>];

If a particular implementation of OpenGL ES is not available, initWithAPI: will return nil. The application must test for nil before using the context, as in the sketch below.
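A minimal sketch of that test, falling back from ES 2.0 to ES 1.1 (variable names are mine):

EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
if (context == nil) {
    // ES 2.0 is not available on this device; fall back to the fixed-function pipeline
    context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
}
if (context == nil || ![EAGLContext setCurrentContext:context]) {
    // neither API could be created; report the error and bail out
}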


Ch2. Determining OpenGL ES Capabilities

Since the capabilities of the context will not change once it has been created, your application can test them once and determine which path it will use (see the sketch after the list of values below).


* common: GL_MAX_TEXTURE_SIZE, GL_DEPTH_BITS, GL_STENCIL_BITS

* 1.1: GL_MAX_TEXTURE_UNITS, GL_MAX_CLIP_PLANES

* 2.0: GL_MAX_VERTEX_ATTRIBS, GL_MAX_VERTEX_UNIFORM_VECTORS, GL_MAX_FRAGMENT_UNIFORM_VECTORS, GL_MAX_VARYING_VECTORS, GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, GL_MAX_TEXTURE_IMAGE_UNITS
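A short sketch of querying one of these limits once, right after the context has been created (the variable name is mine):

GLint maxTextureSize = 0;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTextureSize);
NSLog(@"GL_MAX_TEXTURE_SIZE = %d", maxTextureSize);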


BOOL CheckForExtension(NSString *searchName)

{

    NSString *extensionsString = [NSString stringWithCString:(const char *)glGetString(GL_EXTENSIONS) encoding: NSASCIIStringEncoding];

    NSArray *extensionsNames = [extensionsString componentsSeparatedByString:@" "];

    return [extensionsNames containsObject: searchName];

}


The debug version of your application should call glGetError after every OpenGL ES command.
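One common way to do this without paying the cost in release builds is a macro along these lines (my own sketch; it assumes a DEBUG flag is defined for debug builds only):

#ifdef DEBUG
#define GL_CHECK_ERROR() do { \
    GLenum glErr = glGetError(); \
    if (glErr != GL_NO_ERROR) NSLog(@"glGetError 0x%04x at %s:%d", glErr, __FILE__, __LINE__); \
} while (0)
#else
#define GL_CHECK_ERROR()
#endif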


Ch3. Working with EAGL

// creating EAGL context

EAGLContext* myContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];

[EAGLContext setCurrentContext: myContext];


// create framebuffer object

1. create framebuffer object

2. create target (renderbuffer or texture), allocate storage for the target, attach it to the framebuffer object

3. test framebuffer for completeness


1. Create the framebuffer and bind it so that future OpenGL ES framebuffer commands are directed to it. 

GLuint framebuffer;

glGenFramebuffersOES(1, &framebuffer);

glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);

2. Create a color renderbuffer, allocate storage for it, and attach it to the framebuffer. 

GLuint colorRenderbuffer;

glGenRenderbuffersOES(1, &colorRenderbuffer);

glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);

glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_RGBA8_OES, width, height);

glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, colorRenderbuffer);

3. Perform similar steps to create and attach a depth renderbuffer. 

GLuint depthRenderbuffer;

glGenRenderbuffersOES(1, &depthRenderbuffer);

glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);

glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, width,height);

glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES,GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);

4. Test the framebuffer for completeness. 

GLenum status = glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) ;

if(status != GL_FRAMEBUFFER_COMPLETE_OES) {

    NSLog(@"failed to make complete framebuffer object %x", status);

}


If your CAEAGLLayer object must be blended with other layers, you will see a significant performance penalty. You can reduce this penalty by placing your CAEAGLLayer behind other UIKit layers.
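Independently of layer ordering, keeping the layer itself opaque avoids compositing its contents against what is behind it; the tutorial code earlier in this post already does this in its initWithCoder: method:

CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
eaglLayer.opaque = YES; // an opaque layer is not alpha-blended by Core Animation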


* Sharegroup *

EAGLContext* firstContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];

EAGLContext* secondContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1 sharegroup: [firstContext sharegroup]];


Ch4. Working with Vertex Data

Avoid the use of the OpenGL ES GL_FIXED data type.

- use glDrawArrays to draw your geometry

- use glDrawElements to specify indices for the triangles in your geometry


If the data does not change, use a vertex buffer object (VBO), as sketched below.
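A minimal VBO sketch for static geometry, reusing the ellipseVertices array (720 GLfloats) from the tutorial earlier in this post; the buffer name is mine:

GLuint vertexBuffer = 0;
glGenBuffers(1, &vertexBuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
// upload the vertex data once; GL_STATIC_DRAW hints that it will not change
glBufferData(GL_ARRAY_BUFFER, sizeof(ellipseVertices), ellipseVertices, GL_STATIC_DRAW);

// when drawing: bind the buffer and pass an offset instead of a client-memory pointer
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
glVertexPointer(2, GL_FLOAT, 0, (const GLvoid *)0);
glEnableClientState(GL_VERTEX_ARRAY);
glDrawArrays(GL_TRIANGLE_FAN, 0, 360);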


Ch5. Working with Texture Data

The PowerVR Texture Compression (PVRTC) format is supported by implementing the GL_IMG_texture_compression_pvrtc extension.


Future Apple hardware may not support the PVRTC texture format. You must test for the existence of the compressed-texture extension, for example:
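Using the CheckForExtension helper from Chapter 2 above, that test might look like this (sketch):

BOOL supportsPVRTC = CheckForExtension(@"GL_IMG_texture_compression_pvrtc");
if (!supportsPVRTC) {
    // fall back to an uncompressed format, e.g. RGBA data uploaded with glTexImage2D
}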


Create and load your texture data during initialization.


Binding to a texture changes OpenGL ES state. Avoid unnecessary changes.


One way to avoid changing the texture is to combine multiple smaller textures into a single large texture, known as a texture atlas.


Your application should provide mipmaps for all textures except those being used to draw 2D unscaled images.

The GL_LINEAR_MIPMAP_LINEAR filter mode provides the best quality; GL_LINEAR_MIPMAP_NEAREST gives better performance.
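For OpenGL ES 1.1 the filter mode (and automatic mipmap generation) is set per texture with glTexParameteri. A sketch, assuming the texture is already bound to GL_TEXTURE_2D:

// ask GL to build the mipmap chain when the base level image is uploaded (ES 1.1)
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
// best quality (trilinear); use GL_LINEAR_MIPMAP_NEAREST instead for better performance
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);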


Ch6. Performance

* Redraw Scenes Only when Necessary

: A slower but fixed frame rate (e.g. 30 fps) often appears smoother to the user than a fast but variable frame rate.


* Use Floating Point Arithmetic

: use ARM instruction set but not Thumb


* Disable Unused OpenGL ES features


* Minimize the Number of Draw Calls

: consolidate geometry that is in close spatial proximity


* Memory

- After loading an image into your OpenGL ES texture using glTexImage, you can free the original image.

- Only allocate a depth buffer when your application requires it.

- If your application does not need all of its resources at once, only load a subset of the total resources.


* Avoid Querying OpenGL ES State

: Calls to glGet*(), including glGetError(), may require OpenGL ES to execute all previous commands before retrieving any state variables. This synchronization forces the graphics hardware to run in lockstep with the CPU, reducing opportunities for parallelism. Use glGetError() only in debug builds.


* Avoid Changing OpenGL ES State Unnecessarily


* Drawing order

: Do not waste CPU time sorting objects front to back.

: Sort objects by their opacity (opaque > alpha testing > alpha blended)


App.A. Using Texturetool

ex) Encode Image.png into PVRTC using linear weights and 4 bpp, saving the output as ImageL4.pvrtc and a PNG preview as ImageL4.png


user$ texturetool -e PVRTC --channel-weighting-linear --bits-per-pixel-4 -o ImageL4.pvrtc -p ImageL4.png Image.png


ex) uploading image

void texImage2DPVRTC(GLint level, GLsizei bpp, GLboolean hasAlpha, GLsizei width, GLsizei height, void *pvrtcData)

{

    GLenum format;

    GLsizei size = width * height * bpp / 8;

    if(hasAlpha) {

        format = (bpp == 4) ? GL_COMPRESSED_RGBA_PVRTC_4BPPV1_IMG : GL_COMPRESSED_RGBA_PVRTC_2BPPV1_IMG;

    } else {

        format = (bpp == 4) ? GL_COMPRESSED_RGB_PVRTC_4BPPV1_IMG : GL_COMPRESSED_RGB_PVRTC_2BPPV1_IMG;

    }

    if(size < 32) {

        size = 32;

    }

    glCompressedTexImage2D(GL_TEXTURE_2D, level, format, width, height, 0, size, pvrtcData);

}




Source: http://blog.naver.com/PostView.nhn?blogId=gonagi&logNo=150067227174
Posted by 오늘마감
Creating an OpenGL texture

by nik, 13 May 2009 13:58
- (void)createGLTexture:(GLuint *)texName fromCGImage:(CGImageRef)img

{

 GLubyte *spriteData = NULL;

 CGContextRef spriteContext;

 GLuint imgW, imgH, texW, texH;

 
 imgW = CGImageGetWidth(img);

 imgH = CGImageGetHeight(img);

 
 // Find smallest possible powers of 2 for our texture dimensions

 for (texW = 1; texW < imgW; texW *= 2) ;

 for (texH = 1; texH < imgH; texH *= 2) ;

 
 // Allocated memory needed for the bitmap context

 spriteData = (GLubyte *) calloc(texH, texW * 4);

 // Uses the bitmap creation function provided by the Core Graphics framework.

 spriteContext = CGBitmapContextCreate(spriteData, texW, texH, 8, texW * 4, CGImageGetColorSpace(img), kCGImageAlphaPremultipliedLast);

 
 // Translate and scale the context to draw the image upside-down (conflict in flipped-ness between GL textures and CG contexts)

 CGContextTranslateCTM(spriteContext, 0., texH);

 CGContextScaleCTM(spriteContext, 1., -1.);

 
 // After you create the context, you can draw the sprite image to the context.

 CGContextDrawImage(spriteContext, CGRectMake(0.0, 0.0, imgW, imgH), img);

 // You don't need the context at this point, so you need to release it to avoid memory leaks.

 CGContextRelease(spriteContext);

 
 // Use OpenGL ES to generate a name for the texture.

 glGenTextures(1, texName);

 // Bind the texture name.

 glBindTexture(GL_TEXTURE_2D, *texName);

 // Specify a 2D texture image, providing a pointer to the image data in memory

 glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texW, texH, 0, GL_RGBA, GL_UNSIGNED_BYTE, spriteData);

 // Set the texture parameters to use a linear minification filter (weighted average)

 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

 
 // Enable use of the texture

 glEnable(GL_TEXTURE_2D);

 // Set a blending function to use

 glBlendFunc(GL_SRC_ALPHA,GL_ONE);

 // Enable blending

 glEnable(GL_BLEND);

 
 free(spriteData);

}
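A usage sketch (the resource name and variable names are assumptions, not from the original post): load a UIImage, pass its CGImage to the method above, and delete the texture when it is no longer needed.

GLuint spriteTexture = 0;
UIImage *image = [UIImage imageNamed:@"sprite.png"]; // assumed resource name
[self createGLTexture:&spriteTexture fromCGImage:image.CGImage];

// ... draw with the texture bound ...

glDeleteTextures(1, &spriteTexture); // when the texture is no longer needed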



Source: http://blog.naver.com/PostView.nhn?blogId=amoros21&logNo=140107111154
Posted by 오늘마감