Book on Sale

This promotion has now ended! You can still grab our book at O'Reilly's site.

The newest book (released December 2012) by Jon Manning and Paris Buttfield-Addison, co-founders of Secret Lab, is currently on sale from O'Reilly! Our book, along with a collection of other great iOS-development books, is available in the Build Successful iOS Apps sale.

You can use the discount code WKPRGS6 or visit http://go.secretlab.co/OELh to get 50% off.

Using Gesture Recognizers with Cocos2D and CCNode+SFGestureRecognizers

One of the most interesting and useful features that iOS includes is the gesture recogniser. Gesture recognisers are objects that are attached to views, and look for specific patterns of touches. When a gesture recogniser notices that the user has interacted in the way that it's looking for, it sends an action message to a target object that you provide.

Prior to gesture recognisers, handling complex gestures like pinching or rotation was a lot harder than it had to be. Time was, developers had to manually track the touches involved in a gesture, and measure how they were moving over time; nowadays, we just do this:
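
    // A minimal sketch: create a rotation gesture recogniser, and attach
    // it to the view we want to watch. When the user performs a rotation
    // gesture, our rotated: method (shown below) will be called.
    UIRotationGestureRecognizer *rotationRecognizer =
        [[UIRotationGestureRecognizer alloc] initWithTarget:self
                                                     action:@selector(rotated:)];
    [self.view addGestureRecognizer:rotationRecognizer];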

And then have a method that gets run when the user interacts with the view with a rotation gesture:
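
    // Called as the rotation gesture begins, changes and ends. The
    // recogniser's 'rotation' property is the total rotation, in radians,
    // since the gesture started; here, we simply spin the rotated view.
    - (void)rotated:(UIRotationGestureRecognizer *)recognizer {
        recognizer.view.transform =
            CGAffineTransformMakeRotation(recognizer.rotation);
    }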

Gesture recognisers are one of those APIs that are completely obvious once you think about them, and solve a potentially tricky problem very cleanly. However, gesture recognisers have to work within the bounds of how views in iOS work, which can have some interesting consequences for using them in games.

To put it briefly: how can we use gesture recognisers in OpenGL-based games?

Here’s the problem: gesture recognisers work by being attached to views; when a touch lands on the screen, UIKit determines which view the finger belongs to, and this information is used for tracking gestures.

However, all OpenGL games do their main work using a single view - the OpenGL view in which all rendering takes place. This is true regardless of whether you're drawing complex 3D graphics or simple 2D sprites. This can make gesture recognisers trickier to use, because when a finger lands on the screen, UIKit will say, "hey, the view that was touched was the OpenGL view! Job done, you're welcome, see you later!"

So, if we want gesture recognisers, and we're drawing using a single OpenGL view, what needs to happen is this: gesture recognisers need to be added to the OpenGL view, but limited so that they only look for touches in areas that depend on what's happening in the game.

This is possible through the use of the gestureRecognizer:shouldReceiveTouch: method in the UIGestureRecognizerDelegate protocol. If a delegate implements this method, it’s possible to make a recogniser only track touches in certain areas.
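
For example, a delegate could limit a recogniser to touches that land inside a rectangle that the game updates as objects move around. Here's a sketch of what that might look like; activeArea is a hypothetical property standing in for whatever region your game cares about:

    - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
           shouldReceiveTouch:(UITouch *)touch {
        // Only track touches that land inside the area we care about.
        CGPoint location = [touch locationInView:gestureRecognizer.view];
        return CGRectContainsPoint(self.activeArea, location);
    }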

This is the approach taken by Krzysztof Zabłocki's CCNode+SFGestureRecognizers, a zlib-licensed extension to Cocos2D. CCNode+SFGestureRecognizers performs some very clever hacks, including dynamically creating classes that act as delegates and using the Objective-C runtime's associated object support, to let you add gesture recognisers directly to CCNodes.
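
In practice, attaching a recogniser to a node looks something like this. This is a sketch based on the library's UIView-style API; the sprite image and the itemPanned: action are stand-ins, and you should check the CCNode+SFGestureRecognizers header for the exact method names in the version you're using:

    #import "CCNode+SFGestureRecognizers.h"

    // Attach a pan recogniser directly to a sprite. The category takes care
    // of only delivering touches that actually land on the node.
    CCSprite *item = [CCSprite spriteWithFile:@"item.png"]; // hypothetical asset
    UIPanGestureRecognizer *pan =
        [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(itemPanned:)];
    [item addGestureRecognizer:pan];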

We’re using CCNode+SFGestureRecognizers in Leonardo’s Moon Ship, an adventure game that we’re looking forward to talking about further in the coming weeks and months, to support dragging items from the player’s inventory onto items in the game world.

[Screenshot: leo-dragging.jpg - dragging an item from the player's inventory in Leonardo's Moon Ship]
December Developer Training in Melbourne

This event has now passed! Our next training is in Sydney, February 2013!

We're exceptionally pleased to announce another training course! Join us for three days of intense iOS training in Melbourne, where you'll learn Objective-C and iOS development from the ground up. We'll be running the course from December 14 to 16, in the heart of Melbourne's CBD.

For more information, look no further than this here internet web page site link!