We've put a quick blog post about rotating an iOS UIView in 3D on the O'Reilly Programming Blog. Check it out!
This weekend we were at PyCon Australia 2013 in Hobart! Once again, it was very exciting to have a technology conference in our city. So exciting, in fact, that we sponsored the conference WiFi network. You're welcome! (We attempted to troll our friend --- conference coordinator Chris Neugebauer --- by setting the WiFi password to AskChris, but it backfired in that everyone asked us, rather than Chris. Better luck next time!) Our friends from the Australian Computer Society (Tasmania) also sponsored the conference.
Highlights of the conference were the three keynotes: Alex Gaynor, Mark Pesce, and Tennessee Leeuwenburg. Alex spoke about the nature of software engineering, and its relationship to art and science. Mark, during a brilliant dinner keynote, spoke about the Internet of Things, and his new venture: MooresCloud. Tennessee discussed the use of tools for problem solving and communication. All three keynotes, like the sessions, were highly enjoyable, and we recommend watching them as they become available on PyCon Australia's YouTube channel.
Exceptional conference coffee was again supplied by Ritual Coffee Tasmania. Although we didn't sponsor the coffee this year, it was as tasty as last year --- we can't wait to get a bag of beans to enjoy in the office.
Lightning talks were also highly enjoyable, as was CodeWars --- a devious and irreverent coding competition devised by our friend and frequent collaborator Josh Deprez. This year, Paris from Secret Lab and another friend and frequent collaborator, Tim 'McJones' Nugent, hosted the event. We're sure that we thoroughly confused all those who participated. It was pleasing to inspire a room full of conference-goers to chant "Bovril! Bovril! Bovril!"
We spent a lot of time hacking on the Holiday by MooresCloud, a fabulous set of very open and very hackable Christmas lights. Over the course of a few hours of coding (and harassment of Mark), Secret Lab and friends managed to build a script that makes the Holiday respond to sound levels, a CPU activity display, a one-dimensional Game of Life-style game, and a native iOS interface for the Holiday. It was all great fun, and we highly recommend that you order a Holiday while it's a little cheaper (they ship in November); it really is one of the coolest, most fun and inventive gadgets that we've seen in a while.
During the lightning talks at the end of the conference, the hacks that we'd all made for the Holiday were also showcased. Very cool! The other lightning talks were all highly enjoyable as well.
Our friend Frank Sainsbury was the life of the conference (as usual), delivering a surprisingly meaningful lightning talk on helping others, as well as (also as usual) entertaining everyone with his wigs and assorted props.
Next year, PyCon Australia will be in Brisbane. We're looking forward to it! All our photos from PyCon Australia this year can be found on Flickr. Congratulations to Chris and his team for another successful world-class technology conference.
Also, huge thanks to Rex Smeal for his fantastic artwork for our conference poster.
Over the weekend, the Secret Lab team participated in GovHack 2013. Together with our friends and colleagues, we formed a team and built a project using government data over the 48-hour event.
The GovHack team included Secret Lab’s Jon and Paris, as well as frequent collaborators and friends Tim “McJones” Nugent, Nic “Winton” Wittison, Josh Deprez, Matthew D’Orazio, and Rex Smeal, as well as our friends Frank Sainsbury, Sebastian Cook, and Eloise “Ducky” Macdonald-Meyer. Together, the team built a digital card game called Marvellous Ultimate Appliance, based around Energy Australia’s appliance efficiency and power consumption data.
The game was designed to help raise awareness of the energy usage and efficiencies of common household appliances. We had an absolute blast making it, and can’t wait for next year’s GovHack!
You can learn more about our 2-day game, Marvellous Ultimate Appliance, at http://admiraldolphin.github.io. The game was built in Unity, and runs on the web, Mac, Windows, and Linux. We plan to expand on it, with the team, in the near future. Thank you to The Typewriter Factory, our Hobart venue, and the local organisers, particularly Richard Tubb and Casey Farrell. The food was great, the venue was brilliant, and the constant coffee was exceptional!
Special congratulations to our team member, Frank Sainsbury, who won the Spirit of GovHack award for Tasmania. Likewise, thank you to Pia Waugh for organising the event nationally.
You can watch our team's video below.
[youtube=http://www.youtube.com/watch?v=bLppE0mx5DE&w=500] You can view all of our photos from GovHack 2013 in Tasmania on Flickr. Don't forget to check out our project, as well as all the other brilliant projects on the GovHack Hackerspace.
We're very pleased to be presenting at O'Reilly's OSCON conference in Portland once again this year. We're involved in two tutorials this time around! For the first time at OSCON, Jon Manning and Paris Buttfield-Addison will be presenting a half-day tutorial on game design, where they'll discuss what makes games fun, how they work, and how you can apply game design techniques to your daily, non-game-related work. The tutorial is hands-on, very practical, lots of fun, and it's called How Do I Game Design?
For the third year in a row, Chris Neugebauer, along with Jon and Paris, will be presenting a half-day tutorial on mobile application development with a focus on user experience. As with the last two years, we'll be using Android as the platform we discuss the most – but everything will be applicable to all mobile platforms. The tutorial is called Level Up Your Apps: Mobile UX Design and Development.
We hope to see you at OSCON!
One of the most interesting and useful features that iOS includes is the gesture recogniser. Gesture recognisers are objects that are attached to views, and look for specific patterns of touches. When a gesture recogniser notices that the user has interacted in the way that it’s looking for, it sends a message to a target object that you provide.
Prior to gesture recognisers, handling complex gestures like pinching or rotation was a lot harder than it had to be. Time was, developers had to manually track the touches involved in a gesture, and measure how they were moving over time; nowadays, we just do this:
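(A minimal sketch of the idea; myView and the rotated: action are placeholder names, not anything from a particular project.)

```objc
// Create a rotation gesture recogniser that calls our rotated: method,
// and attach it to the view we want to watch. 'myView' and 'rotated:'
// are placeholder names.
UIRotationGestureRecognizer *rotationRecognizer =
    [[UIRotationGestureRecognizer alloc] initWithTarget:self
                                                 action:@selector(rotated:)];
[self.myView addGestureRecognizer:rotationRecognizer];
```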
And then have a method that gets run when the user interacts with the view with a rotation gesture:
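(Again, a sketch; a common pattern is to apply the rotation that the recogniser reports, then reset it, so the view keeps tracking the fingers as they move.)

```objc
- (void)rotated:(UIRotationGestureRecognizer *)recognizer {
    // Apply the rotation (in radians) reported since the last call...
    recognizer.view.transform = CGAffineTransformRotate(recognizer.view.transform,
                                                        recognizer.rotation);
    // ...and reset it, so the next call only reports the additional rotation.
    recognizer.rotation = 0;
}
```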
Gesture recognisers are one of those APIs that are completely obvious once you think about them, and solve a potentially tricky problem very cleanly. However, gesture recognisers have to work within the bounds of how views in iOS work, which can have some interesting consequences for using them in games.
To put it briefly: how can we use gesture recognisers in OpenGL-based games?
Here’s the problem: gesture recognisers work by being attached to views; when a touch lands on the screen, UIKit determines which view the finger belongs to, and this information is used for tracking gestures.
However, all OpenGL games do their main work using a single view - the OpenGL view in which all rendering takes place. This is true regardless of whether you’re drawing complex 3D graphics or simple 2D sprites. And this can mean that gesture recognisers are trickier to do, because when the finger lands on the screen, UIKit will say, “hey, the view that was touched was the OpenGL view! Job done, you’re welcome, see you later!”
So, if we want gesture recognisers, and we’re drawing using a single OpenGL view, what needs to happen is this: gesture recognisers need to be added to the OpenGL view, but limited so that they only look for touches in areas that depend on what’s happening in the game.
This is possible through the use of the gestureRecognizer:shouldReceiveTouch: method in the UIGestureRecognizerDelegate protocol. If a delegate implements this method, it’s possible to make a recogniser only track touches in certain areas.
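For example, a delegate could do something like the following. (This is a sketch; touchableAreaForCurrentGameState is a hypothetical helper that returns whatever rectangle the game currently wants to respond to.)

```objc
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch {
    // Work out where the touch landed, in the OpenGL view's coordinate space.
    CGPoint location = [touch locationInView:gestureRecognizer.view];

    // Only let this recogniser track the touch if it falls inside the area
    // that's currently relevant to the game. 'touchableAreaForCurrentGameState'
    // is a hypothetical method; replace it with your own game-state logic.
    return CGRectContainsPoint([self touchableAreaForCurrentGameState], location);
}
```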
This is the approach taken by Krzysztof Zabłocki’s CCNode+SFGestureRecognizers, a zlib-licensed extension to Cocos2D. CCNode+SFGestureRecognizers performs some very clever hacks (including dynamically creating classes that act as delegates, and using the Objective-C runtime’s associated object support) to let you add gesture recognisers directly to CCNodes.
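In practice, using it looks much like working with UIViews. (A sketch, assuming the category's addGestureRecognizer: method on CCNode, which mirrors UIView's own; the sprite, image, and selector names are placeholders.)

```objc
#import "CCNode+SFGestureRecognizers.h"

// Attach a pan recogniser directly to a Cocos2D node, using the category's
// addGestureRecognizer: method. The sprite, image file, and action selector
// here are placeholder names.
CCSprite *inventoryItem = [CCSprite spriteWithFile:@"item.png"];
UIPanGestureRecognizer *pan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(itemDragged:)];
[inventoryItem addGestureRecognizer:pan];
```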
We’re using CCNode+SFGestureRecognizers in Leonardo’s Moon Ship, an adventure game that we’re looking forward to talking about further in the coming weeks and months, to support dragging items from the player’s inventory onto items in the game world.
This event has now passed! Our next training is in Sydney, February 2013!
We're exceptionally pleased to announce another training course! Join us for three days of intense iOS training in Melbourne, where you'll learn Objective-C and iOS development from the ground up. We'll be running the course from December 14 to 16, in the heart of Melbourne's CBD.
For more information, look no further than this here internet web page site link!