While at OSCON this year, Jon and I were lucky enough to be interviewed by our editor, Rachel, for O'Reilly Radar. Find the interview embedded below, or check it out on O'Reilly. Our upcoming book, Learning Cocoa with Objective-C Third Edition, can be preordered now if you're interested! It's going to be amazing.
At OSCON this year, @chrisjrn, @desplesda and I presented a 3-hour tutorial on "Android-Fu: Awesome Apps for Ice Cream Sandwich and Beyond". We had a great turnout, and it was a lot of fun to present at OSCON for the second year in a row (last year we presented the well-received Android for people who hate phones, a similar topic in a similar style).
You can also find the code from the activity on GitHub, as well as the final binary (APK) for installation on your Android 4.0+ device.
We hope you enjoyed it! Please get in touch if you have any questions or feedback!
This was a fun one. As we've mentioned before, we're currently working on a board game for a client. Without going into the details, the game is based on nodes on a game board that are linked together; players move between connected nodes, and claim territory.
When coding up the game, we initially generated the game boards in memory, by directly instantiating 'node' and 'link' objects. Then, when it came time to make the game load different boards on demand, we created a JSON-based data structure that defines the nodes and links. It looks a little something like this:
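(A rough sketch with illustrative key names, rather than our exact format:)

```json
{
    "nodes": [
        { "name": "A", "x": 120, "y": 80 },
        { "name": "B", "x": 240, "y": 160 }
    ],
    "links": [
        { "from": "A", "to": "B" }
    ]
}
```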
This structure defines two nodes, including their names and positions, as well as a relationship that links the two together. (Because the graph is nondirectional, we didn't want to define links in the node data structure itself.)
This is all well and good for simple game boards, like this:
But then the client said, "cool, can you make this?"
And we were all:
Clearly, hand-coding wasn't an option, and a tool was needed. Writing one ourselves wouldn't have been the best use of our time, so we looked at our favourite Mac app ever: OmniGraffle Pro. OmniGraffle already knows about objects and connections, and it exposes all of this information through AppleScript.
So, we went searching, found this excellent Gist by Michael Bianco, and adapted it into one that extracts the information we need and generates the game board data that we care about.
Loosely put, it turns an OmniGraffle document into a level that can be loaded by the game.
How it works
First, we designed the level. In this board game, there are only nodes and connections; we represented nodes as circles, and connected the nodes with lines.
The first thing the script does is get access to OmniGraffle via the Scripting Bridge:
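Here's a rough sketch of that step, using the rb-appscript gem (the original script may use a different Ruby-to-AppleScript bridge, and the application name varies between OmniGraffle versions):

```ruby
require 'rubygems'
require 'appscript'

# Drive OmniGraffle's AppleScript dictionary from Ruby.
graffle = Appscript.app('OmniGraffle Professional 5')

# Work with the canvas shown in the frontmost window.
canvas = graffle.windows[1].canvas
```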
OmniGraffle exposes a list of shapes, which AppleScript can query for important information such as a shape's origin point and its text. You can also query and set a special "tag" property for each shape, which is useful for storing an identifier. We base the identifier on the text of the shape, if it has any; otherwise, we use a random number.
So, to generate the list of game nodes, we ask OmniGraffle for all of the shapes, turn each one into a hash, and store those hashes in an array for later.
When generating the hash for a node, we can also make use of the user data dictionary that OmniGraffle Pro exposes. This lets you set custom key-value pairs for a shape, which is very useful for setting things like which player owns a node, or at what point in the game the node becomes active. This is a simple matter of merging in the userData hash.
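Put together, the node-building pass looks something like this (the hash keys and the random-identifier scheme are illustrative):

```ruby
nodes = []

canvas.shapes.get.each do |shape|
  # Base the identifier on the shape's text, falling back to a random number.
  text = (shape.text.get.to_s rescue '')
  tag  = text.empty? ? "node-#{rand(100_000)}" : text

  # Store the identifier back on the shape so the connection pass can find it.
  shape.tag.set(tag)

  # The shape's position on the canvas.
  x, y = shape.origin.get
  node = { 'id' => tag, 'name' => text, 'x' => x, 'y' => y }

  # Merge in OmniGraffle Pro's custom key-value data (owner, activation turn,
  # and so on). How it comes back depends on the bridge, so treat anything
  # unexpected as "no user data".
  user_data = (shape.user_data.get rescue nil)
  node.merge!(user_data) if user_data.is_a?(Hash)

  nodes << node
end
```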
Once all nodes have been processed, we know that all shapes have had a tag associated with them; we can then iterate over all shapes a second time, this time generating information for each connection.
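In sketch form, assuming each line exposes the shapes at its two ends as source and destination:

```ruby
# Every line on the canvas becomes a link between the tags of the shapes it connects.
links = canvas.lines.get.map do |line|
  {
    'from' => line.source.get.tag.get,
    'to'   => line.destination.get.tag.get
  }
end
```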
Finally, we export the nodes and links as JSON:
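(Again, a sketch with illustrative key names.)

```ruby
require 'json'

# Bundle the two arrays up and print them to stdout.
puts JSON.pretty_generate({ 'nodes' => nodes, 'links' => links })
```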
Because this script operates on the frontmost open document in OmniGraffle and writes to stdout, saving the JSON is as simple as a one-line command:

$ ruby graffle2board.rb > MyAwesomeBoard.board
Summary
This is a pretty powerful technique, since it lets us design the game maps with a capable (and, more importantly, pre-existing) tool and import them exactly the way we want them. We're definitely going to be using this more in the future.
You can see the script on GitHub!
We recently needed to add a tutorial to a board game (see SLTutorialController), and realised that we needed a way to highlight various controls and other user interface elements that the user should interact with next. A common way that this is handled is by making things glow, often with an animation. So, we wrote UIView+Glow. It's a very simple category that adds two methods: startGlowing and stopGlowing. When you call startGlowing, the view will start to pulse with a soft light; this effect is removed when stopGlowing is called.
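The interface is just those two methods; the sketch below shows one simple way to get the effect by pulsing the view's layer shadow (the category we shipped may implement the glow differently):

```objc
// UIView+Glow.h
#import <UIKit/UIKit.h>

@interface UIView (Glow)
- (void)startGlowing;
- (void)stopGlowing;
@end

// UIView+Glow.m
#import "UIView+Glow.h"
#import <QuartzCore/QuartzCore.h>

@implementation UIView (Glow)

- (void)startGlowing {
    // Give the layer a soft white shadow to act as the glow.
    self.layer.shadowColor = [UIColor whiteColor].CGColor;
    self.layer.shadowOffset = CGSizeZero;
    self.layer.shadowRadius = 10.0f;

    // Pulse the shadow's opacity until -stopGlowing is called.
    CABasicAnimation *pulse = [CABasicAnimation animationWithKeyPath:@"shadowOpacity"];
    pulse.fromValue = @0.0f;
    pulse.toValue = @0.8f;
    pulse.duration = 1.0;
    pulse.autoreverses = YES;
    pulse.repeatCount = HUGE_VALF;
    [self.layer addAnimation:pulse forKey:@"glow"];
}

- (void)stopGlowing {
    [self.layer removeAnimationForKey:@"glow"];
    self.layer.shadowOpacity = 0.0f;
}

@end
```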
You can see a video of it in action here:
It's available on our GitHub now!
Here's a common use case we run across. We're building a really nice custom control, and it's got a reasonably complex view hierarchy. We want to be able to have multiple copies of it, plus keep the complexity of the higher-level UI down, so we store it in a nib.
The only way you can get stuff out of a nib is to use the UINib API to load the nib, get the loaded object, and then start using it. This means writing code. We want to keep the amount of code in the app down, and programmatically generating interfaces sucks. What we want is to just insert a UIView placeholder into our view controller, and have it be replaced with the full control at runtime.
So, how can we do this?
Some previous work on this topic was done by Yang Meyer, who figured out that you can override the -awakeAfterUsingCoder: method to switch out the placeholder view with a view loaded from a nib at load-time. However, this method doesn't play nice with ARC.
We came up with a solution that we quite like. It's easy to understand, allows us to keep the control's UI in a separate nib, and also allows us to simply insert an empty UIView into our view controllers (and not clutter them up).
We used this technique in the development of SLNumberPickerView.
The Technique
First, design the interface for your class in a separate nib. We find it helpful to write a class method that loads and returns the object:
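(A sketch: the method name, the nib name, and the assumption that the view is the nib's only top-level object are all illustrative.)

```objc
// In SLNumberPickerView.m
+ (SLNumberPickerView *)numberPickerView {
    NSArray *topLevelObjects = [[NSBundle mainBundle] loadNibNamed:@"SLNumberPickerView"
                                                             owner:nil
                                                           options:nil];
    return [topLevelObjects objectAtIndex:0];
}
```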
Next, we override -awakeAfterUsingCoder: to check to see if self is a placeholder view or the real view that was loaded from the external nib. We determine this based on how many subviews we have - if it's zero, then we're the empty placeholder view.
If we figure out that self is the placeholder, we load a new instance of the view, and then add it as a subview of the placeholder. We also keep a reference to this internal view, and forward any relevant messages to it.
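Here's a sketch of how that looks for SLNumberPickerView, using the class method above and an illustrative realView property to hold the reference to the nib-loaded view:

```objc
// In SLNumberPickerView.m
@interface SLNumberPickerView ()
// The real, nib-loaded view that the placeholder wraps and forwards messages to.
@property (nonatomic, strong) SLNumberPickerView *realView;
@end

@implementation SLNumberPickerView

- (id)awakeAfterUsingCoder:(NSCoder *)aDecoder {
    // The empty placeholder dropped into a view controller's nib has no
    // subviews; the fully built view loaded from our own nib does.
    if (self.subviews.count == 0) {
        SLNumberPickerView *realView = [SLNumberPickerView numberPickerView];
        realView.frame = self.bounds;
        realView.autoresizingMask = UIViewAutoresizingFlexibleWidth |
                                    UIViewAutoresizingFlexibleHeight;
        [self addSubview:realView];

        // Keep a reference so accessors and other relevant messages can be
        // forwarded to the real view.
        self.realView = realView;
    }
    return self;
}

@end
```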
There's one drawback to this technique: because we're inserting the real view inside the placeholder view, we're keeping an extra instance of the class around (though none of its subviews are loaded, so not too much additional memory is allocated).
You can see an example of this technique in action in SLNumberPickerView, available on our GitHub.
Another keynote has come and gone, and this one was one of the most emotionally charged and content-packed in recent memory. One of the things that struck me from CEO Tim Cook's opening and closing remarks was the sheer amount of emotion in his voice when he began describing how the iPad and iPhone have changed people's lives for the better. When you combine this with some other key features of both iOS and OS X that Apple heavily pushed at developers today, you can start to see a common theme emerging - Apple wants to continue the tradition of making the world a better place, in their eyes. When Steve Jobs was CEO, this meant making the world a better place in his eyes, and doing so under his personal criteria: better design, ease of use, and making computers fit into their place in users' lives. Tim Cook's approach, however, appears much more heavily focused on globally beneficial improvements. Almost ten minutes was spent talking about how VoiceOver helps the blind, how the iPad and AirPlay improve teaching, and how apps like Airbnb help people connect.
The common theme here is helping, and it's easy to see what Apple - or at least Cook - sees as the purpose of the devices they sell. They should help people with something.
After the keynote, Apple began the first of their confidential sessions. While I can't relate details, they spent significant time on the new accessibility features and hammered home the point that accessibility is important. There are more and more users, and this means more and more people to help - and more and more ways in which software can improve their lives.
That's my justification for spending half our funds on new MacBook Pros, anyway.