Monday, November 19, 2007

NSNull versus (null)

OK, the book says that an NSNull instance can be used to represent "nothingness," e.g. in an NSMutableArray. But it looks like this will also be the value of an NSDate instance variable that you declare but never assign a value to. At least, its description method returns "(null)."

But then it looks like I was wrong; it's a mere case of homonymy: (null) is not null. Even though my unassigned NSDate instance variable entryDate was described as "(null)" by its description method, it's not an NSNull instance, since

[entryDate isEqual:[NSNull null]]

returns 0. And NSNull has only one instance, so any two nulls should be "equal" to each other. Right?

So what's this "(null)"??! NSDate's documentation doesn't say anything about it, and neither does NSObject's.
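A little experiment suggests an answer, though: Objective-C object variables that you never assign to start out as nil, and NSLog's %@ specifier happens to print nil as "(null)." So my entryDate was never an NSNull; it was plain old nil all along. A quick Foundation tool to see the difference (retain counting style, since that's what I'm learning):

```objc
#import <Foundation/Foundation.h>

int main(void)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    NSDate *entryDate = nil;            // an unassigned ivar starts out like this

    NSLog(@"%@", entryDate);            // prints "(null)" -- this is just nil
    NSLog(@"%@", [NSNull null]);        // prints "<null>" -- the singleton object

    // nil is not the NSNull singleton; a message sent to nil returns 0:
    NSLog(@"%d", [entryDate isEqual:[NSNull null]]);  // 0

    [pool drain];
    return 0;
}
```

Note that the isEqual: line returns 0 not because the objects were compared and found unequal, but because messaging nil always yields 0. That's two mysteries for the price of one.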

initWithFormat: vs. stringWithFormat:

OK, I get it: NSString's initWithFormat: method is an instance method, whereas stringWithFormat: is a class method.

But the book says that the latter returns an autoreleased string object. OK, cool. But why does the documentation for either method or the class fail to mention that?!

Is there a rule of thumb that methods whose names don't begin with alloc, new, copy, or mutableCopy always return autoreleased objects (as Aaron suggests)? If so, how reliable is this rule? It would be nice to know.

I know that this point is mostly moot with garbage collection in Objective-C 2.0, but still. I want to learn the good ol' laborious retain counting way as well (and also, I hear GC isn't suitable in all circumstances).
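For my own future reference, here's how the two styles look side by side under retain counting, assuming Aaron's rule holds (it does seem to match the ownership policy: you own what comes from alloc, new, copy, or mutableCopy, and you must release it; everything else comes back autoreleased):

```objc
#import <Foundation/Foundation.h>

int main(void)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // alloc + initWithFormat: -- I own this object and must release it.
    NSString *owned = [[NSString alloc] initWithFormat:@"%d bottles", 99];
    NSLog(@"%@", owned);
    [owned release];

    // stringWithFormat: -- a convenience constructor; the object comes
    // back autoreleased, and the pool releases it when drained.
    NSString *convenient = [NSString stringWithFormat:@"%d bottles", 99];
    NSLog(@"%@", convenient);

    [pool drain];
    return 0;
}
```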

By the way, I'm at Chapter 3, "Building a Foundation Tool."

Sunday, November 18, 2007

The RandomNumber app

I had already started to read the book before, and liked it a lot. I think I completed the first app it teaches, RandomNumber. So I decided to read the specs and create it without looking in the book. I actually managed to. As I suspected, I had to browse the documentation a lot (especially for the random functions), but in the end, my little app did build.

Let me just say that I'm not fully satisfied with the documentation facilities built into Xcode. I will collect my thoughts as I gain more experience (and probably realize that lots of things I didn't think were there are in fact there), and do a separate post on the subject later. All I know is that, as a beginner, I could use a bit more hand-holding.

So anyway, here's how my version of the app differed from Aaron's.

Mine displayed the random number in a text field, not a label. More importantly, I complicated things a bit by applying the MVC concept to this simple little thingy right away.

While in Aaron's version (note that it's the first-ever app he has his readers build) the random number generation is done in the controller, my version actually has a model class, and the random number is one of its properties. Random number generation (as well as "seeding," or initialization) is performed in the model; all the controller does is call these methods and update the text field with the model's randomNumber property.

I guess this is overkill, but then at least it's MVC. And the model preserves state, too.

Due to my lack of experience, I had no idea how the model object should be created. I decided to have only one instance throughout the life cycle of the app, and I had the controller (which is serialized in the nib, if I'm not mistaken) allocate and initialize it as soon as it wakes up from the nib.

I was a bit surprised when I found out (from sample code) that there's no need to call [super awakeFromNib] when implementing that method, if I'm subclassing NSObject, but then I confirmed that to be the case in the documentation.

I guess I could also serialize my model object in the nib file? Um, maybe not. This doesn't seem to be recommended. So I'll leave it at that.
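From memory, my model and controller ended up looking roughly like this (all names are mine, reconstructed after the fact, so don't mistake this for Aaron's code):

```objc
#import <Cocoa/Cocoa.h>

// The model: owns the random number and the logic for producing it.
@interface RandomModel : NSObject
{
    int randomNumber;
}
- (int)randomNumber;
- (void)seed;
- (void)generate;
@end

@implementation RandomModel
- (int)randomNumber { return randomNumber; }
- (void)seed        { srandom(time(NULL)); }
- (void)generate    { randomNumber = (random() % 100) + 1; }
@end

// The controller lives in the nib; all it does is relay
// between the text field and the model.
@interface RandomController : NSObject
{
    IBOutlet NSTextField *textField;
    RandomModel *model;
}
- (IBAction)seed:(id)sender;
- (IBAction)generate:(id)sender;
@end

@implementation RandomController
- (void)awakeFromNib
{
    // One model instance for the whole life cycle of the app.
    model = [[RandomModel alloc] init];
}
- (IBAction)seed:(id)sender
{
    [model seed];
}
- (IBAction)generate:(id)sender
{
    [model generate];
    [textField setIntValue:[model randomNumber]];
}
@end
```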

Just started "Cocoa Programming for Mac OS X"

This book had been recommended to me in several forums, so I bought it last year, deciding not to wait for a Leopard version. (Is there a Leopard version coming at all? Maybe I'll find out by registering the book, as recommended. OK, I've just registered the book, only to find out that my registration benefits happen to be nada. So, no information on that. Sorry…)

All in all, I like the book, and am very excited as I go in and try to learn Cocoa. Wish me luck.

Sunday, November 11, 2007

Some more bitching: editing application package info

OK, software development is supposed to be geeky, but why is this so incredibly tedious? I mean, I just want to enter the version number for my app. It's something every app needs. So why do I have to type stuff like CFBundleShortVersionString, instead of, say, filling in a text field somewhere?
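For the record, those raw keys live in the app's Info.plist, which is just XML underneath. A fragment might look something like this (the version values are obviously made up):

```xml
<key>CFBundleShortVersionString</key>
<string>1.0</string>
<key>CFBundleVersion</key>
<string>1</string>
```

Still: a text field somewhere would have been nice.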

Is CodeSense dumb? Or am I?

OK, this is weird. I declare a method in the header file for a class. Then I switch over to the implementation file. I want to implement the method… and I need to type out its name. CodeSense doesn't offer to autocomplete it. Is that a feature? Or can't it read the header file?

Then, when I begin to type an expression that sends a message, as in [myObject performMethod], why doesn't CodeSense offer performMethod as a prominent autocompletion option? And when I want to access a property with the dot syntax… No CodeSensing for me. Am I supposed to keep in mind all method and instance variable names I declared, at all times, and get no help?

Houston, we have Leopard

I've put off diving too deep into Cocoa until Leopard was out. I installed the new OS the day it came out; today, I even got the internet working. (I had to buy a router, as Leopard seemed unable to get an IP address from my ISP on its own; needless to say, that made me pretty disappointed and frustrated.)

I read Apple's PDF on Objective-C years ago. I know some C, though I'm not a C programmer by any stretch of the imagination. I've followed some tutorials, even read a couple of Cocoa books, so I'm vaguely familiar with the simplest concepts, but I still find the learning curve that's ahead of me a bit intimidating.

I followed through the CurrencyConverter tutorial, though for some strange reason, the version included with the Leopard dev tools was not updated for the new OS. I couldn't help noticing how different things are now, even at such an elementary level.

For example, I'm no longer supposed to create classes in Interface Builder and then import them into Xcode. Looks like classes are now created solely by writing code in Xcode. I can't say I like that, though real programmers probably rejoice. I guess I need to grow fond of typing everything by hand, even boilerplate code, if I want to be a real programmer.

OK, now that I have internet, I can follow the updated tutorial. I'm going in.

Monday, September 24, 2007

College years: learning some math

In order to get into a college or university in Hungary, one had to pass an entrance exam. (I did also apply to some U.S. colleges. Lawrence University, a small liberal arts college in Canton, New York, did accept me, and even offered a scholarship, but I was still $10,000 short per semester, so I couldn't enroll.)

There was a complicated points system: you could get half of your score from certain high-school grades (up to 60 points), you had to get the second half at the entrance exam (also up to 60), and you could get some extra credits for any state-accredited language examinations (up to 3, I think). If your exam score was good enough, you could choose to double it instead of using your high-school grades.

I applied for a computer science program at a respected university in Budapest. The entrance exam consisted of two parts: a written math test, and an oral test in programming. Each was worth 15 points, which needed to be doubled for a total maximum score of 60. (See above.)

The three-hour math part consisted of eight questions (as it had for decades). The first three questions were dead easy: math entrance exam tacticians recommended that you spend half an hour on them in total. The next three were a bit more difficult; it was good practice to do them in an hour or so. Finally, the last two questions (worth the most points, obviously) would take about an hour and a half, or half of your total allocated time.

We had practiced this a lot. We were professionals, almost. The advanced math class we took in the last year focused on these exams. We did a lot of the tests from the previous years. We almost always scored a respectable 12, 13, later 14 or even 15.

So did I.

But I couldn't sleep on the night before the test. And I was nervous. And I screwed up big time. I got stuck with some stupid typos I made on my second or third question. I wasted a lot of time. As a direct result, my score sucked: 10 points out of 15.

I added up my score quickly. I had 58 points from high school and languages. If I want to meet the previous year's score limit of 103, I need at least 13 points on the oral exam: a new subject for me, of which I'd never had any formal training.

I almost gave up.

In the end, I did go to the exam, though. They asked me what language I had chosen. I said none: I chose "pseudocode." I insisted (perhaps mistakenly) that this was allowed. They gave me a strange look, but conceded.

I drew a question from the pile. I remember it to this day:
An aircraft flies over a territory in a straight line, taking elevation measurements. A measurement of 0 means sea, anything higher than that is ground. Find the highest peak on an island.

An island was defined as ground surrounded by water on both ends. A peak was defined as a point higher than both of its neighbors.

So there I was, standing by a blackboard, chalking up pseudocode that iterated over an array, and the examiners' helpful questions made me realize that I iterated over it one time too many. I streamlined my procedure; it looked fine in the end. Then I had to answer some questions on integer representation. I was pretty clueless about it. Then I had to represent the number of rice grains in the famous legend of the Ambalappuzha Paal Payasam. I struggled a lot, even though I could simply have written out the binary digit "1" sixty-four times. I did arrive at a solution, though, while repeatedly toying with the thought of giving up the whole exam.

To my shock, my examiners awarded me a near-perfect score of 14, as there seemed to be "order in my head." They stopped short of welcoming me on board, though my final score of 106 would have granted me entry in any previous year of the program's existence.

Well, except that year. The limit was raised to 108. I desperately looked for any program that would take me, and it turned out I was able to start my studies at a different, much less competitive branch of the same university: the college branch for training primary school teachers. I already had the necessary score there before even taking my oral math exam; I just had to be careful not to score below −8. (Actually, I did need some minimum score, like 5, because of other rules, but you get my point about the less-than-competitive entrance process to that place.) My majors were math and English. I wanted to transfer to the computer science program, but wasn't allowed to.

Primary school, eh? I thought we'd learn addition in the first year, maybe subtraction in the second.

Wrong.

We had four semesters of some surprisingly tough real analysis. We also had university-level geometry, with an infamous professor who would reduce the class of 120 to about 30 by the end of the first year.

But what I learned was this: one needs to study math. Do the homework. Practice. And be sure to understand everything. Ask questions. Don't be ashamed. Not everyone is a natural.

I remember asking one of my classmates, back in high school: okay, this looks like the equation of a circle, but how do I know that it really is the equation of a circle? He said I should be more like an artist and not concern myself with problems like these, as they clearly went over my head. Gee, thanks.

Except that at college, we learned a theorem (complete with proof, naturally) that settles exactly this: such an equation describes a real circle if the squared radius comes out positive; a "point circle" (i.e. a circle consisting of one point only) if it is zero; and an imaginary circle if it is negative.
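In modern notation, the theorem amounts to completing the square (I no longer have my notes, so the coefficient names D, E, F are just the conventional ones):

```latex
% General second-degree equation of a (potential) circle:
%   x^2 + y^2 + Dx + Ey + F = 0
% Completing the square in x and y gives
\[
\Bigl(x + \tfrac{D}{2}\Bigr)^{2} + \Bigl(y + \tfrac{E}{2}\Bigr)^{2}
  \;=\; \frac{D^{2}}{4} + \frac{E^{2}}{4} - F \;=\; r^{2}
\]
% r^2 > 0: a real circle of radius r
% r^2 = 0: a "point circle" (a single point)
% r^2 < 0: an imaginary circle (no real points at all)
```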

I prefer explicitly understanding things.

My favorite math subject was elementary math. It dealt with tricky, sometimes extremely difficult problems that could be solved by elementary means. That is, a very, very, very smart sixth grader could solve them, and thus so should his or her teacher. Makes sense.

I remember a trick question I was really proud of solving:
Point A is 100 miles from Point B.
A train starts from A to B, traveling at 15 miles per hour.
A train starts, at the same moment, from B to A, at 35 miles per hour.

At the same moment, a fly starts flying from the nose of the train from Point A at 50 miles per hour, to the nose of the other train. As soon as it reaches it, it starts back, at the same speed. And so on.

The two trains meet (crash?) somewhere between A and B. The fly keeps flying between them up to that point. How much distance does the fly cover in total?

At universities, students would sum an infinite series or the like. We used the four basic operations.

So I ended up actually liking math. Better late than never!

My parents got me a 386 when I started college. It had 4 megabytes of RAM, ran Windows 3.1, and had trouble running most games that I wanted to run. I spent about as much time fixing it as using it: getting Windows to recognize its sound and video cards, dealing with infected floppies, squeezing apps into the memory available, and so on.

I don't remember exactly why and how, but I got a copy of Borland's Turbo Pascal IDE. I don't even remember the terminology used by the system, but what I remember is this: I decided to use the coordinate geometry knowledge I'd gained during the first year of college, and do something with it.

I came up with a library of functions that could describe points and vectors in 3D space, as well as simple polyhedrons (basically just a collection of corners and the edges connecting them). The main goal was to project these onto a 2D surface (represented by the monitor screen), using a perspective vector, and try to mimic human vision by calculating the images captured by both eyes (with the help of blue-red goggles).

I had a summer job, and carried a little checkered notebook with me at all times, thinking about this problem. I did eventually solve all the questions I hoped I would, and I did code up the Pascal library I wanted to.

However, I stopped short of tackling invisible edges or filled polygons. My interest wasn't strong enough to lead me down those difficult paths.

Wednesday, May 9, 2007

Hello

Hi everyone, let me explain what this whole thing is. Oh, and sorry about the name of the blog. It was 2 a.m. when I came up with it, for crying out loud. (Besides, note how I managed to smuggle in both "Xcode" and the "My Whatever" naming convention... It's actually growing on me.)

So. I've decided to learn developing desktop applications for Mac OS X using Cocoa, Objective-C, and Apple's development tools, notably Xcode and Interface Builder. Then it occurred to me that I should maybe blog about it as well.

I'm going to be tracking my own progress this way, reminding myself where I am, and simply taking notes of things I should remember later. As there's nothing secret or personal about the whole learning process, I thought, why not make it public? Perhaps some helpful people will offer some help. Perhaps some people will learn from my mistakes, who knows.

So, read on! You'll be in for a ride.

Tuesday, May 8, 2007

Some background: the first 18 years

I'm 32, and I've never had any formal training in computer science, apart from some elementary introductory course at college.

I'm professionally trained as a primary school teacher in mathematics and English, though I've yet to graduate. Most of my adult life, I've worked in publishing, mostly in desktop publishing (DTP), doing layout, design, production and project management.

I'm not a native speaker of English, so if my posts sound strange, that may be the main reason. (Either that, or the fact that I'm weird. You decide.)

So, why try to learn programming at such an advanced age? Do I stand a chance of getting anywhere? OK, while I was never formally trained, I've always been involved with computing. Here's my story. (Beware. It's long.)

I've always been fascinated by computers. As a child, I had a Sinclair ZX Spectrum, and I was amazed by its BASIC interpreter.

(Having a Commodore 64 was considered much cooler, incidentally, and I soon found myself on the losing side of a platform war. However, the BASIC interpreter of the Commodore was way inferior, lending me a sense of justified snobbery in using the Sinclair. It also gave me an early reason, back in 1986, to hate the company that had written the Commodore BASIC, long before that little software firm called Microsoft became somewhat more famous. Twenty-one years later, I'm cheering yet another underdog that also happens to be competing against computers running software from Microsoft. Some things never change.)

I took a stab at writing some very elementary games using Sinclair BASIC. One-character hero shoots one-character missiles at one-character villain that goes "beep-beep-beep." That kind of game. I had to borrow machine code snippets from various sources for things like scrolling routines. One game called "cavern-ufo," which featured a flying saucer passing through an endless cave of stalactites and stalagmites, actually used pixel graphics, and one of my friends once actually played it! He said it was cool.

I tried to make some of my games look like "professional" titles, with loaders: screens that the program would display while the rest of the game loaded (loading a huge 40-kilobyte game could take 3 to 5 minutes from a cassette player back then). But I was a pre-teen, and it showed. Believe me when I say that the games were really no good.

I also designed some logos for my own fictitious software company, whose activity was limited to working furiously on its own corporate image, including company names, logos, and even corporate fonts! I was particularly fond of an animated logo that featured a galloping unicorn. I had to write a screen capture program (in LaserBASIC, one of the popular programming utilities for the Spectrum, beside the competing Beta BASIC) specifically for the purpose of capturing the various phases of the animation. I planned to use the capture utility for my "real" projects as well, which, however, never progressed past the initial brainstorming phase. But at least I learned the difference between an interpreter and a compiler.

You can tell that, at such an early age, I was already fascinated with all the wrong aspects of software development: the corporate image, the software tools, and the marketing fluff. As far as actual production: sorry, folks. Nothing to announce.

That's when I had my first brush with DTP (no pun indented... I mean intended). I came across a bootlegged copy of a program called Art Studio. Upon loading, it displayed a menu bar across the top of the screen (something I'd never seen before and had no idea what to make of), and a strange arrow in the middle. Then it froze. Or so I thought. But then I discovered, by accident, that the traditional keys for "Up, Down, Left, Right, Shoot," i.e. Q, A, O, P, and M, worked: they moved the arrow (a pointer!) and let you select a menu, which would scroll down, and... You get it. Art Studio brought a bit of the Mac experience to the ZX Spectrum: a drawing program that would let you draw various shapes on the screen, including text, and even let you design bitmapped fonts (not that there were other kinds of fonts for the Spectrum). I used Art Studio to do most of my design "work." Shades of things to come! (Of course, it wasn't until another ten years or so that I first learned what a Mac was.)

I also tried to learn machine code and assembly, as all real game development happened there. But that thing just went over my head. I could never wrap my head around loading numbers into registers and accumulators. I needed the abstraction of a more human-readable (and writable) programming language. I think I still do.

High-school years were a bit of a stagnant period for my computing career. I learned English furiously, and realized that I sucked at math. No, actually: I used to be the math whiz at primary school, and never spent a minute studying math; I just didn't need to. Then, as I tried to apply the same work ethic at high school, somehow I wasn't seeing the same results. I wasn't getting the (equivalent of) A grades, but rather some Cs and Ds. So I decided I wasn't a genius after all, and stopped caring. Only later did I realize that my classmates had one secret advantage over me, something I was shocked to discover and started to apply only when it was too late: they actually did their homework every once in a while. It helps.

While at high school, I learned how to use PCs (running then-state-of-the-art Windows 3.1), though neither my parents nor I could afford one. Then I got involved with one of the student newspapers some of my classmates produced, as a cartoonist and later as a designer... This was at a time when PCs (at least the PCs we had access to) were still pretty damn slow and underpowered for DTP. Printing a simple headline might have taken us ten minutes. Read that sentence again. Man, do I feel old!

Back then, cutting and pasting meant scissors and glue. That's how we produced our newspaper, mostly using typewriters for the body text, while we looked with envy at some of our competitors who had already started to use word processors. The bastards!

One important thing I realized, in contrast to my general suckiness at math, was that I had a strange affinity for what I can only describe as "coding." Here's what I mean. While my classmates kept routinely kicking my ass at math (they all wanted to go to business school and took the entrance exam pretty seriously), there was one math lesson that turned out to be a dialog between the teacher and me, with nobody else ever saying a word. That lesson focused on playful problems of encoding information.

What would you do if you had to transmit a large number to someone, using only digits, and the nine key broke off your keypad? One of my solutions (besides switching to a different numeral system): use double-digit codes. 00 would stand for 0, 01 for 1, 02 for 2, and so on. The digit 9 could be represented, by mutual agreement, by anything that doesn't start with a 0, like 11. This answer came very fast and easily to me, while the problem seemed totally alien to everyone else.

What if you have to transmit several numbers, and no separator character is working? Well, easy: use the double-digit numbers 00, 01, 02 and so on, with maybe 99 being used as the separator.

The toughest, final question was this: what if only one key, say, the 1, is working on your keypad? I was the only one to answer that as well: write down the numbers you have to transmit, using the double-digit encoding and the 99 separator. That entire sequence of digits can be thought of as one single (very long) number. Let's call it x. Then just transmit the digit 1, x times*.

The teacher's solution would have involved a product of prime numbers raised to different powers. I haven't looked at it, but I have a feeling that mine would involve fewer clicks of the "1" button in most cases.

All in all, at that moment, I realized that while I may not be a math genius, at least parts of my brain are wired in a way that helps me come up with creative ways of encoding information. This skill might even have a name; I don't know. What I know is that it has certainly come in handy on a few occasions in my life. I'll tell you about some later.

Stay tuned for more of my life story. Coming up: the college years, with some actual Pascal programming, and a lot of firsts: my first PC, my first DTP job, and maybe my first encounter with a Mac. I'll spare you the story of my first love, though. (I mean, loving a human being. It's quite different from a man's deep love for his development environment.)

*Of course, you'll lose the initial zero this way. However, that will never cause a problem. Just think about it.