Why is day not a pointer?


#1

On page 83 where we’re calculating the day of the month we’re using day for the return value from ordinalityOfUnit:inUnit:forDate: but day is not declared as a pointer. I see from the docs that ordinalityOfUnit returns an (NSUInteger) not an (NSUInteger *). I’m struggling to understand why this is. What is day’s type? Can anyone help?


#2

day is an unsigned integer (an NSUInteger). So, not an object, but a C primitive type.


#3

Ok. I see that now in the Quick Help inspector. I had assumed NSUInteger was a class. It seems a bit odd to have the same naming convention as classes, or did I miss a clue somewhere? Or is it always necessary to determine for yourself what kind of thing you are working with? Xcode certainly makes it easy to do that. I just hadn’t clicked on NSUInteger.


#4

I’d be interested in that question as well. I was surprised the first time I chased NSUInteger to discover it was a typedef to a primitive type.

I did some digging, and while I haven’t found any indication of a naming convention yet, there don’t seem to be many of them. It’s possible that NSInteger and NSUInteger are the only two such types. See http://www.cocoadev.com/index.pl?NSUInteger for a discussion/debate over their utility.

Not directly related, but another link I stumbled upon that has some useful information on handling primitive types: http://www.informit.com/articles/article.aspx?p=1681071


#5

A framework can include functions, typedefs, macros, classes, and constants. In Foundation, all of these things start with an NS.

So, for example, NSTimeInterval is a typedef for double.

And NSUInteger is a typedef for unsigned long (32 bits wide on 32-bit architectures, 64 bits wide on 64-bit architectures).
And NSInteger is a typedef for long.

Do I think this was a great idea? No. But it is not uncommon for a library to make typedefs for standard types (OpenGL, for example, declares GLfloat). It might give you some extra bit of platform independence, but most of the time, I think it just confuses people.

I think Apple should have just used long and unsigned long.


#6

Looking at the docs for NSInteger and NSUInteger, it seems as if Apple is doing this to make the 64-bit transition smoother. Rather than think about what type of primitive you need, you just use NS* and don’t worry about it.


#7

Yes, that was the goal, but long would have worked just as well: it is 64-bit on 64-bit machines and 32-bit on 32-bit machines, just like NSInteger.


#8

[quote=“AaronHillegass”]A framework can include functions, typedefs, macros, classes, and constants. In Foundation, all of these things start with an NS.

So, for example, NSTimeInterval is a typedef for double.

And NSUInteger is a typedef for unsigned long (32 bits wide on 32-bit architectures, 64 bits wide on 64-bit architectures).
And NSInteger is a typedef for long.

Do I think this was a great idea? No. But it is not uncommon for a library to make typedefs for standard types (OpenGL, for example, declares GLfloat). It might give you some extra bit of platform independence, but most of the time, I think it just confuses people.

I think Apple should have just used long and unsigned long.[/quote]

This information would be a valuable addition to the next edition of the book. As a beginner, I spent quite a bit of time trying to chase down the reason and your explanation cleared some things up almost instantly. Thanks.


#9

Never assume while programming; always read the reference documentation. In Xcode, it is only one click away.


#10

So this would work just as well:

I tried it, and it seems to. But it’s probably better to use the NS versions of everything. In some cases you get class functionality, and in a rare few others you don’t?