You make an excellent point that using elephant instead of int seems useless. It’s not until you start needing to do slightly more complex things that the usefulness of typedef starts to shine through.
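For reference (assuming your example looked something like the following), the trivial case is just a second name for int, which on its own buys you nothing:

typedef int elephant;        /* 'elephant' is now just another name for int */
elephant trunkLength = 40;   /* exactly the same as: int trunkLength = 40;  */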
Consider this example, from one of Apple’s headers, NSObjCRuntime.h:
#if __LP64__ || (TARGET_OS_EMBEDDED && !TARGET_OS_IPHONE) || TARGET_OS_WIN32 || NS_BUILD_32_LIKE_64
typedef long NSInteger;
typedef unsigned long NSUInteger;
#else
typedef int NSInteger;
typedef unsigned int NSUInteger;
#endif
This code looks at several macros that may or may not have been #define’d in other headers, imported above this snippet. Those macros describe the processor architecture and platform of the machine that you’re building for.
Note, though, that we see this #if / #else / #endif. All of these statements that start with the hash symbol (#) are called preprocessor directives, and we won’t worry about them for now. You should be able to guess what these ones in particular do.
This particular set conditionally typedefs NSInteger to be either a long or an int (and NSUInteger to be either an unsigned long or an unsigned int), depending on the processor architecture you’re compiling for.
This matters because the sizes of the built-in integer types vary between architectures: on Apple’s 64-bit (LP64) platforms a long is 8 bytes, while on 32-bit platforms it is only 4 bytes. The difference can cause all sorts of problems if you’ve used one bare integer type throughout your program and then compile it for a different kind of computer, where that type may no longer be large enough to store the numbers you need.
By conditionally using typedef to make NSInteger be either an int or a long depending on your computer’s architecture, NSInteger becomes a numerical type that you can use safely on any of these architectures, without worrying about whether you might accidentally be compiling for a computer where a plain int isn’t big enough.
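To make that concrete, here is a minimal, self-contained sketch of the same idea (not Apple’s actual header): it keys only on the standard __LP64__ macro and uses a made-up name, MyInteger, so it won’t clash with Foundation’s real NSInteger:

#include <stdio.h>

/* A simplified stand-in for Apple's conditional typedef, keyed only on the
   standard __LP64__ macro: on a 64-bit (LP64) build MyInteger is a long
   (8 bytes); on a 32-bit build it is an int (4 bytes). */
#if __LP64__
typedef long MyInteger;
#else
typedef int MyInteger;
#endif

int main(void) {
    MyInteger count = 42;
    /* The size printed depends on the architecture you compile for,
       but the code that uses MyInteger never has to change. */
    printf("MyInteger is %zu bytes; count = %ld\n",
           sizeof(MyInteger), (long)count);
    return 0;
}

If your toolchain can still produce both 32-bit and 64-bit binaries, compiling this same file both ways will print different sizes even though the source never changes.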
The more common uses of typedef are typically in conjunction with struct and enum declarations, but I wanted to provide this example in the hope that it helps show why typedef can be a useful tool in and of itself.
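To give a flavour of those more common uses, here is a small, hypothetical sketch (the type names are my own invention) showing typedef paired with struct and enum, so the types can be used without repeating the struct or enum keyword at every use site:

#include <stdio.h>

/* typedef'ing an anonymous struct and an anonymous enum gives each one a
   short name that can be used on its own. */
typedef struct {
    double x;
    double y;
} Point;

typedef enum {
    ColorRed,
    ColorGreen,
    ColorBlue
} Color;

int main(void) {
    Point p = { 1.0, 2.0 };   /* no 'struct' keyword needed here */
    Color c = ColorGreen;     /* no 'enum' keyword needed here   */
    printf("p = (%.1f, %.1f), color = %d\n", p.x, p.y, (int)c);
    return 0;
}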