Creating an iPhone Application

Published on December 2016



At a high level, the process for creating an iPhone application is similar to that for creating a Mac OS X application. Both use the same tools and many of the same basic libraries. Despite the similarities, there are also significant differences. An iPhone is not a desktop computer; it has a different purpose and requires a very different design approach. That approach needs to take advantage of the strengths of iOS and forgo features that might be irrelevant or impractical in a mobile environment. The smaller size of the iPhone and iPod touch screens also means that your application's user interface should be well organized and always focused on the information the user needs most.

iOS lets users interact with iPhone and iPod touch devices in ways that you cannot interact with desktop applications. The Multi-Touch interface reports on each separate finger that touches the screen, making it possible to handle multifinger gestures and other complex input easily. In addition, built-in hardware features such as the accelerometers, although present in some desktop systems, are used more extensively in iOS to track the screen's current orientation and adjust your content accordingly. Understanding how you can use these features in your applications will help you focus on a design that is right for your users.

The best way to understand the design of an iPhone application is to look at an example. This article takes you on a tour of the MoveMe sample application. This sample demonstrates many of the typical behaviors of an iPhone application, including:

- Initializing the application
- Displaying a window
- Drawing custom content
- Handling touch events
- Performing animations

Figure 1 shows the interface for this application. Touching the Welcome button triggers an animation that causes the button to pulse and center itself under your finger. As you drag your finger around the screen, the button follows your finger. Lift your finger from the screen and, using another animation, the button snaps back to its original location. Double-tapping anywhere outside the button changes the language of the button's greeting.

Figure 1 The MoveMe application window

Before reading the other sections of this article, you should download the sample (MoveMe) so that you can follow along directly in the source code. You should also have already read the following orientation pages in the iOS Dev Center to get a basic understanding of iOS and the tools and language you use for development:

- iOS Overview
- Tools for iOS Development

If you are not familiar with the Objective-C programming language, you should also have read Learning Objective-C: A Primer to familiarize yourself with the basic syntax of Objective-C.

Examining the MoveMe Sample Project
Downloading the MoveMe sample provides you with the source code and support files needed to build and run the application. You manage projects for iOS using the Xcode application (located in /Developer/Applications by default). Each Xcode project window combines a workspace for gathering your code and resource files, build rules for compiling your source and assembling your application, and tools for editing and debugging your code.

Figure 2 shows the Xcode project window for the MoveMe application. To open this project, copy it to your local hard drive and double-click the MoveMe.xcodeproj file to open it. (You can also open the project from within Xcode by selecting File > Open and choosing the file.) The project includes several Objective-C source files (denoted by the .m extension), some image files and other resources, and a predefined target (MoveMe) for building the application bundle.

Figure 2 The MoveMe project window

In iOS, the ultimate target of your Xcode project is an application bundle, which is a special type of directory that houses your application's binary executable and supporting resource files. Bundles in iOS have a relatively flat directory structure, with most files residing at the top level of the bundle directory. However, a bundle may also contain subdirectories to store localized versions of strings and other language-specific resource files. You do not need to know the exact structure of the application bundle for the purposes of this article, but you can find that information in "Build-Time Configuration Details" in iOS Application Programming Guide if you are interested in it.

Building the MoveMe Application
To build the MoveMe application and run it in the simulator, do the following:

1. Open the MoveMe.xcodeproj file in Xcode.
2. In the project toolbar, make sure the simulator option is selected in the Active SDK menu. (If the Active SDK menu does not appear in the toolbar, choose Project > Set Active SDK > Simulator.)
3. Select Build > Build and Go (Run) from the menu, or simply click the Build and Go button in the toolbar.

When the application finishes building, Xcode loads it into the iOS Simulator and launches it. Using your mouse, you can click the Welcome button and drag it around the screen to see the application's behavior. If you have a device configured for development, you can also build your application and run it on that device. For information about how to configure devices for development and load applications, see iOS Development Guide.

A Word About Memory Management
iOS is primarily an object-oriented system, so most of the memory you allocate is in the form of Objective-C objects. iOS uses a reference counting scheme to know when it is safe to free up the memory occupied by an object. When you first create an object, it starts off with a reference count of 1. Clients receiving that object can opt to retain it, thereby incrementing its reference count by 1. If a client retains an object, the client must also release that object when it is no longer needed. Releasing an object decrements its reference count by 1. When an object's reference count equals 0, the system automatically reclaims the memory for the object.

Note: iOS does not support memory management using the garbage collection feature that is in Mac OS X v10.5 and later.

If you want to allocate generic blocks of memory (that is, memory not associated with an object), you can do so using the standard malloc library of calls. As is the case with any memory you allocate using malloc, you are responsible for releasing that memory when you are done with it by calling the free function. The system does not release malloc-based blocks for you.

Regardless of how you allocate memory, managing your overall memory usage is important. Although iOS has a virtual memory system, it does not use a swap file. This means that code pages can be flushed as needed but your application's data must all fit into memory at the same time. The system monitors the overall amount of free memory and does what it can to give your application the memory it needs. If memory usage becomes too critical though, the system may terminate your application. However, this option is used only as a last resort, to ensure that the system has enough memory to perform critical operations such as receiving phone calls.

For more information about how to allocate objects in iOS, see Cocoa Fundamentals Guide. For information and tips on how to improve your application's memory usage, see "Using Memory Efficiently" in iOS Application Programming Guide.

Initializing the MoveMe Application
As is true for every C-based application, the initial entry point for every iPhone application is a function called main. The good news is that, when you create a new project using the iPhone templates in Xcode, you do not have to write this function yourself. The project templates include a version of this function with all the code needed to start your application.

Listing 1 shows the main function for the MoveMe application. The main function is located in that project's main.m file. Every application you create will have a main function that is almost identical to this one. This function performs two key tasks. First, it creates the application's top-level autorelease pool used by the memory management reference counting system. Second, it calls the UIApplicationMain function to create the MoveMe application's key objects, initialize those objects, and start the event-processing loop. The application does not return from this function until it quits.

Listing 1 Using the provided main function
int main(int argc, char *argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    int retVal = UIApplicationMain(argc, argv, nil, nil);
    [pool release];
    return retVal;
}

Defining the Application Delegate
One of the most important architectural details of your project is defining the application delegate object, which is instantiated from a class you provide in your project. The application delegate class in the MoveMe project declares its interface in MoveMeAppDelegate.h and defines its implementation in MoveMeAppDelegate.m. Once you have added these files to the project, you can use Interface Builder to designate an instance of the class as the application delegate.

Interface Builder is a visual tool that you use to create and arrange views in a window, set up view hierarchies, configure each view's options, and establish relationships between the views and the other objects of your application. Because it is a visual tool, you perform all of these tasks by dragging components around a window surface. The result is an interactive version of your interface that you can see immediately and change in seconds. Interface Builder saves your user interface in a file known as a nib file, which is an archive of your application's object graph.

To launch Interface Builder and see how the application delegate object's role is defined, double-click the MainWindow.xib file (under MoveMe > Resources) in the Groups & Files pane of the Xcode project window. MainWindow.xib is the nib file that contains your application's window and defines the relationships among several important objects in your application, including the application delegate. To see how the application delegate relationship is established, click the File's Owner icon in the nib file document window (titled "MainWindow.xib"), show the Inspector window (choose Tools > Inspector), and click the Inspector window's Application Connections tab. As shown in Figure 3, the Inspector shows that the File's Owner object (which represents the application in the nib file) has a delegate outlet connected to the MoveMeAppDelegate object.

Figure 3 The application delegate

The application delegate object works in tandem with the standard UIApplication object to respond to changing conditions in the application. The application object does most of the heavy lifting, but the delegate is responsible for several key behaviors, including the following:

- Setting up the application's window and initial user interface
- Performing any additional initialization tasks needed for your custom data engine
- Opening content associated with the application's custom URL schemes
- Responding to changes in the orientation of the device
- Handling low-memory warnings
- Handling system requests to quit the application

At launch time, the most immediate concern for the delegate object is to set up and present the application window to the user, which is described in "Creating the Application Window". The delegate should also perform any tasks needed to prepare your application for immediate use, such as restoring the application to a previous state or creating any required objects. When the application quits, the delegate needs to perform an orderly shutdown of the application and save any state information needed for the next launch cycle. For more information about the fundamental architecture and life cycle of an iPhone application, see "Core Application Architecture" in iOS Application Programming Guide.

Creating the Application Window
Every application is responsible for creating a window that spans the entire screen and for filling that window with content. Graphical applications running in iOS do not run side-by-side with other applications. In fact, other than the kernel and a few low-level system daemons, your application is the only thing running after it is launched. What's more, your application should never need more than one window, an instance of the UIWindow class. In situations where you need to change your user interface, you change the views displayed by your window.

Windows provide the drawing surface for your user interface, but view objects provide the actual content. A view object is an instance of the UIView class that draws some content and responds to interactions with that content. iOS defines standard views to represent things such as tables, buttons, text fields, and other types of interactive controls. You can add any of these views to your window, or you can define custom views by subclassing UIView and implementing some custom drawing and event-handling code. The MoveMe application defines two such views, represented by the MoveMeView and PlacardView classes, to display the application's interface and handle user interactions.

At launch time, the goal is to create the application window and display some initial content as quickly as possible. The window is unarchived from the MainWindow.xib nib file. When the application reaches a state where it is launched and ready to start processing events, the UIApplication object sends the delegate an applicationDidFinishLaunching: message. This message is the delegate's cue to put content in its window and perform any other initialization the application might require. In the MoveMe application, the delegate's applicationDidFinishLaunching: method does the following:

1. It creates a view controller object whose job is to manage the content view of the window.
2. It initializes the view controller with an instance of the MoveMeView class, which is stored in the MoveMeView.xib nib file, to act as the background view and fill the entire window frame.
3. It adds the controller's view as a subview of the window.
4. It shows the window.

Listing 2 shows the applicationDidFinishLaunching: method for the MoveMe application, which is defined in the application delegate's implementation file, MoveMeAppDelegate.m. This method creates the main content view for the window and makes the window visible. Showing the window lets the system know that your application is ready to begin handling events.

Listing 2 Creating the content view
- (void)applicationDidFinishLaunching:(UIApplication *)application {
    // Set up the view controller
    UIViewController *aViewController = [[UIViewController alloc]
            initWithNibName:@"MoveMeView" bundle:[NSBundle mainBundle]];
    self.viewController = aViewController;
    [aViewController release];

    // Add the view controller's view as a subview of the window
    UIView *controllersView = [viewController view];
    [window addSubview:controllersView];
    [window makeKeyAndVisible];
}

Note: You can use the applicationDidFinishLaunching: method to perform other tasks besides setting up your application's user interface. Many applications use it to initialize required data structures, read any user preferences, or return the application to the state it was in when it last quit.

Although the preceding code creates the window's background view and then shows the window, what you do not see in the preceding code is the creation of the PlacardView class that displays the Welcome button. That behavior is handled by the setUpPlacardView method of the MoveMeView class, which is invoked from the initWithCoder: method that is called when the MoveMeView object is unarchived from its nib file. The setUpPlacardView method is shown in Listing 3. Part of the initialization of this view includes the creation of a PlacardView object. Because the MoveMeView class provides the background for the entire application, it adds the PlacardView object as a subview. The relationship between the two views not only causes the Welcome button to be displayed on top of the application's background, it also allows the MoveMeView class to handle events that are targeted at the button.

Listing 3 Creating the placard view
- (void)setUpPlacardView {
    // Create the placard view -- it calculates its own frame based on its image.
    PlacardView *aPlacardView = [[PlacardView alloc] init];
    self.placardView = aPlacardView;
    [aPlacardView release];
    placardView.center = self.center;
    [self addSubview:placardView];
}

For detailed information about creating windows and views, see "What Are Windows and Views?" in iOS Application Programming Guide.

Drawing the Welcome Button
You can use standard views provided by UIKit without modification to draw many types of simple content. For example, you can use the UIImageView class to display images and the UILabel class to display text strings. The MoveMeView class in the MoveMe application also takes advantage of a basic property of all UIView objects, specifically the backgroundColor property, to fill the view with a solid color. This property can be set in code in the view object's initialization method. In this case, the property is set when MoveMeView is created in the MoveMeView.xib nib file, using a color well in the Attributes tab of the Inspector window of Interface Builder.

When you need to draw content dynamically, however, you must use the more advanced drawing features found in UIKit or you must use Quartz or OpenGL ES. The PlacardView class in the MoveMe application draws the Welcome button and manages its location on the screen. Although the PlacardView class could draw its content using an embedded UIImageView and UILabel object, it instead draws the content explicitly, to demonstrate the overall process. As a result, this class implements a drawRect: method, which is where all custom drawing for a view takes place. By the time a view's drawRect: method is called, the drawing environment is configured and ready to go. All you have to do is specify the drawing commands to draw any custom content.

In the PlacardView class, the content consists of a background image (stored in the Placard.png resource file) and a custom string, the text for which can change dynamically. To draw this content, the class takes the following steps:

1. Draw the background image at the view's current origin. (Because the view is already sized to fit the image, this step provides the entire button background.)
2. Compute the position of the welcome string so that it is centered in the button. (Because the string size can change, the position needs to be computed each time based on the current string size.)
3. Set the drawing color to black.
4. Draw the string in black, and slightly offset.
5. Set the drawing color to white.
6. Draw the string again in white at its intended location.

Listing 4 shows the drawRect: method for the PlacardView class. The placardImage member variable contains a UIImage object with the background for the button and the currentDisplayString member variable is an NSString object containing the welcome string. After drawing the image, this method calculates the position of the string within the view. The size of the string is already known, having been calculated when the string was loaded and stored in the textSize member variable. The string is then drawn twice, once in black and once in white, using the drawAtPoint:forWidth:withFont:fontSize:lineBreakMode:baselineAdjustment: method of NSString.

Listing 4 Drawing the Welcome button
- (void)drawRect:(CGRect)rect {
    // Draw the placard at 0, 0
    [placardImage drawAtPoint:(CGPointMake(0.0, 0.0))];

    /*
     Draw the current display string.
     This could be done using a UILabel, but this serves to illustrate
     the UIKit extensions to NSString. The text is drawn in the center
     of the view twice - first slightly offset in black, then in white --
     to give an embossed appearance. The size of the font and text are
     calculated in setupNextDisplayString.
     */

    // Find point at which to draw the string so it will be in the center of the view
    CGFloat x = self.bounds.size.width/2 - textSize.width/2;
    CGFloat y = self.bounds.size.height/2 - textSize.height/2;
    CGPoint point;

    // Get the font of the appropriate size
    UIFont *font = [UIFont systemFontOfSize:fontSize];

    [[UIColor blackColor] set];
    point = CGPointMake(x, y + 0.5);
    [currentDisplayString drawAtPoint:point
            forWidth:(self.bounds.size.width-STRING_INDENT)
            withFont:font
            fontSize:fontSize
            lineBreakMode:UILineBreakModeMiddleTruncation
            baselineAdjustment:UIBaselineAdjustmentAlignBaselines];

    [[UIColor whiteColor] set];
    point = CGPointMake(x, y);
    [currentDisplayString drawAtPoint:point
            forWidth:(self.bounds.size.width-STRING_INDENT)
            withFont:font
            fontSize:fontSize
            lineBreakMode:UILineBreakModeMiddleTruncation
            baselineAdjustment:UIBaselineAdjustmentAlignBaselines];
}

When you need to draw content that is more complex than images and strings, you can use Quartz or OpenGL ES. Quartz works with UIKit to handle the drawing of vector-based paths, images, gradients, PDF, and other complex content that you want to create dynamically. Because Quartz and UIKit are based on the same drawing environment, you can call Quartz functions directly from the drawRect: method of your view and even mix and match Quartz calls through the use of UIKit classes. OpenGL ES is an alternative to Quartz and UIKit that lets you render 2D and 3D content using a set of functions that resemble (but are not exactly like) those found in OpenGL for Mac OS X. Unlike Quartz and UIKit, you do not use your view's drawRect: method to do your drawing. You still use a view, but you use that view object primarily to provide the drawing surface for your OpenGL ES code. How often you update the drawing surface, and which objects you use to do so, are your decision. For detailed information about each of the drawing technologies and how you use them, see "Supporting High-Resolution Screens" in iOS Application Programming Guide.

Handling Touch Events
The Multi-Touch interface in iOS makes it possible for your application to recognize and respond to distinct events generated by multiple fingers touching the device. The ability to respond to multiple fingers offers considerable power but represents a significant departure from the way traditional, mouse-based event-handling systems operate. As each finger touches the surface of the device, the touch sensor generates a new touch event. As each finger moves, additional touch events are generated to indicate the finger's new position. When a finger loses contact with the device surface, the system delivers yet another touch event to indicate that fact. Because there may be multiple fingers touching the device at one time, it is possible for you to use those events to identify complex user gestures. The system provides some help in detecting common gestures such as swipes, but you are responsible for detecting more complex gestures.

When the event system generates a new touch event, it includes information about the current state of each finger that is either touching or was just removed from the surface of the device. Because each event object contains information about all active touches, you can monitor the actions of each finger with the arrival of each new event. You can then track the movements of each finger from event to event to detect gestures, which you can apply to the contents of your application. For example, if the events indicate the user is performing a pinch-close or pinch-open gesture (as shown in Figure 4) and the underlying view supports magnification, you could use those events to change the current zoom level.

Figure 4 Using touch events to detect gestures

The system delivers events to the application's responder objects, which are instances of the UIResponder class. In an iPhone application, your application's views form the bulk of your custom responder objects. The MoveMe application implements two view classes, but only the MoveMeView class actually responds to event messages. This class detects taps both inside and outside the bounds of the Welcome button by overriding the following methods of UIResponder:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;

To simplify its own event-handling behavior, the MoveMe application tracks only the first finger to touch the surface of the device. It does this with the support of the UIView class, which disables multi-touch events by default. For applications that do not need to track multiple fingers, this feature is a great convenience. When multi-touch events are disabled, the system delivers events only related to the first finger to touch the device. Events related to additional touches in a sequence are never delivered to the view. If you want the information for those additional touches, however, you can reenable multi-touch support using the setMultipleTouchEnabled: method of the UIView class.

As part of its event-handling behavior, the MoveMeView class performs the following steps:

1. When a touch first arrives, it checks to see where the event occurred.
   - Double-taps outside the Welcome button update the string displayed by the button.
   - Single taps inside the button center the button underneath the finger and trigger an initial animation to enlarge the button.
   - All other touches are ignored.
2. If the finger moves and is inside the button, the button's position is updated to match the new position of the finger.
3. If the finger was inside the button and then lifts off the surface of the device, an animation moves the button back to its original position.

Listing 5 shows the touchesBegan:withEvent: method for the MoveMeView class. The system calls this method when a finger first touches the device. This method gets the set of all touches and extracts the one and only touch object from it. The information in the UITouch object is used to identify in which view the touch occurred (the MoveMeView object or the PlacardView object) and the number of taps associated with the touch. If the touch represents a double tap outside the button, the touchesBegan:withEvent: method calls the setupNextDisplayString method to change the welcome string of the button. If the event occurred inside the Welcome button, it uses the animateFirstTouchAtPoint: method to grow the button and track it to the touch location. All other touch-related events are ignored. Listing 5 Handling an initial touch event
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // We only support single touches, so anyObject
    // retrieves just that touch from touches
    UITouch *touch = [touches anyObject];

    // Only move the placard view if the touch was in the placard view
    if ([touch view] != placardView) {
        // In case of a double tap outside the placard view,
        // update the placard's display string
        if ([touch tapCount] == 2) {
            [placardView setupNextDisplayString];
        }
        return;
    }

    // Animate the first touch
    CGPoint touchPoint = [touch locationInView:self];
    [self animateFirstTouchAtPoint:touchPoint];
}

Listing 6 shows the touchesMoved:withEvent: method of the MoveMeView class. The system calls this method after the finger has touched the device and in response to it moving from its original location. The MoveMe application tracks only those movements that occur within the Welcome button. As a result, this method checks the location of the event and uses it to adjust the center point of the PlacardView object. The movement of the view causes it to be redrawn at the new location automatically.

Listing 6 Responding to movement from a touch
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];

    // If the touch was in the placardView, move the placardView
    // to its location
    if ([touch view] == placardView) {
        CGPoint location = [touch locationInView:self];
        placardView.center = location;
        return;
    }
}

When the user's finger finally lifts from the screen, the MoveMe application responds by triggering an animation to move the button back to its starting position in the center of the application's window. Listing 7 shows the touchesEnded:withEvent: method that initiates the animation.

Listing 7 Releasing the Welcome button
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];

    // If the touch was in the placardView, bounce it back to the center
    if ([touch view] == placardView) {
        // Disable user interaction so subsequent touches
        // don't interfere with animation
        self.userInteractionEnabled = NO;
        [self animatePlacardViewToCenter];
        return;
    }
}

To simplify the event handling process for the application, the touchesEnded:withEvent: method disables touch events for the view temporarily while the button animates back to its original position. If it did not do this, each of the event-handling methods would need to include logic to determine whether the button was in the middle of an animation and, if so, cancel the animation. Disabling user interactions for the short time it takes the button to travel back to the center of the screen simplifies the event handling code and eliminates the need for the extra logic. Upon reaching its original position, the animationDidStop:finished: method of the MoveMeView class reenables user interactions so that the event cycle can begin all over again.

If the application is interrupted for some reason (for example, by an incoming phone call), the view is sent a touchesCancelled:withEvent: message. In this situation, the application should try to do as little work as possible to avoid competing for device resources. In the example implementation, the placard view's center and transformation are simply set to their original values.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    placardView.center = self.center;
    placardView.transform = CGAffineTransformIdentity;
}

For more information on handling events in iOS, see "Event Handling" in iOS Application Programming Guide.

Animating the Button's Movement
In iPhone applications, animation plays a very important role. Animation is used extensively to provide the user with contextual information and immediate feedback. For example, when the user navigates hierarchical data in a productivity application, rather than just replace one screen with another, iPhone applications animate the movement of each new screen into place. The direction of movement indicates whether the user is moving up or down in the hierarchy and also provides a visual cue that there is new information to look at.

Because of its importance, support for animation is built into the classes of UIKit already. The MoveMe application takes advantage of this support by using it to animate the different aspects of the Welcome button. When the user first touches the button, the application applies an animation that causes the size of the button to grow briefly. When the user lets go of the button, another animation snaps it back to its original position. The basic steps for creating these animations are essentially the same:

1. Call the beginAnimations:context: method of the view you want to animate.
2. Configure the animation properties.
3. Call the commitAnimations method of the view to begin the animation.

Listing 8 shows the animation code used to pulse the Welcome button when it is first touched. This method sets the duration of the animation and then applies a transform to the button that scales it to its new size. When this animation completes, the animation infrastructure calls the growAnimationDidStop:finished:context: method of the animation delegate, which completes the pulse animation by shrinking the button slightly and moving the placard view under the touch.

Listing 8 Animating the Welcome button
- (void)animateFirstTouchAtPoint:(CGPoint)touchPoint {
#define GROW_ANIMATION_DURATION_SECONDS 0.15
    NSValue *touchPointValue = [[NSValue valueWithCGPoint:touchPoint] retain];
    [UIView beginAnimations:nil context:touchPointValue];
    [UIView setAnimationDuration:GROW_ANIMATION_DURATION_SECONDS];
    [UIView setAnimationDelegate:self];
    [UIView setAnimationDidStopSelector:@selector(growAnimationDidStop:finished:context:)];
    CGAffineTransform transform = CGAffineTransformMakeScale(1.2, 1.2);
    placardView.transform = transform;
    [UIView commitAnimations];
}

- (void)growAnimationDidStop:(NSString *)animationID finished:(NSNumber *)finished context:(void *)context {
#define MOVE_ANIMATION_DURATION_SECONDS 0.15
    [UIView beginAnimations:nil context:NULL];
    [UIView setAnimationDuration:MOVE_ANIMATION_DURATION_SECONDS];
    placardView.transform = CGAffineTransformMakeScale(1.1, 1.1);

    // Move the placard view under the touch.
    NSValue *touchPointValue = (NSValue *)context;
    placardView.center = [touchPointValue CGPointValue];
    [touchPointValue release];
    [UIView commitAnimations];
}

For more information about using the built-in view-based animations, see "Animating Views" in iOS Application Programming Guide. For more information about Core Animation, see "Applying Core Animation Effects" in iOS Application Programming Guide.

Finishing the Application
In the preceding sections, you saw how the MoveMe application was initialized, presented its user interface, and responded to events. In addition to those aspects of creating the application, there are smaller details to consider before building it and loading it onto a device.

One of the final pieces to put in place is your application's information property-list (Info.plist) file. It is an XML file that communicates basic information about your application to the system. Xcode creates a default version of this file for you and inserts your application's initial configuration information into it. You can extend this information, however, to provide additional details about your application that the system should know. For example, you would use this file to communicate information about your application version, any custom URL schemes it supports, its launch image, and the default visibility status and style of the system status bar.

Listing 9 shows the contents of the Info.plist file for the MoveMe application. This file identifies the name of the executable, the image file to display on the user's Home screen, and the string that identifies the application uniquely to the system. Because the MoveMe application is a full-screen application (in other words, it does not display the status bar), it also includes the UIStatusBarHidden key and assigns it the value true. Setting this key to true lets the system know that it should not display the application status bar at launch time or while the application is running. Although the MoveMe application could configure this same behavior programmatically, that behavior would not take effect until after the application had already launched, which might look odd.

Listing 9  The contents of the Info.plist file
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "">
<plist version="1.0">
<dict>
    <key>CFBundleDevelopmentRegion</key>
    <string>en</string>
    <key>CFBundleDisplayName</key>
    <string>${PRODUCT_NAME}</string>
    <key>CFBundleExecutable</key>
    <string>${EXECUTABLE_NAME}</string>
    <key>CFBundleIconFile</key>
    <string>Icon.png</string>
    <key>CFBundleIdentifier</key>
    <string>com.yourcompany.${PRODUCT_NAME:identifier}</string>
    <key>CFBundleInfoDictionaryVersion</key>
    <string>6.0</string>
    <key>CFBundleName</key>
    <string>${PRODUCT_NAME}</string>
    <key>CFBundlePackageType</key>
    <string>APPL</string>
    <key>CFBundleSignature</key>
    <string>????</string>
    <key>CFBundleVersion</key>
    <string>1.0</string>
    <key>UIStatusBarHidden</key>
    <true/>
    <key>NSMainNibFile</key>
    <string>MainWindow</string>
</dict>
</plist>

Note: You can edit the contents of your application's Info.plist file using TextEdit, which displays the XML contents of the file as shown in Listing 9, or the Property List Editor, which displays the file's keys and values in a table. Xcode also provides access to some of these attributes in the information window for your application target. To view this window, select your application target (in the Targets group) and choose File > Get Info. The Properties tab contains some (but not all) of the properties in the Info.plist file. For information about configuring your application's Info.plist file, see "The Information Property List" in iOS Application Programming Guide.

With this final piece in place, you now have all of the basic information needed to create your own functional iPhone application. The next step is to expand on what you learned here by learning more about the features of iOS. The applications you create should take advantage of the built-in features of iOS to create a pleasant and intuitive user experience. Some of these features are described in "Taking Your Applications Further", but for a complete list, and for information on how to use them, see iOS Application Programming Guide.

Taking Your Applications Further
There are many features associated with iPhone and iPod touch that users take for granted. Some of these features are hardware related, such as the automatic adjustment of views in response to a change in a device's orientation. Others are software related, such as the fact that the built-in iPhone applications all share a single list of contacts. Because so many of the features described next are integral to the basic user experience, you should consider them during your initial design to see how they might fit into your application.

Tracking Orientation and Motion Using the Accelerometers
The accelerometers in iPhone and iPod touch provide valuable input for the system and for your own custom applications. An accelerometer measures changes in velocity along a single linear axis. Both iPhone and iPod touch have three accelerometers to measure changes along each of the primary axes in three-dimensional space, allowing you to detect motion in any direction.

Figure 5  Accelerometer axes

Although you might not think measuring changes in acceleration would be very useful, in reality there is a lot you can do with the information. The force of gravity is constantly trying to pull objects to the ground. This force results in a measurable amount of acceleration toward the ground even when the device is at rest. By tracking which accelerometers are registering this acceleration, and the extent of that acceleration, you can detect the physical orientation of a device in 3D space with a fair amount of accuracy. You can then apply this orientation as input to your application.

The system uses the accelerometers to monitor a device's current orientation and to notify your application when that orientation changes. If your application's interface can be displayed in both landscape and portrait mode, you should incorporate view controllers into your basic design. The UIViewController class provides the infrastructure needed to rotate your interface and adjust the position of views automatically in response to orientation changes.

If you want direct access to the raw accelerometer data, you can get it using the shared UIAccelerometer object in UIKit. The UIAccelerometer object reports the current accelerometer values at a configurable interval. You can also use the data to detect the device's orientation or to detect other types of instantaneous motion, such as the user shaking the device back and forth. You can then use this information as input to a game or other application. For examples of how to configure the UIAccelerometer object and receive accelerometer events, see "Accessing Accelerometer Events" in iOS Application Programming Guide.

Accessing the User¶s Contacts

The user's list of contacts is an important resource that all system applications share. The Phone, Mail, and SMS Text applications use it to identify people the user needs to contact and to facilitate basic interactions such as starting a phone call, email, or text message. Your own applications can access this list of contacts for similar purposes or to get other information relevant to your application's needs.

Figure 6  Accessing the user's contacts

iOS provides both direct access to the user's contacts and indirect access through a set of standard picker interfaces. Using direct access, you can obtain the contact information directly from the contacts database. You might use this information in cases where you want to present contact information in a different way or filter it based on application-specific criteria. In cases where you do not need a custom interface, however, iOS also provides a set of standard system interfaces for picking and creating contacts. Incorporating these interfaces into your applications requires little effort but makes your application look and feel like it's part of the system.

You access the user's contact information using the Address Book and Address Book UI frameworks. For more information about these frameworks, see Address Book Framework Reference for iOS and Address Book UI Framework Reference for iOS.

Getting the User¶s Current Location
Devices that run iOS are meant for users on the go, so the software you write for these devices should take this fact into account. And because the Internet and web make it possible to do business anywhere, being able to tailor information to the user's current location can make for a compelling user experience. After all, why list coffee shops in New York for someone who is thirsty and currently in Los Angeles?

That's where the Core Location framework can help. The Core Location framework monitors signals coming from cell phone towers and Wi-Fi hotspots and uses them to triangulate the user's current position. You can use this framework to grab an initial location fix only, or you can be notified whenever the user's location changes. With this information, you can filter the information your application provides or use it in other ways. For an example of how to get location data in your application, see "Getting the User's Current Location" in iOS Application Programming Guide.

Playing Audio and Video
iOS supports audio features in your application through the Core Audio and OpenAL frameworks, and provides video playback support using the Media Player framework. Core Audio provides an advanced interface for playing, recording, and manipulating sound and for parsing streamed audio. You can use it to play back simple sound effects or multichannel audio, mix sounds and position them in an audio field, and even trigger the vibrate feature of an iPhone. If you are a game developer and already have code that takes advantage of OpenAL, you can use your code in iOS to position and play back audio in your games. The Media Player framework is what you use to play back full-screen video files. This framework supports the playback of many standard movie file formats and gives you control over the playback environment, including whether to display user controls and how to configure the aspect ratio of video content. Game developers might use this framework to play cut scenes or other prerendered content, while media-based applications can also use this framework to play back movie files.

Figure 7  Playing back custom video

For information about the media technologies in iOS, see Multimedia Programming Guide.

Taking Pictures with the Built-in Camera
The Camera application on iPhone lets users take pictures and store them in a centralized photo library, along with the other pictures they upload from their computer. And although the iPod touch has no camera, it does have a photo library to hold the user's uploaded pictures. iOS provides access to both of these features through the UIImagePickerController class in the UIKit framework.

Figure 8  The iPhone camera

The UIImagePickerController class provides the implementation for both the camera and photo library interfaces for your application. These are the standard system interfaces used by other applications, including the Camera and Photos applications. When you display the picker interface, the picker controller takes care of all of the required user interactions and returns the resulting image to your application. For information on how to use the picker interfaces, see "Taking Pictures with the Camera" and "Picking a Photo from the Photo Library" in iOS Application Programming Guide.
