
iPhone SDK 3 Programming: Advanced Mobile Development for Apple iPhone and iPod touch, Part 7


Location Awareness

Listing 13.8 shows the implementation of the application delegate class. The applicationDidFinishLaunching: method simply creates a view controller of type LocationsViewController and uses it as the root controller for a navigation controller. The view of the navigation controller is then added as a subview to the main window, and the main window is made visible.

Listing 13.8 The implementation of the application delegate class used in the tracking application

#import "Location3AppDelegate.h"

@implementation Location3AppDelegate

@synthesize window;

- (void)applicationDidFinishLaunching:(UIApplication *)application {
    window = [[UIWindow alloc]
        initWithFrame:[[UIScreen mainScreen] bounds]];
    ctrl = [[LocationsViewController alloc]
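A minimal sketch of the complete method, following the description above (the instance variable names ctrl and navCtrl and the nil nib name are assumptions):

- (void)applicationDidFinishLaunching:(UIApplication *)application {
    window = [[UIWindow alloc]
        initWithFrame:[[UIScreen mainScreen] bounds]];
    // Create the root view controller and wrap it in a navigation controller
    ctrl = [[LocationsViewController alloc] initWithNibName:nil bundle:nil];
    navCtrl = [[UINavigationController alloc] initWithRootViewController:ctrl];
    // Add the navigation controller's view to the window and show it
    [window addSubview:navCtrl.view];
    [window makeKeyAndVisible];
}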

The view controller maintains an array of CLLocation objects representing the sampled movements, as well as "Previous" and "Next" bar buttons. In addition, the view controller maintains a reference to a web view for visualizing the locations sampled.

Listing 13.9 The declaration of the LocationsViewController view controller class used in the tracking application


UIBarButtonItem *rightButton, *leftButton;

Listing 13.10 The implementation of the LocationsViewController view controller class used in the tracking application

-(void)centerMap:(NSUInteger)index {
    CLLocation *loc = [locations objectAtIndex:index];
    NSString *js = [NSString stringWithFormat:
        @"document.createTextNode(\"Loc: (%i/%i), Time: %@\"));",
        [loc coordinate].latitude, [loc coordinate].longitude,


self.navigationItem.rightBarButtonItem = rightButton;

leftButton = [[UIBarButtonItem alloc]
    initWithTitle:@"Previous"
    style:UIBarButtonItemStyleDone
    target:self action:@selector(prev)];

locations = [[NSMutableArray arrayWithCapacity:10] retain];

locationMgr = [[CLLocationManager alloc] init];
locationMgr.distanceFilter = MIN_DISTANCE;
locationMgr.delegate = self;


noUpdates = 0;

CGRect rectFrame = [UIScreen mainScreen].applicationFrame;
webView = [[UIWebView alloc] initWithFrame:rectFrame];

NSString *htmlFilePath =
    [[NSBundle mainBundle] pathForResource:@"map3" ofType:@"html"];
NSData *data = [NSData dataWithContentsOfFile:htmlFilePath];
[webView loadData:data MIMEType:@"text/html"
    textEncodingName:@"utf-8" baseURL:[NSURL

The HTML page loads a map provided by Google. As you will see shortly, we will use JavaScript to modify the appearance of the map dynamically.

Listing 13.11 The HTML page used for displaying a Google map for the geo-tracking application

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"


</head>

<body onload="initialize()" onunload="GUnload()">

<div id="map_canvas" style="width: 500px; height: 500px">

The right button's title then changes to "Next" and the first location is pointed out on the map.

The method centerMap: is used to display a location on the map. The method takes as an input parameter the index of the location in the array of sampled locations. It extracts the latitude and longitude information from the location, sets the center of the map to that location, and pans to the center. In addition, it opens an information window with the time of the sampling of the location. All of this is done in JavaScript such as the code shown below. Finally, we execute the JavaScript code using the web view's method stringByEvaluatingJavaScriptFromString:.

var map = new GMap2(document.getElementById("map_canvas"));
map.setMapType(G_HYBRID_MAP);
map.setCenter(new GLatLng(37.331689, -122.030731), 18);
map.panTo(map.getCenter());
map.openInfoWindow(map.getCenter(),
    document.createTextNode("Loc: (1/1), Time: 2008-08-06 19:51:27 -0500"));

Figure 13.3 A screenshot of the tracking application while sampling movements


Figure 13.4 A screenshot of the tracking application while viewing a sampled location.

Figure 13.3 shows a screenshot of the tracking application while sampling movements, and Figure 13.4 shows a screenshot of the tracking application while viewing one of those sampled locations.

The application poses some ethical (and maybe legal) issues. If you find a need to launch this application and hide it in someone's car or bag, you should think again! Spying is not nice and it may land you in jail. Moms, of course, are an exception! One may want to modify the application and add real-time reporting of movements to interested parties. This is left to the reader as an exercise.

13.5 Working with ZIP Codes

The United States Postal Service (USPS) uses a coding system to help in the efficient distribution of mail in the US. Each potential recipient of mail is thought to live in a specific zone represented by a Zone Improvement Plan (ZIP) code. ZIP codes are, in theory, tied to geographical locations. There are various databases available on ZIP codes. These databases differ in their accuracy and pricing. Databases giving the latitude and longitude of a given ZIP code can be thought of as describing the center of the ZIP code servicing area. There are several places where you can buy US ZIP code databases. You can even download a recent database for free from the site in [1].


1. Create an SQLite zipcodes table. To search efficiently, it is advisable to represent your data in a database. The following table can be used to store the ZIP code data:

CREATE TABLE zipcodes (
    zipcode int NOT NULL PRIMARY KEY,
    latitude float(10,8),
    longitude float(10,8),
    state varchar(2),
    city varchar(128),
    county varchar(128))

The zipcode will be our primary key and, for each ZIP code, we store the latitude, longitude, state, city, and county.

2. Populate the zipcodes table. Populate the table with the ZIP code geographical data obtained from the text file. The data is stored in a comma-separated ASCII file. Use an NSScanner object for value extraction. The extracted tokens of each line are used as input to an INSERT SQL statement (a sketch of this step is shown after this list).

3. Construct an Objective-C class for answering questions. After you have produced the database for online use, you need to develop a new class that will answer geographical queries. A major query that one would like to ask is: give me all ZIP codes that are within 10 miles of 20007. This query might be implemented with a method having the following signature:

-(NSArray *)zipcodesNearLatitude:(float)lat
                    andLongitude:(float)lon
                  withinDistance:(float)distance;
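A minimal sketch of step 2, assuming line holds one comma-separated record and db is an open sqlite3 handle (the sample record in the comment is illustrative):

// line is, e.g., @"20007,38.9123,-77.0711,DC,Washington,District of Columbia"
NSScanner *scanner = [NSScanner scannerWithString:line];
int zip;
float lat, lon;
NSString *state = nil, *city = nil, *county = nil;

[scanner scanInt:&zip];    [scanner scanString:@"," intoString:NULL];
[scanner scanFloat:&lat];  [scanner scanString:@"," intoString:NULL];
[scanner scanFloat:&lon];  [scanner scanString:@"," intoString:NULL];
[scanner scanUpToString:@"," intoString:&state];
[scanner scanString:@"," intoString:NULL];
[scanner scanUpToString:@"," intoString:&city];
[scanner scanString:@"," intoString:NULL];
county = [[scanner string] substringFromIndex:[scanner scanLocation]];

// Bind the extracted tokens to a parameterized INSERT statement
const char *sql = "INSERT INTO zipcodes VALUES (?, ?, ?, ?, ?, ?)";
sqlite3_stmt *stmt = NULL;
if (sqlite3_prepare_v2(db, sql, -1, &stmt, NULL) == SQLITE_OK) {
    sqlite3_bind_int(stmt, 1, zip);
    sqlite3_bind_double(stmt, 2, lat);
    sqlite3_bind_double(stmt, 3, lon);
    sqlite3_bind_text(stmt, 4, [state UTF8String],  -1, SQLITE_TRANSIENT);
    sqlite3_bind_text(stmt, 5, [city UTF8String],   -1, SQLITE_TRANSIENT);
    sqlite3_bind_text(stmt, 6, [county UTF8String], -1, SQLITE_TRANSIENT);
    sqlite3_step(stmt);
    sqlite3_finalize(stmt);
}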

Let's take a look at a possible implementation of zipcodesNearLatitude:andLongitude:withinDistance:. The method's main focus is the execution and the manipulation of the results of the following SQL statement:

SELECT Z.zipcode FROM zipcodes AS Z WHERE
    Distance(latitude1, longitude1, Z.latitude, Z.longitude) <= distance

This SELECT statement finds all ZIP codes such that the distance between a ZIP code's (latitude, longitude) and a given point (latitude1, longitude1) is within the value distance (in kilometers).

You have learned how to write code for these SQL statements. You have also learned how to create C functions and use them in SQL queries. The Distance() function in the above SQL statement must be implemented by you. Listing 13.12 presents a C implementation.


Listing 13.12 The C implementation of the Distance user-defined function.

void distance(sqlite3_context *context, int nargs,
    double x = sin( latitude1 * pi/180 ) *
               sin( latitude2 * pi/180 ) + cos( latitude1 * pi/180 ) *
               cos( latitude2 * pi/180 ) *
               cos( abs( (longitude2 * pi/180)
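A self-contained sketch of such a user-defined function, assuming the spherical law of cosines, an argument order of (lat1, lon1, lat2, lon2) as used in the SELECT statement above, and a result in kilometers:

#include <math.h>
#include <sqlite3.h>

void distance(sqlite3_context *context, int nargs, sqlite3_value **values) {
    double pi = 3.14159265358979;
    double latitude1  = sqlite3_value_double(values[0]);
    double longitude1 = sqlite3_value_double(values[1]);
    double latitude2  = sqlite3_value_double(values[2]);
    double longitude2 = sqlite3_value_double(values[3]);

    double x = sin(latitude1 * pi/180) * sin(latitude2 * pi/180) +
               cos(latitude1 * pi/180) * cos(latitude2 * pi/180) *
               cos(fabs(longitude2 - longitude1) * pi/180);
    // 6371 km: mean Earth radius (assumed constant)
    sqlite3_result_double(context, 6371.0 * acos(x));
}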

The complete application can be found in the Location3 project available in the source downloads.

13.6 Working with the Map Kit API

The Map Kit framework provides the ability to embed an interactive map as a subview in an application. The map behaves similarly to the one used by the Maps.app application that ships with the iPhone OS.

You can specify the center of this map and annotate it with any number of items. The map has a delegate which allows it to communicate touch events on the annotated objects that you provide.


13.6.1 The MKMapView class

The MKMapView class is the center of the Map Kit API. It is a subclass of UIView, which means that you can create an instance of it as you do with any UIView class.

To use this class, you need to add MapKit.framework to your application and #import <MapKit/MapKit.h>. Adding a framework to your project is explained in Section D.4.

The following shows a code fragment that creates an instance of this class and adds it as a subview:

MKMapView *mapView =
    [[[MKMapView alloc] initWithFrame:
        [UIScreen mainScreen].applicationFrame] autorelease];
[self.view addSubview:mapView];

The above code specifies the size of the map to be full-screen. You can specify any dimension you want.

13.6.2 The MKCoordinateRegion structure

When you present a map, you need to specify the area that this map should display and the zoom level for that area. The MKCoordinateRegion structure encapsulates this as shown below:
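typedef struct {
    CLLocationCoordinate2D center;  // center point of the region
    MKCoordinateSpan span;          // latitudeDelta / longitudeDelta in degrees
} MKCoordinateRegion;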

The MKMapView class declares the following property for use as its region:

@property (nonatomic) MKCoordinateRegion region

The following shows an example of setting up the region of a map:
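A minimal sketch, assuming mapView is the MKMapView instance created above (the span values are illustrative):

MKCoordinateRegion region;
region.center.latitude  =  37.331689;
region.center.longitude = -122.030731;
region.span.latitudeDelta  = 0.01;
region.span.longitudeDelta = 0.01;
mapView.region = region;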


13.6.3 The MKAnnotation protocol

Locations that you wish to show on the map can be specified as annotations. An annotation is composed of a data model and a view. The data model specifies the title, subtitle, and latitude/longitude of the location. The view is a visual representation of the data model.

The MKAnnotation protocol describes the data model of the annotation. This protocol is declared as follows:

@protocol MKAnnotation <NSObject>

@property (nonatomic, readonly) CLLocationCoordinate2D coordinate;

For example, the following shows the declaration of a data model Person that adopts the MKAnnotation protocol:

@interface Person : NSObject <MKAnnotation> {
    NSString *_title, *_subTitle;
    CLLocationCoordinate2D _coordinate;
}
@property (nonatomic, readonly) CLLocationCoordinate2D coordinate;
@property (nonatomic, readonly) NSString *title;
@property (nonatomic, readonly) NSString *subtitle;
@end

The following shows the implementation of the Person class.


@implementation Person

@synthesize coordinate=_coordinate, title=_title, subtitle=_subTitle;

-(id)initWithTitle:(NSString *)theTitle
          subTitle:(NSString *)theSubTitle
     andCoordinate:(CLLocationCoordinate2D)theCoordinate {
    if(self = [super init]){
        _title = [theTitle copy];
        _subTitle = [theSubTitle copy];
        _coordinate = theCoordinate;
    }
    return self;
}

@end

13.6.4 The MKAnnotationView class

To show the annotation to the user on the screen, you need to set a delegate object on the map view instance and implement a specific method that returns a view for a given annotation.

The delegate property of the MKMapView class is declared as follows:

@property (nonatomic, assign) id <MKMapViewDelegate> delegate

The delegate method that is called to retrieve a visual representation of an annotation is declared as follows:

- (MKAnnotationView *)mapView:(MKMapView *)mapView
            viewForAnnotation:(id <MKAnnotation>)annotation;

The MKAnnotationView class is a subclass of UIView. Rather than always creating a new instance of this class and returning it from the delegate method above to represent the annotation object, you are encouraged to reuse existing views whose annotation objects are outside the current viewing area of the map.


The MKMapView method dequeueReusableAnnotationViewWithIdentifier: should be called before attempting to create a new view. This method is declared as follows:
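- (MKAnnotationView *)dequeueReusableAnnotationViewWithIdentifier:(NSString *)identifier;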

view = [[[MKAnnotationView alloc]
    initWithAnnotation:annotation reuseIdentifier:@"ID1"] autorelease];
}

You can, if you choose to, give the view an image. This can be achieved by setting the image property of the annotation view.

The callout view

An annotation view can display a standard callout bubble when tapped. To enable this feature, you need to set the canShowCallout property of the MKAnnotationView instance to YES.

If the callout bubble is enabled, the title and the subtitle of the corresponding annotation are displayed when the user taps on the view.

You can also configure a right and a left accessory view if you want to. The right callout accessory view property is declared as follows:

@property (retain, nonatomic) UIView *rightCalloutAccessoryView

As you can see, it can be just a simple view. Normally, however, this property is set to an accessory button (e.g., UIButtonTypeDetailDisclosure) used by the user to get more information about the annotation. The left callout view is declared similarly.

There is a default behavior that the API provides for you if you make the right/left callout view an instance of UIControl or one of its subclasses. This default behavior is to invoke a specific method in the delegate when the user taps on the accessory view. You can, however, bypass this default behavior and handle the touch events yourself.

The following code fragment creates/dequeues an annotation view and configures both its right and left callout accessory views. The right callout accessory view is configured to be a button, while the left callout accessory view is configured to be a simple yellow view.
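A sketch of such a delegate method (the reuse identifier and the 30x30 left-accessory frame are assumptions):

- (MKAnnotationView *)mapView:(MKMapView *)theMapView
            viewForAnnotation:(id <MKAnnotation>)annotation {
    MKAnnotationView *view =
        [theMapView dequeueReusableAnnotationViewWithIdentifier:@"ID1"];
    if (view == nil) {
        view = [[[MKAnnotationView alloc]
            initWithAnnotation:annotation reuseIdentifier:@"ID1"] autorelease];
    }
    view.canShowCallout = YES;
    // Right accessory: a detail-disclosure button
    view.rightCalloutAccessoryView =
        [UIButton buttonWithType:UIButtonTypeDetailDisclosure];
    // Left accessory: a simple yellow view
    UIView *left =
        [[[UIView alloc] initWithFrame:CGRectMake(0, 0, 30, 30)] autorelease];
    left.backgroundColor = [UIColor yellowColor];
    view.leftCalloutAccessoryView = left;
    return view;
}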


Figure 13.5 shows the annotation view created by the above code.

When the user taps on any of the right/left accessory views (provided the view is a UIControl), the delegate method mapView:annotationView:calloutAccessoryControlTapped: gets called.

You can provide your own logic in this method. For example, the following code fragment displays an alert view only if the tapped annotation's title is equal to Marge.

- (void)mapView:(MKMapView *)mapView
    annotationView:(MKAnnotationView *)view
    calloutAccessoryControlTapped:(UIControl *)control {
    if([view.annotation.title isEqualToString:@"Marge"]){
        [[[[UIAlertView alloc] initWithTitle:view.annotation.title
            message:view.annotation.subtitle
            delegate:nil cancelButtonTitle:@"OK"
            otherButtonTitles:nil] autorelease] show];
    }
}

13.6.5 The MKUserLocation class

The map view provides an annotation for the user's location. This annotation is an instance of the MKUserLocation class.

To access the user's location annotation object, you can use the userLocation property, which is declared as follows:

@property (nonatomic, readonly) MKUserLocation *userLocation

If you want to use the built-in view for the user's location annotation, you need to return nil in the mapView:viewForAnnotation: delegate method. For example:


Figure 13.5 An example of an annotation view.

- (MKAnnotationView *)mapView:(MKMapView *)mapView
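A minimal sketch of such a delegate method, returning nil so that the built-in user-location view is used (handling of other annotations is elided):

- (MKAnnotationView *)mapView:(MKMapView *)theMapView
            viewForAnnotation:(id <MKAnnotation>)annotation {
    if ([annotation isKindOfClass:[MKUserLocation class]]) {
        return nil; // use the built-in view for the user's location
    }
    // ... otherwise create or dequeue a custom annotation view ...
    return nil;
}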

Figure 13.6 The default annotation view for the user’s current location

If you do not want the user's location to show up on the map, you can set the map view's showsUserLocation property to NO.


13.6.6 The MKPinAnnotationView class

The MKPinAnnotationView is a subclass of the MKAnnotationView class that you can use as a visual representation of your annotations. This view represents a pin icon. You can specify the color of this pin as well as whether the pin should be animated when it is dropped on the map.

For example, the following code fragment creates a new pin view if one is not available, configures the pin to animate when it is dropped, and gives it a green color.

// Code continues in the delegate method

return pin; // return a pin for an annotation object
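A fuller sketch, inside mapView:viewForAnnotation:, assuming a reuse identifier of "PinID":

MKPinAnnotationView *pin = (MKPinAnnotationView *)
    [theMapView dequeueReusableAnnotationViewWithIdentifier:@"PinID"];
if (pin == nil) {
    pin = [[[MKPinAnnotationView alloc]
        initWithAnnotation:annotation reuseIdentifier:@"PinID"] autorelease];
}
pin.animatesDrop = YES;                   // animate the drop onto the map
pin.pinColor = MKPinAnnotationColorGreen; // green pin
return pin;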

Figure 13.7 shows the pin annotation view

Figure 13.7 The pin annotation view

Refer to the MapView project in the code downloads for a complete application that utilizes the Map Kit API.

13.7 Summary

In this chapter, we addressed the topic of Location Awareness. First, we talked in Section 13.1 about the Core Location framework and how to use it to build location-aware applications. After that,


Section 13.2 discussed a simple location-aware application. Next, Section 13.3 covered the topic of geocoding. In that section, you learned how to translate postal addresses into geographical locations.

In Section 13.4, you learned how to sample the movement of the device and display that information on maps. After that, Section 13.5 discussed how to relate ZIP codes to geographical information. In that section, you also learned the actual formula that implements the distance between two locations. Finally, Section 13.6 showed you how to utilize the Map Kit API to add an interactive map to your view hierarchy.


Working with Devices

In this chapter, we demonstrate the use of several devices available on the iPhone. Section 14.1 discusses the usage of the accelerometer. In Section 14.2, you learn how to play short and long audio files, how to record audio files, and how to utilize the iPod library. Next, Section 14.3 shows how to play video files. After that, Section 14.4 shows how to obtain iPhone/iPod touch device information. Using the camera and the photo library is described in Section 14.5. After that, Section 14.6 shows you how to obtain state information regarding the battery of the device. Next, we discuss the proximity sensor in Section 14.7. Finally, we summarize the chapter in Section 14.8.

14.1 Working with the Accelerometer

The iPhone is equipped with an easy-to-use accelerometer. The accelerometer provides you with the current orientation of the device in 3D space. You subscribe to these updates with a given frequency (10 updates/s to 100 updates/s) and you receive three floating-point values in each update. These values represent the acceleration along the x, y, and z axes in space. The acceleration on each axis is measured in gs, where g is the acceleration due to gravity on Earth at sea level (1g is equal to 9.80 m/s²).

14.1.1 Basic accelerometer values

If you hold the iPhone in front of you and imagine an axis that goes through the Home button and the earpiece and is orthogonal to the floor, then that axis is the y-axis. Positive values of y indicate that the phone is accelerating up and negative values indicate that it is accelerating down towards the floor. The x-axis goes from right to left, perpendicular to the y-axis. Positive values indicate that the force is towards your right side and negative values indicate that the force is towards the left. The z-axis passes through the device. Negative values indicate that the device is moving away from you and positive values indicate that the force is moving the device towards you.

Due to the force of gravity, the device will report non-zero values on some or all of the axes even if the device is stationary. For example, if you hold the device in front of you in portrait mode as shown


in Figure 14.1, the x- and z-axes will report 0g while the y-axis will report −1g. This basically says that there is no force moving the device to the right/left or forward/backward, but there is a 1g force on the device downwards. This force, of course, is gravity.

If you hold the device in landscape mode as shown in Figure 14.2, the x-axis becomes the axis affected by the force of gravity. The value of the x component of the vector reported by the accelerometer will be 1g. If you hold the device as in Figure 14.3, the value will be −1g. If you rest the iPhone face up on the table, the z reading will be −1g, and if you put it face down, it will report 1g.

If you hold the iPhone facing you as shown in Figure 14.1 and tilt it to the right, the y value will start increasing and the x value will start increasing. If you tilt it to the left, the y value will start increasing and the x value will start decreasing.

Figure 14.1 Stationary iPhone reporting an accelerometer vector of (0,−1, 0)


Figure 14.2 Stationary iPhone reporting an accelerometer vector of (1, 0, 0)

Figure 14.3 Stationary iPhone reporting an accelerometer vector of (−1, 0, 0)

14.1.2 Example

In this section, we present a simple application that demonstrates the use of the accelerometer. The example will show you how to configure the accelerometer and how to intercept a shake, a hug, and a punch. In addition, the application will report when the iPhone is in portrait mode with the Home button up or down while being perpendicular to the floor.

To use the accelerometer, follow these steps:


1. Obtain the shared accelerometer object. The application has one accelerometer object. Use the sharedAccelerometer method to obtain that object. The method is declared as follows:

+ (UIAccelerometer *)sharedAccelerometer

2. Configure the accelerometer. Configure the frequency of updates using the updateInterval property. This property is declared as follows:

@property(nonatomic) NSTimeInterval updateInterval;

NSTimeInterval is declared as a double. The value you specify for this property ranges from 0.1 seconds (a frequency of 10 Hz) to 0.01 seconds (a frequency of 100 Hz).

You also need to configure the delegate property, delegate, which is declared as follows:

@property(nonatomic, assign) id<UIAccelerometerDelegate> delegate

The protocol UIAccelerometerDelegate has a single optional method, accelerometer:didAccelerate:, which is declared as follows:

- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration;

The method receives the accelerometer object and a UIAcceleration instance. The UIAcceleration object holds the values for the 3D vector (x, y, and z) and a timestamp (timestamp).

Listing 14.1 shows the application delegate class declaration for the accelerometer example. The application delegate adopts both the UIApplicationDelegate and UIAccelerometerDelegate protocols. In addition, it maintains the previous accelerometer reading in the accelerationValues instance variable.

Listing 14.1 The application delegate class declaration for the accelerometer example

Listing 14.2 shows the implementation of the application delegate class

Listing 14.2 The implementation of the application delegate class used in the accelerometer example

#import "AccelAppDelegate.h"

#define BETWEEN(arg, v1, v2) ((arg >= v1) && (arg <= v2 ))


BOOL x_big_difference = (fabs(x - accelerationValues[0]) > 3);
BOOL y_big_difference = (fabs(y - accelerationValues[1]) > 3);
BOOL z_big_difference = (fabs(z - accelerationValues[2]) > 3);
int axes = x_big_difference + y_big_difference + z_big_difference;

BOOL x_change = (fabs(x - accelerationValues[0]) < 1);
BOOL y_change = (fabs(y - accelerationValues[1]) < 1);
BOOL z_change = (fabs(z - accelerationValues[2]) >= 3);
if(x_change && y_change && z_change){

window = [[UIWindow alloc] initWithFrame:fullScreen];

UIAccelerometer *accelerometer =
    [UIAccelerometer sharedAccelerometer];
accelerometer.updateInterval = 0.1; // 10Hz


To recognize a shake, we check for a significant change in acceleration on at least two axes. We use a 3g value difference for each axis. For example, the statement:

BOOL x_big_difference = (fabs(x - accelerationValues[0]) >3);

will result in the value YES (1) if the difference between the previous and the current acceleration on the x-axis is larger than 3g.

To recognize that the iPhone is in portrait mode, with the Home–earpiece axis orthogonal to the floor and the Home button at the bottom, we make sure that the x and z values are 0 within some tolerance interval, and that the y value is about −1. Similarly, to recognize that the iPhone is upside down, the value of y must be around 1g.

To check for an iPhone hug/punch, the method checks for a major acceleration on the z-axis with a negligible change on the x- and y-axes. If the z value has changed towards a negative acceleration, we interpret that as a punch. If, on the other hand, the value has changed to a positive acceleration, we interpret that as a hug.
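Putting these checks together, a sketch of the delegate method (the log messages and the 0.1/0.9/1.1 tolerance bounds are assumptions; accelerationValues and BETWEEN are those declared earlier):

- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration {
    double x = acceleration.x, y = acceleration.y, z = acceleration.z;

    BOOL x_big_difference = (fabs(x - accelerationValues[0]) > 3);
    BOOL y_big_difference = (fabs(y - accelerationValues[1]) > 3);
    BOOL z_big_difference = (fabs(z - accelerationValues[2]) > 3);
    int axes = x_big_difference + y_big_difference + z_big_difference;

    if (axes >= 2) {
        NSLog(@"Shake detected");
    }
    else if (fabs(x - accelerationValues[0]) < 1 &&
             fabs(y - accelerationValues[1]) < 1 &&
             fabs(z - accelerationValues[2]) >= 3) {
        // Large change on z only: positive change = hug, negative change = punch
        NSLog(@"%@", (z > accelerationValues[2]) ? @"Hug" : @"Punch");
    }
    else if (BETWEEN(x, -0.1, 0.1) && BETWEEN(z, -0.1, 0.1)) {
        if (BETWEEN(y, -1.1, -0.9))
            NSLog(@"Portrait, Home button at the bottom");
        else if (BETWEEN(y, 0.9, 1.1))
            NSLog(@"Portrait, upside down");
    }

    accelerationValues[0] = x;
    accelerationValues[1] = y;
    accelerationValues[2] = z;
}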

14.2 Working with Audio

In this section, you learn how to play short and long audio files, how to record audio files, and how to utilize the iPod library.

14.2.1 Playing short audio files

In this section, we demonstrate the playing of short audio files (< 30 seconds in length). To play a short sound file, you first register the file as a system sound and obtain a handle. After that you can play the sound using this handle. When you are finished and do not want to play this sound again, you deallocate that system sound.


To register a sound file as a system sound, use the function AudioServicesCreateSystemSoundID(), which is declared as follows:
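OSStatus AudioServicesCreateSystemSoundID(
    CFURLRef inFileURL,
    SystemSoundID *outSystemSoundID);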

The function returns 0 to indicate successful registration of the system sound.

To play the system sound, use the AudioServicesPlaySystemSound() function, which is declared as:

void AudioServicesPlaySystemSound(SystemSoundID inSystemSoundID)

You pass in the system sound handle you obtained from the previous function. The predefined identifier kSystemSoundID_Vibrate can be used to trigger vibration.

To deallocate the system sound, use the function AudioServicesDisposeSystemSoundID(), which is declared as follows:

OSStatus AudioServicesDisposeSystemSoundID(SystemSoundID inSystemSoundID)

You pass in the system sound handle which you obtained from the registration function.


The applicationDidFinishLaunching: method in Listing 14.4 first obtains the absolute file path of the sound.caf file. Then, an NSURL object is created from this file path using the method fileURLWithPath:isDirectory:. The system sound is then registered. The types CFURL and NSURL are interchangeable or, in Cocoa's terminology, "toll-free bridged". Therefore, we pass in the NSURL object in place of the reference to CFURL, CFURLRef. If there is no error, the sound is played.

The play: method plays the sound and then schedules a timer to invoke the play: method in one minute.

Listing 14.4 The implementation of the application delegate class demonstrating the playing of small audio files

#import "AudioAppDelegate.h"

@implementation AudioAppDelegate

- (void)applicationDidFinishLaunching:(UIApplication *)application {
    CGRect screenFrame = [[UIScreen mainScreen] bounds];
    window = [[UIWindow alloc] initWithFrame:screenFrame];
    NSString *filePath = [[NSBundle mainBundle]
        pathForResource:@"sound" ofType:@"caf"];
    NSURL *aFileURL = [NSURL fileURLWithPath:filePath isDirectory:NO];
    OSStatus error =
        AudioServicesCreateSystemSoundID((CFURLRef)aFileURL,
            &soundID); // soundID: a SystemSoundID instance variable (name assumed)

14.2.2 Recording audio files

To record and to play long audio files, you need to utilize the AVFoundation framework. Just add this framework as explained in Section D.4 and include the following header files:

#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>


The AVAudioRecorder class adds audio recording features to your application. To use it, you first need to allocate it and then initialize it using the initWithURL:settings:error: method, which is declared as follows:

- (id)initWithURL:(NSURL *)url settings:(NSDictionary *)settings
            error:(NSError **)outError;

You pass in an NSURL instance that represents a file in the first argument. In the second argument you pass in a dictionary holding the key/value pairs of the recording session. The third argument is a reference to an NSError pointer.

After initializing the recorder instance, you can send it a record message to start recording. To pause recording, send it a pause message. To resume from a pause, send it a record message. To stop recording and close the audio file, send it a stop message.

The following method demonstrates the basic use of this class. It assumes that it is the action of a UIButton instance. If we are currently recording (the recorder instance is not nil), the method simply stops the recording and changes the button's title to Record.

-(void)recordStop {
    if(self.recorder){
        [recorder stop];
        self.recorder = nil;
        UIButton *button = (UIButton*)[self.view viewWithTag:1000];
        [button setTitle:@"Record" forState:UIControlStateNormal];
    }
    else {
        // ... create and initialize the recorder and start recording,
        //     as described in the text below ...
        UIButton *button = (UIButton*)[self.view viewWithTag:1000];
        [button setTitle:@"Stop" forState:UIControlStateNormal];
    }
}

If we are not currently recording, the method creates an instance of the recorder and initializes it with a URL pointing to the rec.aif audio file in the tmp directory of the Home directory of the application.


We use minimal settings for the recording session. We specify 16 kHz for the sample rate, two audio channels, and a Linear PCM audio format.
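A minimal sketch of such a settings dictionary and of the recorder creation (the standard AVFoundation keys are used; recorder is the property described above):

NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithFloat:16000.0],             AVSampleRateKey,
    [NSNumber numberWithInt:2],                     AVNumberOfChannelsKey,
    [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
    nil];
NSString *path = [NSTemporaryDirectory()
    stringByAppendingPathComponent:@"rec.aif"];
NSError *error = nil;
self.recorder = [[[AVAudioRecorder alloc]
    initWithURL:[NSURL fileURLWithPath:path]
       settings:settings error:&error] autorelease];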

Once the recorder has been initialized, we send it a record message and change the button's title to Stop.

14.2.3 Playing audio files

The counterpart of the AVAudioRecorder class is the AVAudioPlayer class. Using AVAudioPlayer you can play audio files of any size.

You allocate an instance of the audio player, and then initialize it using the initWithContentsOfURL:error: method, passing in the URL of the file you want to play and a pointer (possibly nil) to an NSError instance. After that, you send it a play message to start playing the audio file. The following code fragment shows an example:
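A minimal sketch (the file name song.mp3 is an assumption):

NSString *path = [[NSBundle mainBundle] pathForResource:@"song" ofType:@"mp3"];
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc]
    initWithContentsOfURL:[NSURL fileURLWithPath:path] error:&error];
if (player) {
    [player play];
}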

14.2.4 Using the media picker controller

The MediaPlayer framework provides a view controller that can be used to pick media items from the iPod library. Media items include music, podcasts, audio books, etc.

You present this controller modally and ask the user to select the media items from the iPod library. Once the user taps on the Done button, you receive a collection of these items (no deletion, of course). You can then do whatever you intend to do with these items. For example, you could put them in a queue and play them all.

The MPMediaPickerController class

The media picker is represented by the MPMediaPickerController class. There are two initializers for this class:


• init. This method initializes the media picker to be able to pick any media type.

• initWithMediaTypes:. This method initializes the media picker to pick specific media types. The method is declared as follows:

- (id)initWithMediaTypes:(MPMediaType)mediaTypes

MPMediaType is declared as an integer and can be set to any combination of the following flags:

• MPMediaTypeMusic. This flag is used to denote the music type.

• MPMediaTypePodcast. This flag is used to denote the podcast type.

• MPMediaTypeAudioBook. This flag is used to denote the audiobook type.

• MPMediaTypeAnyAudio. This flag is used to denote the general audio type.

• MPMediaTypeAny. This flag is used to denote any media type.

After creating and initializing the controller, you can specify non-default behavior by setting its properties. For example, to allow the user to select more than one media item, you can set the allowsPickingMultipleItems property to YES (the default is NO).

You can also specify some text to be shown above the navigation bar by setting the prompt property to any NSString instance.

The following code fragment creates, initializes, configures, and presents a media picker:
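A sketch of the creation and configuration preceding the presentation call shown below (the prompt text and the music-only media type are assumptions):

MPMediaPickerController *mp = [[[MPMediaPickerController alloc]
    initWithMediaTypes:MPMediaTypeMusic] autorelease];
mp.allowsPickingMultipleItems = YES;   // allow selecting several items
mp.prompt = @"Select items to play";
mp.delegate = self;                    // adopts MPMediaPickerControllerDelegate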

[self presentModalViewController:mp animated:YES];

The media controller delegate

The media controller has a delegate property that you can set. This property is declared as follows:

@property(nonatomic, assign) id<MPMediaPickerControllerDelegate> delegate

The MPMediaPickerControllerDelegate protocol declares two optional methods. The first method gives you the selected items and is invoked when the user taps on the Done button. This method is declared as follows:

- (void)mediaPicker:(MPMediaPickerController *)mediaPicker

didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection;


The MPMediaItemCollection class represents a sorted set of media items from the iPod library. You can obtain an NSArray of the items by accessing the items property.

If the user chooses to cancel the media picker, the mediaPickerDidCancel: method is called. This method is declared as follows:

- (void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker

You should implement both of these methods and dismiss the controller in each of them.
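A minimal sketch of both delegate methods:

- (void)mediaPicker:(MPMediaPickerController *)mediaPicker
  didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection {
    [self dismissModalViewControllerAnimated:YES];
    // ... use mediaItemCollection.items (e.g., queue the items for playback) ...
}

- (void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker {
    [self dismissModalViewControllerAnimated:YES];
}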

The MPMediaItem class

Media items are represented by the MPMediaItem class. Every media item has a unique identifier. In addition, a number of metadata key/value pairs are associated with the item.

You can access the unique identifier or any of the metadata values using the valueForProperty: method, which is declared as follows:

- (id)valueForProperty:(NSString *)property

The following shows some of the predefined properties:

• MPMediaItemPropertyPersistentID. The value for this property is an NSNumber object encapsulating a 64-bit integer (unsigned long long). This number is the unique identifier of the item in the iPod library.

• MPMediaItemPropertyTitle. The value for this property is an NSString object storing the title of the media item.

• MPMediaItemPropertyArtist. The value for this property is an NSString object storing the artist's name for the media item.

• MPMediaItemPropertyPlaybackDuration. The value for this property is an NSNumber object storing the duration (in seconds) of the media item. The duration is stored as a double.

Refer to the MPMediaItem.h header file for a complete list of available keys.

for (MPMediaItem *item in mediaItemCollection.items){
    NSLog(@"\nTitle: %@\nAlbum title: %@\nDuration: %.2f sec",
        [item valueForProperty:MPMediaItemPropertyTitle],
        [item valueForProperty:MPMediaItemPropertyAlbumTitle],
        [[item valueForProperty:MPMediaItemPropertyPlaybackDuration] doubleValue]);
}


14.2.5 Searching the iPod library

If you want to search the iPod library, you need to create and configure a media query. A media query is an instance of the MPMediaQuery class and can be configured with the following two important pieces of information:

• Zero or one grouping scheme. You can group the media items that are returned from executing the query according to a specific grouping scheme. For example, you can ask the MPMediaQuery object to group the result set according to the artist.

• A query filter. A query filter consists of zero or more media property predicates. For a media item to be returned as a result of executing this query, it has to pass through all the media property predicates that make up the filter provided. If you specify no predicates, all media items will be returned from the query.

Grouping method

The groupingType property of the MPMediaQuery class specifies the grouping method of the query. This property is declared as follows:

@property (nonatomic) MPMediaGrouping groupingType

The MPMediaGrouping type is an integer that can hold one of the following values (declared in the MediaPlayer framework):


• MPMediaGroupingArtist. This value is used to specify grouping based on the media item artist.

• MPMediaGroupingAlbumArtist. This value is used to specify grouping based on the media item album artist.

• MPMediaGroupingComposer. This value is used to specify grouping based on the media item composer.

• MPMediaGroupingGenre. This value is used to specify grouping based on the media item genre.

• MPMediaGroupingPlaylist. This value is used to specify grouping based on the media item playlist.

• MPMediaGroupingPodcastTitle. This value is used to specify grouping based on the media item podcast title.

There are several class methods declared in the MPMediaQuery class that give you media queries with different groupings. For example, to create a media query that groups and sorts media items according to the album's name, you can use the following class method:

+ (MPMediaQuery *)albumsQuery

Other class methods include artistsQuery, genresQuery, and playlistsQuery.

Once you have configured the media query object, you can retrieve the resulting media items using the MPMediaQuery property items. This property is declared as follows:

@property(nonatomic, readonly) NSArray *items

The items array holds instances of the MPMediaItem class that match the query. If the result of the query is empty, the array will contain no elements. If, on the other hand, an error occurred during the execution of the query, the value of this property will be nil.

The following code fragment retrieves all songs in the iPod library, grouping them based on the artist name. It then logs specific values of each media item.

MPMediaQuery *query = [MPMediaQuery songsQuery];
[query setGroupingType:MPMediaGroupingArtist];
for(MPMediaItem *item in query.items){
    NSLog(@"\nTitle: %@\nAlbum title: %@\nArtist: %@",
        [item valueForProperty:MPMediaItemPropertyTitle],
        [item valueForProperty:MPMediaItemPropertyAlbumTitle],
        [item valueForProperty:MPMediaItemPropertyArtist]);
}


Each element of the query's collections array is an instance of the MPMediaItemCollection class. An instance of this class represents a set of media items that are sorted and grouped according to some criterion.

To retrieve the items in a collection, you need to access the items property, which is declared as follows:

@property (nonatomic, readonly) NSArray *items

The following code fragment shows the retrieval of the query results in a grouped form:

for(MPMediaItemCollection *mc in query.collections){
    NSLog(@" -");
    for(MPMediaItem *item in mc.items){
        NSLog(@"\nTitle: %@\nAlbum title: %@\nArtist: %@",
            [item valueForProperty:MPMediaItemPropertyTitle],
            [item valueForProperty:MPMediaItemPropertyAlbumTitle],
            [item valueForProperty:MPMediaItemPropertyArtist]);
    }
}

Media property predicate

To specify conditions for the query, you need to add property predicates. A property predicate is an instance of the MPMediaPropertyPredicate class.

To create a predicate instance, you can use one of the following two class methods:

• predicateWithValue:forProperty:comparisonType:. This factory method creates a predicate for a given value and property using an explicit comparison type.

• predicateWithValue:forProperty:. This is a convenience method that behaves similarly to the above factory method, except that it uses an MPMediaPredicateComparisonEqualTo comparison type.

The following code fragment shows the creation of a predicate and the addition of that predicate to a media query:


MPMediaPropertyPredicate *mPredicate =
    [MPMediaPropertyPredicate predicateWithValue:artist
        forProperty:MPMediaItemPropertyArtist];
[query addFilterPredicate:mPredicate];

To add a predicate filter to the query, use the addFilterPredicate: method.

A predicate can only be created for filterable properties. You can check whether a property is filterable by looking it up in the header file or the documentation, or by using the canFilterByProperty: MPMediaItem class method, which is declared as follows:

+ (BOOL)canFilterByProperty:(NSString *)property

14.3 Playing Video

To play video from within your application, you can use the MPMoviePlayerController class. You create and initialize an instance of this class and ask it to play. This controller plays the video file in full-screen mode. When playback is finished, the application's screen becomes visible again.

14.3.1 Using the MPMoviePlayerController class

The following code fragment plays the movie MyMovie.m4v stored in the application's bundle:

NSString *filePath =
    [[NSBundle mainBundle] pathForResource:@"MyMovie" ofType:@"m4v"];
NSURL *fileUrl = [NSURL fileURLWithPath:filePath];
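A sketch of the remaining steps (the variable name moviePlayer is an assumption):

MPMoviePlayerController *moviePlayer =
    [[MPMoviePlayerController alloc] initWithContentsOfURL:fileUrl];
[moviePlayer play];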

After the initialization phase, the controller is asked to play the movie using the method play.

To use the MPMoviePlayerController class, you need to add the Media Player framework as explained in Section D.4.


You can see a complete application that streams a movie off the Internet by looking at the Video1 project available from the source download. Figure 14.4 shows the view just after sending the play message to the controller.

Figure 14.4 Streaming a movie off of the Internet

14.4 Accessing Device Information

The UIDevice class is used to provide information about the iPhone/iPod touch. There is a single instance of this class that can be obtained using the class method currentDevice. The following are some of the pieces of information you can obtain using this instance:

• Unique identifier. You can obtain a string that uniquely identifies the iPhone device using the uniqueIdentifier property. This property is declared as follows:

@property(nonatomic, readonly, retain) NSString *uniqueIdentifier

• Operating system. You can obtain the name of the operating system using the systemName property. This property is declared as follows:

@property(nonatomic, readonly, retain) NSString *systemName

• Operating system version. You can obtain the OS version using the systemVersion property. This property is declared as follows:

@property(nonatomic, readonly, retain) NSString *systemVersion

• The model. You can distinguish between the iPhone and the iPod touch using the model property. This property is declared as follows:

@property(nonatomic, readonly, retain) NSString *model


• Device orientation. The orientation of the device can be obtained using the orientation property. This property is declared as follows:

@property(nonatomic, readonly) UIDeviceOrientation orientation

Possible values are:
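• UIDeviceOrientationUnknown

• UIDeviceOrientationPortrait

• UIDeviceOrientationPortraitUpsideDown

• UIDeviceOrientationLandscapeLeft

• UIDeviceOrientationLandscapeRight

• UIDeviceOrientationFaceUp

• UIDeviceOrientationFaceDown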

14.5 Taking and Selecting Pictures

In this section, you learn how to use the camera for taking pictures. You learn that you do not have direct access to the camera or the photo library, but rather you use a supplied controller that handles the user's interaction for taking and editing the picture. The controller provides you with the final image when the user finishes. The same controller can be used to pick photos stored in the user's library. This section is organized as follows. In Section 14.5.1, we outline the major steps needed to access the camera and the photo library. Then, in Section 14.5.2, we provide a detailed example demonstrating taking and picking pictures.

14.5.1 Overall approach

To access the camera or to select pictures from the user's library, you have to use a system-supplied interface that is provided to you. The main class used for either taking new pictures or selecting existing ones is UIImagePickerController. The major steps for taking/selecting pictures are as follows:

1. Check availability of the action. Whether you would like to take a new picture or select an existing one, you need to check if this function is available to you. The UIImagePickerController class method used for this purpose is isSourceTypeAvailable:.

UIImagePicker-2 Create the controller instance If the specified action is available, you need to create an

instance of UIImagePickerController, initialize it, and configure it with the specifiedfunction If no source type is available, the controller should not be allocated

3. Set the delegate. The UIImagePickerController will be responsible for the user's interaction while picking or taking a new picture. You need to set the delegate to an object and implement specific methods in order to receive the result. The delegate follows the UIImagePickerControllerDelegate protocol.
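A sketch combining the three steps, assuming the camera is the desired source and that self adopts the required delegate protocol:

if ([UIImagePickerController isSourceTypeAvailable:
         UIImagePickerControllerSourceTypeCamera]) {
    UIImagePickerController *picker =
        [[[UIImagePickerController alloc] init] autorelease];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
}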
