Dragging an image on iPhone

Often we want to drag an image view or other control around the main view. Since all user interface controls inherit from UIView, this is a simple matter of adjusting the view’s frame at runtime in response to touch events. Let’s see how it’s done.

Start Xcode, select “Create a new Xcode project,” choose the Single View Application template, and click Next. Name the project DraggingView, and select options as shown here:

Click Next, choose a location to save the project, and click Create.

First, we must set up some properties of the ViewController object. Open ViewController.h and make the following modifications:

@interface ViewController : UIViewController

@property (nonatomic, weak) IBOutlet UIImageView *image;
@property (nonatomic, assign) CGPoint offset;

@end

The image property will be wired to a UIImageView control that we will set up in the nib file, and the offset property is to keep track of the difference in x and y from the touch to the upper left corner of the image view.
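
To make the arithmetic concrete, here is a small worked example with made-up numbers (not part of the project code):

// Suppose a touch lands at (120, 200) while the image view's origin is (100, 180).
CGPoint touchPoint = CGPointMake(120.0, 200.0);
CGPoint origin = CGPointMake(100.0, 180.0);
CGPoint offset = CGPointMake(touchPoint.x - origin.x, touchPoint.y - origin.y);   // (20, 20)
// While dragging, the image view's new origin is always the touch point minus this offset,
// so the finger keeps holding the same spot on the image.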

Now open ViewController.xib and drag a UIImageView control onto the view. Find a nice image to place in the image view, and copy that image to the project folder. Navigate to File | Add Files to “DraggingView” and select the file you just placed in the project folder. Make sure to check the “Copy items into destination group’s folder (if needed)” checkbox as shown:

Click Add, and the image is added to the project. Now select the image in Interface Builder, and make it the image view’s image as shown here:

Wire up this image control to the “image” outlet:

View controllers inherit four touch-handling methods from UIResponder that help us deal with touch events occurring on their views. We will now implement two of them. Open ViewController.m and make these changes:

#import "ViewController.h"

@interface ViewController ()

@end

@implementation ViewController

@synthesize image, offset;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (touches.count == 1) {
        UITouch *touch = [touches anyObject];
        CGPoint touchPoint = [touch locationInView:self.view];
        if (touchPoint.x > self.image.frame.origin.x &&
            touchPoint.x <= self.image.frame.origin.x + self.image.frame.size.width &&
            touchPoint.y > self.image.frame.origin.y &&
            touchPoint.y <= self.image.frame.origin.y + self.image.frame.size.height) {
            // The touch is inside the image view; remember how far it is
            // from the image view's upper left corner.
            self.offset = CGPointMake(touchPoint.x - self.image.frame.origin.x,
                                      touchPoint.y - self.image.frame.origin.y);
        }
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];

    // Reposition the image view so the finger stays on the same spot within it,
    // preserving the original width and height.
    self.image.frame = CGRectMake(touchPoint.x - self.offset.x,
                                  touchPoint.y - self.offset.y,
                                  self.image.bounds.size.width,
                                  self.image.bounds.size.height);
}

@end

As always, we first synthesize the properties. In touchesBegan:withEvent:, we filter the touches to make sure that only one finger is on the screen and that the touch falls within the image view’s frame. If it does, we record the offset from the touch point to the image view’s upper left corner.
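
Incidentally, the hit test can be written more compactly with CGRectContainsPoint. The following is just an equivalent sketch of touchesBegan:withEvent:, not the listing above:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (touches.count != 1) return;

    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];

    // Let Core Graphics decide whether the point falls inside the image view's frame.
    if (CGRectContainsPoint(self.image.frame, touchPoint)) {
        self.offset = CGPointMake(touchPoint.x - self.image.frame.origin.x,
                                  touchPoint.y - self.image.frame.origin.y);
    }
}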

The touchesMoved:withEvent: method first finds the touch. We don’t repeat the filtering here; touchesBegan:withEvent: has already recorded the offset, so all that remains is to move the image view, adjusting for that offset.
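
Strictly speaking, touchesMoved:withEvent: will fire for a drag that starts anywhere on the view, not just on the image. If you would rather the image ignore such drags, one option is a simple flag. This is only a sketch, not part of the original recipe; the dragging property is an assumed addition to the existing class extension in ViewController.m:

// Assumed addition to the class extension:
//   @property (nonatomic, assign) BOOL dragging;
// (Add @synthesize dragging; if your Xcode version does not auto-synthesize.)
// In touchesBegan:withEvent:, set self.dragging = YES inside the hit-test branch
// and NO otherwise; also reset it to NO in touchesEnded:withEvent:.

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!self.dragging) return;   // the drag did not start on the image, so do nothing

    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    self.image.frame = CGRectMake(touchPoint.x - self.offset.x,
                                  touchPoint.y - self.offset.y,
                                  self.image.bounds.size.width,
                                  self.image.bounds.size.height);
}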

When the app is run, you can move the image about the screen by dragging it…
