I'm developing an iPhone app for iOS 4.2. In this app I download images and save them to internal storage (the Documents directory).
Then I show the first image in a UIImageView. The user can drag a finger over the UIImageView (touchesMoved), and when they do I load another image: dragging down loads one image, dragging up loads another, and the same goes for right and left.
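(For reference, loading one of the saved images back from the Documents directory looks roughly like this; the file name is just a placeholder:)
// Load a previously downloaded image from the Documents directory
// ("image1.png" is only an example name).
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *imagePath = [documentsDirectory stringByAppendingPathComponent:@"image1.png"];
imagen.image = [UIImage imageWithContentsOfFile:imagePath];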
This all works. But now I want to implement zooming. This is my code so far:
initialDistance --> is the distance between the fingers at first touch
finalDistance --> is the distance between the fingers each time they move
x --> is 0
y --> is 0
// This method calculates the distance between two fingers
- (CGFloat)distanceBetweenTwoPoints:(CGPoint)fromPoint toPoint:(CGPoint)toPoint {
    float xPoint = toPoint.x - fromPoint.x;
    float yPoint = toPoint.y - fromPoint.y;
    return sqrt(xPoint * xPoint + yPoint * yPoint);
}
//------------------- Finger movements ------------------------------------
#pragma mark -
#pragma mark UIResponder
// First touch
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSSet *allTouches = [event allTouches];
    switch ([allTouches count]) {
        case 1: { // Single touch
            // Get the first touch.
            UITouch *touch1 = [[allTouches allObjects] objectAtIndex:0];
            switch ([touch1 tapCount]) {
                case 1: { // Single tap.
                    // Save the finger's first location when it touches down
                    //inicial = [touch1 locationInView:self.view];
                } break;
                case 2: { // Double tap.
                    // Track the initial distance between two fingers.
                    //if ([[allTouches allObjects] count] >= 2) {
                    // show or hide the top bar on a double tap
                    //[self switchToolBar];
                } break;
            }
        } break;
        case 2: { // Double touch
            // Compute the initial distance between the fingers when the touch begins
            UITouch *touch1 = [[allTouches allObjects] objectAtIndex:0];
            UITouch *touch2 = [[allTouches allObjects] objectAtIndex:1];
            initialDistance = [self distanceBetweenTwoPoints:[touch1 locationInView:[self view]]
                                                     toPoint:[touch2 locationInView:[self view]]];
        } break;
        default:
            break;
    }
}
// Called when the finger(s) move
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSSet *allTouches = [event allTouches];
    switch ([allTouches count])
    {
        case 1: {
        } break;
        case 2: {
            // The image is being zoomed in or out.
            UITouch *touch1 = [[allTouches allObjects] objectAtIndex:0];
            UITouch *touch2 = [[allTouches allObjects] objectAtIndex:1];
            // Calculate the distance between the two fingers.
            CGFloat finalDistance = [self distanceBetweenTwoPoints:[touch1 locationInView:[self view]]
                                                           toPoint:[touch2 locationInView:[self view]]];
            NSLog(@"Initial distance :: %.f, Final distance :: %.f", initialDistance, finalDistance);
            float factorX = 20.0;
            float factorY = 11.0;
            // save the positions of the two fingers
            //CGPoint dedo1 = [[[touches allObjects] objectAtIndex:0] locationInView:self.view];
            //CGPoint dedo2 = [[[touches allObjects] objectAtIndex:1] locationInView:self.view];
            // compare to see whether the user is zooming in or zooming out
            if (initialDistance < finalDistance) {
                NSLog(@"Zoom In");
                float newWidth = imagen.frame.size.width + (initialDistance - finalDistance + factorX);
                float newHeight = imagen.frame.size.height + (initialDistance - finalDistance + factorY);
                if (newWidth <= 960 && newHeight <= 640) {
                    /*
                    if (dedo1.x >= dedo2.x) {
                        x = (dedo1.x + finalDistance/2);
                        y = (dedo1.y + finalDistance/2);
                    } else {
                        x = (dedo2.x + finalDistance/2);
                        y = (dedo2.y + finalDistance/2);
                    }
                    */
                    //x = (dedo1.x);
                    //y = (dedo1.y);
                    imagen.frame = CGRectMake(x, y, newWidth, newHeight);
                } else {
                    imagen.frame = CGRectMake(x, y, 960, 640);
                }
            }
            else {
                NSLog(@"Zoom Out");
                float newWidth = imagen.frame.size.width - (finalDistance - initialDistance + factorX);
                float newHeight = imagen.frame.size.height - (finalDistance - initialDistance + factorY);
                if (newWidth >= 480 && newHeight >= 320) {
                    /*
                    if (dedo1.x >= dedo2.x) {
                        x = (dedo1.x + finalDistance/2);
                        y = (dedo1.y + finalDistance/2);
                    } else {
                        x = (dedo2.x + finalDistance/2);
                        y = (dedo2.y + finalDistance/2);
                    }
                    */
                    //x -= (finalDistance - initialDistance + factorX);
                    //y -= (finalDistance - initialDistance + factorX);
                    //x = (dedo1.x);
                    //y = (dedo1.y);
                    imagen.frame = CGRectMake(x, y, newWidth, newHeight);
                } else {
                    imagen.frame = CGRectMake(0, 0, 480, 320);
                }
            }
            initialDistance = finalDistance;
        } break;
    }
}
#pragma mark -
Thank you very much!!
PS: If there is a way with UIScrollView to move between different images, I'm open to taking a look at that too.
Ok. You can consider using a UIScrollView if only to use it for its zoom functionality.
Say we have a scrollView and an imageView, both with the same bounds. Add the imageView as a subview of the scrollView:
[scrollView addSubview:imageView];
scrollView.contentSize = imageView.frame.size;
To support only zooming and not panning in the scrollView, your viewController will have to adopt the UIScrollViewDelegate protocol.
// Disable panning/scrolling in the scrollView
scrollView.scrollEnabled = NO;
// For supporting zoom, set the delegate and the zoom scale limits
scrollView.delegate = self;
scrollView.minimumZoomScale = 0.5;
scrollView.maximumZoomScale = 2.0;
...
// Implement a single scroll view delegate method
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)aScrollView {
    return imageView;
}
By now we have zooming available. For swipes, you can use appropriately configured UISwipeGestureRecognizers. Create a gesture recognizer for each swipe direction and add it to the scroll view, for example:
UISwipeGestureRecognizer *rightSwipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleRightSwipe:)];
rightSwipe.direction = UISwipeGestureRecognizerDirectionRight;
rightSwipe.numberOfTouchesRequired = 1;
[scrollView addGestureRecognizer:rightSwipe];
[rightSwipe release];
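The other directions follow the same pattern, for example a left swipe (handleLeftSwipe: is just an example selector name):
UISwipeGestureRecognizer *leftSwipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleLeftSwipe:)];
leftSwipe.direction = UISwipeGestureRecognizerDirectionLeft;
leftSwipe.numberOfTouchesRequired = 1;
[scrollView addGestureRecognizer:leftSwipe];
[leftSwipe release];
// ... and likewise with UISwipeGestureRecognizerDirectionUp / Down.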
And retrieve the appropriate image based on the gesture and set it using imageView.image = yourImage;.
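A handler could then look something like the sketch below; currentIndex, documentsDirectory, and the file-naming scheme are only placeholders for however you track and store your images:
- (void)handleRightSwipe:(UISwipeGestureRecognizer *)gesture {
    // Placeholder logic: advance to the next stored image on a right swipe.
    currentIndex++;   // hypothetical ivar tracking which image is currently shown
    NSString *fileName = [NSString stringWithFormat:@"image%d.png", currentIndex];
    NSString *path = [documentsDirectory stringByAppendingPathComponent:fileName]; // hypothetical ivar
    imageView.image = [UIImage imageWithContentsOfFile:path];
    // Reset any zoom the scroll view applied before showing the new image.
    scrollView.zoomScale = 1.0;
}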