How to move a UIImageView after applying CGAffineTransformRotate to it?

I'm building an iPhone app. I have a UIView that contains a set of UIImageView subclass objects. The user can drag and rotate the image views via touch. I'm having trouble moving the image view after it has been rotated.

To rotate an image view, I apply a transform rotation, which works fine. It looks like this:

CGAffineTransform trans = self.transform;
self.transform = CGAffineTransformRotate(trans, delta);

The problem comes later when the user tries to move the element via touch. In touchesBegan:withEvent:, I save the start point in an instance variable, startLocation:

- (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event
{   
    // Retrieve the touch point
    CGPoint pt = [[touches anyObject] locationInView:self];
    startLocation = pt;
}

In touchesMoved:withEvent:, I had the following code, which works well enough if there is no rotation transform on the image view:

- (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event
{
    CGPoint pt = [[touches anyObject] locationInView:self];
    CGFloat dx = pt.x - startLocation.x;
    CGFloat dy = pt.y - startLocation.y;
    CGPoint newCenter = CGPointMake(self.center.x + dx, self.center.y + dy);
    self.center = newCenter;
}

But if there is a rotation transform on the image view, then the image view thrashes about the screen on each touchesMoved event and soon disappears. In the debugger, I observed that the value of pt became monstrous. It occurred to me that I needed to transform that point, which I did, like so:

- (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event
{
    CGPoint pt = [[touches anyObject] locationInView:self];
    if (!CGAffineTransformIsIdentity(self.transform)) {
        pt = CGPointApplyAffineTransform(pt, self.transform);
    }

    CGFloat dx = pt.x - startLocation.x;
    CGFloat dy = pt.y - startLocation.y;
    CGPoint newCenter = CGPointMake(self.center.x + dx, self.center.y + dy);
    self.center = newCenter;
}

This worked much better. I can drag the image about the screen now. But the very first movement causes the image to jolt once in one direction or another, depending on the angle of rotation in the transform and the dimensions of the image view.

How can I move the image view without having the initial jolt?

Why is it that I do not need to transform startLocation (the point I capture when touches began)?


- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)_event {
    CGPoint pt = [[touches anyObject] locationInView:self]; 
    startLocation = pt;
} 

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)_event {
    CGPoint pt = [[touches anyObject] previousLocationInView:self];
    CGFloat dx = pt.x - startLocation.x;
    CGFloat dy = pt.y - startLocation.y;
    CGPoint newCenter = CGPointMake(self.center.x + dx, self.center.y + dy);
    self.center = newCenter;
}

- (void)setRotation:(float)rotation {
    self.transform = CGAffineTransformRotate(self.transform, degreesToRadians(rotation));
}
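
Note that degreesToRadians is not a UIKit function; the code above presumably relies on a small helper, commonly defined as a macro like this (shown here as an assumption, not part of the original answer):

// Assumed helper: CGAffineTransformRotate expects its angle in radians,
// so convert a degree value before applying it.
#define degreesToRadians(x) ((x) * M_PI / 180.0)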


It seems you need to convert the coordinates from the main, non-rotated view's coordinate system into the rotated view's system.

Take a look at the UIView method "convertPoint:toView:":

convertPoint:toView:
Converts a point from the receiver’s coordinate system to that of the specified view.

- (CGPoint)convertPoint:(CGPoint)point toView:(UIView *)view

Parameters:
  point - A point specified in the local coordinate system (bounds) of the receiver.
  view - The view into whose coordinate system point is to be converted. If view is nil, this method instead converts to window base coordinates. Otherwise, both view and the receiver must belong to the same UIWindow object.

Return Value:
  The point converted to the coordinate system of view.
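
For example, a touch point obtained in the non-rotated superview could be converted into the rotated view's own coordinate system like this (a small sketch, assuming the touch is being handled inside the view that is moved):

UITouch *touch = [touches anyObject];
// Touch location in the non-rotated superview's coordinate system...
CGPoint ptInSuper = [touch locationInView:self.superview];
// ...converted into the rotated view's own (bounds) coordinate system.
CGPoint ptInSelf = [self.superview convertPoint:ptInSuper toView:self];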

Update:

In response to the comments:

You have to get the "real" coordinates of the finger touch in the non-rotated system. As the finger moves, you always have new coordinates in the main, non-rotated view, and those are the points you have to convert into the coordinate system of the rotated view that is the parent of the view you are moving.

  • If A is a main view, 320x480 pixels,
  • and B is a 320x480-pixel subview centered in A,
  • and C is a subview of B at position 170,240 (+10,+0 from the screen center),
  • and you rotate B by 90 degrees clockwise,
  • then C is still at 170,240 in B.

But you see it at 160,250 on the screen,

and if the user now wants to move it +20 to the right, the finger moves +20 in screen coordinates, not in B's coordinates. The user expects to see it at 180,250 on the screen, which means you need to convert that point into B's coordinate system...

So it's simpler than it sounds: get the screen coordinates of the finger as the user moves it, and convert them into the rotated view's (B's) coordinates...
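
In practice, one common way to sidestep the manual point transformation is to compute the drag delta entirely in the superview's (non-rotated) coordinate system, since center is itself expressed in that system. A minimal sketch, assuming the touch handling lives in the draggable view:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Both points are measured in the superview's coordinate system,
    // so the view's own rotation transform does not distort the delta.
    CGPoint current  = [touch locationInView:self.superview];
    CGPoint previous = [touch previousLocationInView:self.superview];
    self.center = CGPointMake(self.center.x + (current.x - previous.x),
                              self.center.y + (current.y - previous.y));
}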
