SwiftUI scale path when Image size changes - swiftui

In my project I'm drawing a box over an image using a custom drawing gesture:
Image(uiImage: image!)
    .resizable()
    .scaledToFit()
    .onTouch(type: .all, limitToBounds: true, perform: updateLocation) // custom touch
    .overlay(
        ForEach(paths) { container in
            // draw the bounding box
            container.path
                .stroke(Color.red, lineWidth: 4)
        }
    )
My issue is the following:
When the image changes its dimensions to fit the view, I want to scale the bounding box I drew up or down so that it keeps the same proportions.
Before the image scale changes:
After the image scales up:
As you can see in the second screenshot, when the image frame becomes bigger, the path drawn on it changes both location and dimensions.
How can I solve this issue?
I tried the following code, which runs when the picture frame dimensions change:
.onPreferenceChange(ViewRectKey.self) { rects in
    pictureFrame = rects.first
    // temporary code to apply the scale
    guard let path = paths.last else { return }
    paths.removeAll()
    print("------------")
    let pa = path.path.applying(CGAffineTransform(scaleX: 5, y: 5)) // 5 is just for testing; the correct scale factor is needed here
    let cont = PathContainer(id: UUID(), path: pa)
    paths.append(cont)
}
But I don't know, first, how to calculate the scale factor and, second, how to keep the position where I initially drew my path.
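For illustration only, here is a minimal sketch of one way to compute that scale factor, assuming pictureFrame is an optional CGRect holding the previous image frame and that the stored paths are expressed in the image view's coordinate space (ViewRectKey and PathContainer are the types from the code above):
.onPreferenceChange(ViewRectKey.self) { rects in
    guard let newFrame = rects.first else { return }
    let oldFrame = pictureFrame ?? newFrame
    pictureFrame = newFrame

    // Ratio between the new frame and the old one.
    let scaleX = newFrame.width / oldFrame.width
    let scaleY = newFrame.height / oldFrame.height

    // Scaling about the origin rescales both the position and the size of every
    // path, because the paths live in the image view's coordinate space.
    paths = paths.map { container in
        let scaled = container.path.applying(CGAffineTransform(scaleX: scaleX, y: scaleY))
        return PathContainer(id: container.id, path: scaled)
    }
}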

Related

Problems with CIImageAccumulator from MTKView texture

I want capture the output of an MTKView via the view's texture into a CIImageAccumulator to achieve a gradual painting build up effect. The problem is that the accumulator seems to be messing with the color/alpha/colorspace of the original, as shown below:
From the image above, the way I capture the darker-looking brushstroke is via the view's currentDrawable.texture property:
lastSubStrokeCIImage = CIImage(mtlTexture: self.currentDrawable!.texture, options: nil)!.oriented(CGImagePropertyOrientation.downMirrored)
subStrokeUIView.image = UIImage(ciImage: lastSubStrokeCIImage)
Now, once I take the same image and pipe it into a CIImageAccumulator for later processing (I only do this once per drawing segment), I get the brighter-looking result shown in the upper portion of the attachment:
lazy var ciSubCurveAccumulator: CIImageAccumulator = {
    [unowned self] in
    return CIImageAccumulator(extent: CGRect(x: 0,
                                             y: 0,
                                             width: self.frame.width * self.contentScaleFactor,
                                             height: self.frame.height * self.window!.screen.scale),
                              format: kCIFormatBGRA8)
}()!

ciSubCurveAccumulator.setImage(lastSubStrokeCIImage)
strokeUIView.image = UIImage(ciImage: ciSubCurveAccumulator.image())
I have tried using a variety of kCIFormats in the CIImageAccumulator definition, all to no avail. What is the CIImageAccumulator doing to mess with the original, and how can I fix it? Note that I intend to use ciSubCurveAccumulator to gradually build up a continuous brushstroke of consistent color. For simplicity of the question, I'm not showing the accumulating part. This problem is stopping me dead in my tracks.
Any suggestions would be kindly appreciated.
The problem came down to two things: one, I needed to set up the MTLRenderPipelineDescriptor() for compositeOver (source-over) compositing, and two, I needed to introduce a CIFilter to hold the intermediate composite over the accumulating CIImageAccumulator. This CIFilter also needed to be set up for CISourceOverCompositing. Below is a snippet of code that captures all of the above:
// set up the CIImageAccumulator
lazy var ciSubCurveAccumulator: CIImageAccumulator = {
    [unowned self] in
    return CIImageAccumulator(extent: CGRect(x: 0,
                                             y: 0,
                                             width: self.frame.width * self.contentScaleFactor,
                                             height: self.frame.height * self.window!.screen.scale),
                              format: kCIFormatBGRA8)
}()!

let lastSubStrokeCIImage = CIImage(mtlTexture: self.currentDrawable!.texture, options: nil)!
    .oriented(CGImagePropertyOrientation.downMirrored)

// composite the latest sub-stroke over the accumulated image
let compositeFilter: CIFilter = CIFilter(name: "CISourceOverCompositing")!
compositeFilter.setValue(lastSubStrokeCIImage, forKey: kCIInputImageKey)                    // foreground image
compositeFilter.setValue(ciSubCurveAccumulator.image(), forKey: kCIInputBackgroundImageKey) // background image

let bboxChunkSubCurvesScaledAndYFlipped = CGRect(...) // capture the part of the texture that was drawn to

// write the composite back into the accumulator, limited to the dirty bounding box
ciSubCurveAccumulator.setImage(compositeFilter.value(forKey: kCIOutputImageKey) as! CIImage,
                               dirtyRect: bboxChunkSubCurvesScaledAndYFlipped)
Wrapping the above bit of code inside draw() allows gradual painting to accumulate quite nicely. Hopefully this helps someone at some point.
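For completeness, the render pipeline part of the fix is not shown above. A minimal sketch of a source-over ("compositeOver") blend configuration on an MTLRenderPipelineDescriptor might look like the following; device, library and the shader function names are placeholders, not from the original answer:
let descriptor = MTLRenderPipelineDescriptor()
descriptor.vertexFunction = library.makeFunction(name: "vertexShader")     // placeholder shader
descriptor.fragmentFunction = library.makeFunction(name: "fragmentShader") // placeholder shader

let attachment = descriptor.colorAttachments[0]!
attachment.pixelFormat = .bgra8Unorm
attachment.isBlendingEnabled = true

// Standard source-over compositing with premultiplied alpha.
attachment.rgbBlendOperation = .add
attachment.alphaBlendOperation = .add
attachment.sourceRGBBlendFactor = .one
attachment.sourceAlphaBlendFactor = .one
attachment.destinationRGBBlendFactor = .oneMinusSourceAlpha
attachment.destinationAlphaBlendFactor = .oneMinusSourceAlpha

let pipelineState = try device.makeRenderPipelineState(descriptor: descriptor)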

extension file making image flip upside down (swift3)

The following extension file renders my image upside down. All I need to do is rotate my image by 180 degrees.
case .landscapeLeft:
    var transform: CGAffineTransform = CGAffineTransform.identity
    transform = transform.translatedBy(x: self.size.width, y: self.size.height)
    transform = transform.rotated(by: CGFloat(Double.pi / 2))

    guard let cgImage = self.cgImage,
          let colorSpace = cgImage.colorSpace,
          let context: CGContext = CGContext(data: nil,
                                             width: Int(self.size.width),
                                             height: Int(self.size.height),
                                             bitsPerComponent: cgImage.bitsPerComponent,
                                             bytesPerRow: 0,
                                             space: colorSpace,
                                             bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else { return self }

    context.concatenate(transform)
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: self.size.width, height: self.size.height))

    guard let transformed = context.makeImage() else { return self }
    return UIImage(cgImage: transformed)

return imageResult!
I have tried using 3 * Double.pi / 2, but that makes no image appear in the image view. The only value that gets an image into the image view with my code is Double.pi / 2.
Your code rotates the image about the origin of the image, which happens to be the top-left corner. You need to translate the origin to the centre of the image before applying the rotation, and then afterwards translate the origin back to the top-left corner. If you replace your transform construction with the following, it should work:
var transform: CGAffineTransform = CGAffineTransform.identity
transform = transform.translatedBy(x: self.size.width/2, y: self.size.height/2)
transform = transform.rotated(by: angle)
transform = transform.translatedBy(x: -self.size.width/2, y: -self.size.height/2)
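Put together with the drawing code from the question, a complete version of that fix might look like the sketch below; the extension and the rotated(by:) method name are illustrative, not part of the original code:
extension UIImage {
    /// Returns a copy of the image rotated by `angle` radians around its centre.
    func rotated(by angle: CGFloat) -> UIImage {
        guard let cgImage = self.cgImage,
              let colorSpace = cgImage.colorSpace,
              let context = CGContext(data: nil,
                                      width: Int(size.width),
                                      height: Int(size.height),
                                      bitsPerComponent: cgImage.bitsPerComponent,
                                      bytesPerRow: 0,
                                      space: colorSpace,
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else { return self }

        // Move the origin to the centre, rotate, then move it back.
        var transform = CGAffineTransform.identity
        transform = transform.translatedBy(x: size.width / 2, y: size.height / 2)
        transform = transform.rotated(by: angle)
        transform = transform.translatedBy(x: -size.width / 2, y: -size.height / 2)

        context.concatenate(transform)
        context.draw(cgImage, in: CGRect(x: 0, y: 0, width: size.width, height: size.height))

        guard let rotated = context.makeImage() else { return self }
        return UIImage(cgImage: rotated)
    }
}

// Usage: a 180-degree flip.
// let flipped = image.rotated(by: CGFloat(Double.pi))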

How to scale CAShapeLayer to different UIImageView

I have a UIImageView that you can tap on, and it draws a circle. I store the locations of the circles in an array of dictionaries. This allows me to "replay" the drawing of the circles. However, when the UIImageView is a different size from the original, the circles don't scale to the new UIImageView.
How can I get the circles to scale? For demonstration purposes, the top picture is the size of the UIImageView used for input and the second one is the size used for replay.
Inputting the circles:
Replaying the circles (the circles should be in the blue UIImageView):
import Foundation
import UIKit

class DrawPuck {
    func drawPuck(circle: CGPoint, circleColour: CGColor, circleSize: CGFloat, imageView: UIImageView) {
        let circleBezierPath = UIBezierPath(arcCenter: CGPoint(x: circle.x, y: circle.y),
                                            radius: CGFloat(circleSize),
                                            startAngle: CGFloat(0),
                                            endAngle: CGFloat(M_PI * 2),
                                            clockwise: true)
        let shapeLayer = CAShapeLayer()
        shapeLayer.path = circleBezierPath.cgPath
        // change the fill color
        shapeLayer.fillColor = circleColour
        // you can change the stroke color
        shapeLayer.strokeColor = UIColor.white.cgColor
        // you can change the line width
        shapeLayer.lineWidth = 0.5
        imageView.layer.addSublayer(shapeLayer)
    }
}
I was able to resolve this with CATransform3DMakeScale. As long as I keep the aspect ratio of the original image, it works great.
let width = yellowImageView.frame.width / blueImageView.frame.width
let height = yellowImageView.frame.height / blueImageView.frame.height

// CATransform3DMakeScale takes (sx, sy, sz); when the aspect ratio is preserved the two ratios are equal.
shapeLayer.transform = CATransform3DMakeScale(width, height, 1.0)
view.layer.addSublayer(shapeLayer)
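An alternative sketch, not from the answer above: instead of transforming the layer, the stored centre points and radius can be rescaled before replaying. Here storedCircles and its keys are hypothetical stand-ins for the array of dictionaries mentioned in the question, and drawPuck is an instance of the DrawPuck class shown above:
let scaleX = blueImageView.bounds.width / yellowImageView.bounds.width
let scaleY = blueImageView.bounds.height / yellowImageView.bounds.height

for entry in storedCircles { // storedCircles: [[String: CGFloat]] (hypothetical)
    guard let x = entry["x"], let y = entry["y"], let radius = entry["radius"] else { continue }
    let scaledCentre = CGPoint(x: x * scaleX, y: y * scaleY)
    drawPuck.drawPuck(circle: scaledCentre,
                      circleColour: UIColor.red.cgColor,
                      circleSize: radius * scaleX, // assumes the aspect ratio is preserved
                      imageView: blueImageView)
}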

MapView overlay is cutting off after zoom in

I am facing a weird problem with MKMapView. I have used an MKOverlayRenderer. The problem is that when I zoom out, the image shows correctly, but when I zoom in, some portions of the image are cut off. It looks like a portion of the MKMapView is coming above the overlay. Following is my overlay renderer code:
class MapOverlayRenderer: MKOverlayRenderer {
    var overlayImage: UIImage
    var plan: Plan

    init(overlay: MKOverlay, overlayImage: UIImage, plan: Plan) {
        self.overlayImage = overlayImage
        self.plan = plan
        super.init(overlay: overlay)
    }

    override func draw(_ mapRect: MKMapRect, zoomScale: MKZoomScale, in ctx: CGContext) {
        let theMapRect = overlay.boundingMapRect
        let theRect = rect(for: theMapRect)

        // Rotate around the top-left corner
        ctx.rotate(by: CGFloat(degreesToRadians(plan.bearing)))

        // Draw the image
        UIGraphicsPushContext(ctx)
        overlayImage.draw(in: theRect, blendMode: CGBlendMode.normal, alpha: 1.0)
        UIGraphicsPopContext()
    }

    func degreesToRadians(_ x: Double) -> Double {
        return M_PI * x / 180.0
    }
}
Though I don't know the actual reason, when I comment out the ctx.rotate(by:) call the problem goes away. But that's not a solution for me, because the image has to stay in position.
Please try the code below.
override func draw(_ mapRect: MKMapRect, zoomScale: MKZoomScale, in ctx: CGContext) {
    DispatchQueue.main.async {
        let theMapRect = self.overlay.boundingMapRect
        let theRect = self.rect(for: theMapRect)

        // Rotate around the top-left corner
        ctx.rotate(by: CGFloat(self.degreesToRadians(self.plan.bearing)))

        // Draw the image
        UIGraphicsPushContext(ctx)
        self.overlayImage.draw(in: theRect, blendMode: CGBlendMode.normal, alpha: 1.0)
        UIGraphicsPopContext()
    }
}
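Not from the accepted answer: since the clipping only appears once ctx.rotate(by:) is applied, a hedged alternative is to pivot the rotation around the overlay's centre instead of the context origin, keeping all drawing on the calling thread:
override func draw(_ mapRect: MKMapRect, zoomScale: MKZoomScale, in ctx: CGContext) {
    let theRect = rect(for: overlay.boundingMapRect)

    // Translate to the overlay's centre, rotate, then translate back so the
    // image rotates about its own centre and stays in position.
    ctx.translateBy(x: theRect.midX, y: theRect.midY)
    ctx.rotate(by: CGFloat(degreesToRadians(plan.bearing)))
    ctx.translateBy(x: -theRect.midX, y: -theRect.midY)

    UIGraphicsPushContext(ctx)
    overlayImage.draw(in: theRect, blendMode: .normal, alpha: 1.0)
    UIGraphicsPopContext()
}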

SceneKit snapshot crashes if I put it in trackMotion, but not otherwise - timing issue?

I have a view with a SceneKit scene in the background and an AVCaptureVideoPreviewLayer in the foreground. I want a semi-transparent version of the background to be visible over the camera. So, based on an older thread here, I added a UIImageView and put this in an action:
let background = sceneView.snapshot()
let cameraview = background.cgImage!.cropping(to: overlayView.bounds)
overlayView.image = UIImage.init(cgImage: cameraview!, scale: 1, orientation: .upMirrored)
This works. The problem is that I have to track the motion of the phone so that the grid aligns with the real world. So...
func trackMotion(motion: CMDeviceMotion?, error: Error?) {
    guard let motion = motion else { return }
    guard self.camera != nil else { return }
    self.camera.orientation = motion.gaze(atOrientation: UIApplication.shared.statusBarOrientation)

    let background = sceneView.snapshot()
    let cameraview = background.cgImage!.cropping(to: overlayView.bounds)
    overlayView.image = UIImage.init(cgImage: cameraview!, scale: 1, orientation: .upMirrored)
}
I get an EXC_BAD_ACCESS on the sceneView.snapshot() call. I note again that it works perfectly if I put the exact same code in an action. I suspect this is some sort of timing issue, because if I put a breakpoint there it does not crash; instead, the SceneKit view is all black even after I continue.
Any advice?
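One hedged guess, for illustration only: CMDeviceMotion updates are usually delivered on a background queue, so the snapshot and the UIImageView update inside trackMotion may be running off the main thread. A sketch of hopping back to the main queue before touching the views:
func trackMotion(motion: CMDeviceMotion?, error: Error?) {
    guard let motion = motion else { return }
    guard self.camera != nil else { return }
    self.camera.orientation = motion.gaze(atOrientation: UIApplication.shared.statusBarOrientation)

    // Snapshotting a SceneKit view and updating UIKit must happen on the main thread.
    DispatchQueue.main.async {
        let background = self.sceneView.snapshot()
        if let cameraview = background.cgImage?.cropping(to: self.overlayView.bounds) {
            self.overlayView.image = UIImage(cgImage: cameraview, scale: 1, orientation: .upMirrored)
        }
    }
}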