Pinch to zoom a graph chart within the chart frame only (SwiftUI)

I'd like to make my custom linear graph chart zoomable. After some research I used pinch to zoom (MagnificationGesture in SwiftUI) to do it, and now my graph can zoom in and out. But when I zoom, the graph covers the whole scene, which is not what I want: the graph should zoom only within the chart frame (a limited area of the screen). How can I solve this problem?
@State var scale: CGFloat = 1.0
let frame = CGSize(width: 350, height: 500)

public init(data: [Int], style: ChartStyle = Styles.lineChartStyleOne) {
    self.data = data
    self.style = style
}

var body: some View {
    ZStack(alignment: .center) {
        VStack(alignment: .leading) {
            Spacer()
            GeometryReader { geometry in
                Line(data: self.data, frame: .constant(geometry.frame(in: .local)), touchLocation: self.$touchLocation, showIndicator: self.$showIndicatorDot)
                    .offset(x: 15, y: 0)
                Legend(data: self.data, frame: .constant(geometry.frame(in: .local)), hideHorizontalLines: .constant(false))
                RangeLineView(data: self.data, frame: .constant(geometry.frame(in: .local)))
                RangeView(data: self.data, frame: .constant(geometry.frame(in: .local)))
            }
            .frame(width: frame.width, height: frame.height)
            .scaleEffect(scale)
            .gesture(
                MagnificationGesture()
                    .onChanged { value in
                        self.scale = value.magnitude
                    }
            )
        }
        .frame(width: self.style.chartFormSize.width, height: self.style.chartFormSize.height)
    }
}

Figured it out myself: add .clipped() right after the chart's .frame(...) modifier (note that .clipped() takes no shape argument; use .clipShape(Rectangle()) if you want to pass a shape explicitly).
Simple, but it will do that magic.
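For anyone hitting the same issue, here is a minimal self-contained sketch of the fix. ChartContent is a hypothetical placeholder standing in for the Line/Legend stack above; the modifier order matters: scale first, then fix the frame, then clip, so the scaled content cannot draw outside the chart's bounds.

```swift
import SwiftUI

// Hypothetical placeholder for the real chart content (Line, Legend, etc.).
struct ChartContent: View {
    var body: some View { Rectangle().fill(Color.blue) }
}

struct ClippedZoomChart: View {
    @State private var scale: CGFloat = 1.0

    var body: some View {
        ChartContent()
            .scaleEffect(scale)
            .frame(width: 350, height: 500)
            .clipped() // confines the zoomed drawing to the 350x500 frame
            .gesture(
                MagnificationGesture()
                    .onChanged { value in scale = value.magnitude }
            )
    }
}
```

Note that .scaleEffect only changes how the view is drawn, not its layout frame, so .clipped() placed after it clips to the original 350x500 bounds.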

Related

SwiftUI: How can I align a view to a background image

I have been trying to align a view to a background image, but I haven't been able to find a solution that works for all devices. I am targeting iPhones in landscape orientation.
In this example I want to make the red rectangle align with the iMac screen. This code gets pretty close, by using an offset. It looks good in the preview canvas, but doesn't align in the Simulator or on a device.
I tried using .position(x:y:), but that was even messier.
I found that if I crop the background so the target region is exactly centered, then it is possible, but I really hope that's not the only solution.
struct GeometryView: View {
    let backgroundImageSize = CGSize(width: 1500, height: 694)
    let frameSize = CGSize(width: 535, height: 304)

    var body: some View {
        GeometryReader { geometry in
            let widthScale = geometry.size.width / backgroundImageSize.width
            let heightScale = geometry.size.height / backgroundImageSize.height
            let scale = widthScale > heightScale ? widthScale : heightScale
            let frame = CGSize(width: frameSize.width * scale,
                               height: frameSize.height * scale)
            ZStack {
                Rectangle()
                    .frame(width: frame.width, height: frame.height)
                    .foregroundColor(.red).opacity(0.5)
                    .offset(x: 5, y: -8)
            }
            .frame(width: geometry.size.width, height: geometry.size.height)
            .background(
                Image("imac-on-desk")
                    .resizable()
                    .scaledToFill()
                    .ignoresSafeArea())
        }
    }
}
[screenshot: background image]
This would work, but only on an iPhone 12. If you use .scaledToFill on the image, the different display aspect ratios of phones will lead to different offsets. You could at least crop the background image so the white screen is exactly in the center of the image.
var body: some View {
    GeometryReader { geometry in
        ZStack {
            Image("background")
                .resizable()
                .scaledToFill()
                .ignoresSafeArea()
            Rectangle()
                .foregroundColor(.red).opacity(0.5)
                .frame(width: geometry.size.width / 2.45,
                       height: geometry.size.height / 2.1)
                .offset(x: -geometry.size.width * 0.025, y: 0)
        }
    }
}

SwiftUI MagnificationGesture() delay

I have a standard app with a view that you can scale in and out with a pinch gesture.
It works, but there is a little delay at first; it looks like a jumpy zoom. Does anybody know a solution to make it smoother?
Example code:
VStack {
    Image("image")
        .resizable()
        .scaledToFill()
        .frame(width: UIScreen.main.bounds.width, height: 200)
        .scaleEffect(scale)
        .gesture(
            MagnificationGesture()
                .updating($scale) { value, scale, _ in
                    scale = value.magnitude
                }
        )
}
To fix the "jumpy zoom", you only need to add an animation.
Maybe something like this, for example:
Image("image")
    .resizable()
    .scaledToFill()
    .frame(width: UIScreen.main.bounds.width, height: 200)
    .scaleEffect(scale)
    .gesture(
        MagnificationGesture()
            .updating($scale) { value, scale, _ in
                scale = value.magnitude
            }
    )
    .animation(Animation.easeInOut(duration: 2.0), value: scale) // animation to solve the "jumpy zoom"
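One detail worth noting: .updating($scale, ...) only compiles if scale is declared as a @GestureState, which also resets the value to its initial state when the pinch ends. A self-contained sketch (the "image" asset name is just a placeholder, and the fixed height stands in for the original UIScreen-based width):

```swift
import SwiftUI

struct ZoomableImage: View {
    // @GestureState automatically resets to 1.0 when the gesture ends,
    // which is what .updating($scale, ...) relies on.
    @GestureState private var scale: CGFloat = 1.0

    var body: some View {
        Image("image") // placeholder asset name
            .resizable()
            .scaledToFill()
            .frame(maxWidth: .infinity) // full width; original used UIScreen.main.bounds.width
            .frame(height: 200)
            .scaleEffect(scale)
            .gesture(
                MagnificationGesture()
                    .updating($scale) { value, scale, _ in
                        scale = value.magnitude
                    }
            )
            .animation(.easeInOut(duration: 0.2), value: scale)
    }
}
```

A short duration like 0.2 keeps the zoom responsive while still smoothing out the initial jump.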

How to center a graph using ScrollView - Scrolling by Pixel

I am using a ScrollView to show a Graph so that I can scroll it horizontally. The problem is that I need to center it when the view initially loads.
struct ContentView1: View {
    var body: some View {
        VStack {
            GeometryReader { geometry in
                ScrollView(.horizontal) {
                    VStack {
                        Graph()
                            .frame(width: geometry.size.width * 2, height: geometry.size.height, alignment: .center)
                    }
                }
            }
        }
        .frame(minWidth: 400, minHeight: 300, alignment: .center)
    }
}
fileprivate struct Graph: View {
    var body: some View {
        GeometryReader { geometry in
            let rect = geometry.size
            Path { path in
                path.move(to: CGPoint(x: 0, y: 0))
                path.addLine(to: CGPoint(x: rect.width, y: rect.height))
            }
            .stroke(Color.black)
            Path { path in
                path.move(to: CGPoint(x: rect.width, y: 0))
                path.addLine(to: CGPoint(x: 0, y: rect.height))
            }
            .stroke(Color.red)
        }
    }
}
ScrollViewReader provides the scrollTo function, but it only works if the ScrollView contains multiple views with their own identifiers, so it's not an option in this case.
How can I scroll the graph by pixel in order to center it programmatically?
(I think) using a ScrollView is unnecessary since you have only one view. You can use a DragGesture to move Graph() along the x-axis, which also lets you center Graph() on the x-axis when it appears.
1.
Offset Graph() by half of the screen's width. To do so, define a new @State property, set it to -(geometry.size.width / 2), and apply it with the .offset() view modifier.
2.
Use DragGesture() to move along the x-axis. You need to keep track of the last offset too.
struct ContentView: View {
    @State var offset: CGFloat = .zero
    @State var lastOffset: CGFloat = .zero

    var body: some View {
        VStack {
            GeometryReader { geometry in
                VStack {
                    Graph()
                        .frame(width: geometry.size.width * 2, height: geometry.size.height, alignment: .center)
                        .offset(x: offset)
                }
                .onAppear {
                    offset = -(geometry.size.width / 2)
                    lastOffset = offset
                }
                .gesture(
                    DragGesture()
                        .onChanged { value in
                            offset = lastOffset + value.translation.width
                        }
                        .onEnded { value in
                            lastOffset = offset
                        }
                )
            }
        }
        .frame(minWidth: 400, minHeight: 300, alignment: .center)
    }
}

SwiftUI shape fill body

I'm trying to construct a view in SwiftUI where the user can keep zooming in and out and show elements across the view. But the rectangle keeps the size of the window and scales down when zooming out instead of filling the body. The body (black) correctly fills the window.
How do you make the white rectangle fill the body when zooming out?
(This must be run in an app rather than in the preview.)
import SwiftUI

func rgb(_ count: Int) -> [Color] {
    let colors = [Color.red, Color.green, Color.blue]
    var arr: [Color] = []
    for i in 0..<count {
        arr.append(colors[i % 3])
    }
    return arr
}

struct ContentView: View {
    @State var scale: CGFloat = 1.0

    var body: some View {
        let colors = rgb(20)
        ZStack {
            Rectangle()
                .fill(Color.white)
                .frame(minWidth: 0,
                       maxWidth: .infinity,
                       minHeight: 0,
                       maxHeight: .infinity,
                       alignment: .center)
            ForEach(colors.indices.reversed(), id: \.self) { i in
                Circle()
                    .size(width: 100, height: 100)
                    .fill(colors[i])
                    .offset(x: 100.0 * CGFloat(i), y: 100.0 * CGFloat(i))
            }
        }
        .drawingGroup()
        .scaleEffect(scale)
        .gesture(MagnificationGesture()
            .onChanged { self.scale = $0 })
        .background(Color.black)
        .frame(minWidth: 0,
               maxWidth: .infinity,
               minHeight: 0,
               maxHeight: .infinity,
               alignment: .center)
    }
}
I put this in an answer to show a screenshot. The second bit of code behaves very inconsistently: I never see 20 circles. It will zoom, but then seems to get caught in some other view. The behavior is very strange and tough to explain. While the screenshot is here, I could run it 20 times and get 20 different screenshots if I zoom and/or resize the window. I am not on Apple silicon, so your first post may be hitting a bug in the Apple silicon implementation. Wouldn't be the first.
Here is a functioning example for this use case, with the rectangle removed from the ZStack:
import SwiftUI

func rgb(_ count: Int) -> [Color] {
    let colors = [Color.red, Color.green, Color.blue]
    var arr: [Color] = []
    for i in 0..<count {
        arr.append(colors[i % 3])
    }
    return arr
}

struct ContentView: View {
    @State var scale: CGFloat = 1.0
    @State var colorIndex = 0
    var bgColor: Color { rgb(3)[colorIndex % 3] }

    var body: some View {
        let colors = rgb(20)
        ZStack {
            ForEach(colors.indices.reversed(), id: \.self) { i in
                Circle()
                    .size(width: 100, height: 100)
                    .fill(colors[i])
                    .offset(x: 100.0 * CGFloat(i), y: 100.0 * CGFloat(i))
            }
        }
        .drawingGroup()
        .scaleEffect(scale)
        .background(bgColor)
        .gesture(MagnificationGesture()
            .onChanged { scale = $0 })
        .gesture(TapGesture().onEnded { colorIndex += 1 })
    }
}
However, it does not fix the problem of the shape not scaling to the body size.
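One possible workaround, not from the thread and only a sketch: keep the white fill out of the scaled content entirely and attach it as a .background after .scaleEffect. Since .scaleEffect changes only the rendering and not the layout frame, a background applied after it is laid out at full size and keeps filling the body at any zoom level, just as the black background in the original code does.

```swift
import SwiftUI

struct ZoomCanvas: View {
    @State private var scale: CGFloat = 1.0

    var body: some View {
        ZStack {
            // Same circle pattern as above, inlined for self-containment.
            ForEach(0..<20, id: \.self) { i in
                Circle()
                    .fill([Color.red, .green, .blue][i % 3])
                    .frame(width: 100, height: 100)
                    .offset(x: 100 * CGFloat(i), y: 100 * CGFloat(i))
            }
        }
        .drawingGroup()
        .scaleEffect(scale)
        .background(Color.white) // unscaled: fills the body at any zoom level
        .gesture(MagnificationGesture().onChanged { scale = $0 })
    }
}
```

The key design point is simply which side of .scaleEffect a modifier sits on: everything before it is drawn scaled, everything after it operates on the unscaled layout frame.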

How can I determine the frame size of a SwiftUI View

I have a SwiftUI view displaying a UIImage. How can I determine the frame of the displayed View?
I want to determine the color at the point on the image the user tapped. I know the size of the raw image, but can't work out how to determine the actual size of the frame being displayed.
For example, with an image 3024 x 4032, and code:
struct PhotoImage: View {
    var image: UIImage
    @State private var gest: DragGesture = DragGesture(minimumDistance: 0, coordinateSpace: .local)

    var body: some View {
        GeometryReader { geometry in
            Image(uiImage: self.image)
                .resizable()
                .frame(width: 500, height: 400, alignment: .center)
                .aspectRatio(contentMode: .fit)
                .gesture(self.gest
                    .onEnded { endGesture in
                        let frame = geometry.frame(in: CoordinateSpace.local)
                        let size = geometry.size
                        print("Frame: \(frame)")
                        print("Geometry size: \(size)")
                        print("Image size: \(self.image.size)")
                        print("Location: \(endGesture.location)")
                    })
        }
    }
}
The debug output shows the frame and geometry size as 1194x745, while the gesture location shows the image View to have dimensions 500x400.
If I don't set the frame size and use aspectFill, then the geometry size is correct. However, that's no good for my needs, as the top and bottom of the image are clipped.
Your GeometryReader is not reading the size of your image, but all the space that is available to it / that it claims.
You can ensure that the GeometryReader returns the expected 500x400 frame and geometry size by adding it to a background or overlay layer.
Here is a modified version:
struct PhotoImage: View {
    var image: UIImage
    @State private var gest: DragGesture = DragGesture(minimumDistance: 0, coordinateSpace: .local)

    var body: some View {
        Image(uiImage: self.image)
            .resizable()
            .frame(width: 500, height: 400, alignment: .center)
            .aspectRatio(contentMode: .fit)
            .overlay(
                GeometryReader { geometry in
                    Color.clear
                        .contentShape(Rectangle())
                        .gesture(
                            self.gest
                                .onEnded { endGesture in
                                    let frame = geometry.frame(in: CoordinateSpace.local)
                                    let size = geometry.size
                                    print("Frame: \(frame)")
                                    print("Geometry size: \(size)")
                                    print("Image size: \(self.image.size)")
                                    print("Location: \(endGesture.location)")
                                }
                        )
                }
            )
    }
}