Is there a way to get the view's frame when its onTapGesture executes?
.onTapGesture {
... get the view's frame....
}
I would like to tap on viewA, get its frame, and use that for custom internal logic.
Wrap it inside a GeometryReader and use the proxy:
GeometryReader { proxy in
    ...
        .onTapGesture {
            print(proxy.frame(in: .global))
        }
}
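For example, a self-contained version could look like this (a minimal sketch; the orange rectangle and the fixed frame are just stand-ins for viewA):

import SwiftUI

struct ContentView: View {
    var body: some View {
        // The Rectangle fills the GeometryReader, so the proxy's frame
        // describes the tapped view itself.
        GeometryReader { proxy in
            Rectangle()
                .fill(Color.orange)
                .onTapGesture {
                    // Frame of the tapped view in global (screen) coordinates
                    print(proxy.frame(in: .global))
                }
        }
        .frame(width: 200, height: 100)
    }
}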
I am trying to present a view with a transparent black background over the current view, so that the content of the current view is still visible underneath. I have tried working with the opacity modifier, but it is not doing the job.
This is my code:
ZStack {
    Rectangle()
        .fill(.black)
        .ignoresSafeArea()
    VZScrollViewIfNeeded {
        VStack(alignment: .leading) {
            // some code
        }
    }
}
Later, I present this view on tap of another view:
buttonActionView()
    .fullScreenCover(isPresented: $isPresenting, content: transparentView.init)
Try changing the order of your elements. Currently the Rectangle that should overlay the view is behind the ScrollView. It should look like this:
ZStack {
    VZScrollViewIfNeeded {
        VStack(alignment: .leading) {
            // some code
        }
    }
    Rectangle()
        .foregroundColor(Color.black.opacity(0.5))
        .edgesIgnoringSafeArea(.all)
}
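For reference, a ZStack renders its children back to front, so whichever view is listed last is drawn on top. A minimal sketch of that layering:

ZStack {
    Text("Underlying content")     // listed first: drawn at the back
    Color.black.opacity(0.5)       // listed last: drawn on top; the content
        .ignoresSafeArea()         // underneath still shows through
}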
I have a problem with views that have an onTapGesture and are placed inside a ScrollView.
The onTapGesture does not always react to the tap.
I need to tap very precisely on such a view.
It seems like there is a conflict with the ScrollView's drag gesture?
I've tried:
highPriorityGesture
onTapGesture
gesture(DragGesture(minimumDistance: 0).onChanged { _ in })
gesture(TapGesture().onEnded { })
The views have contentShape(Rectangle()) added to them.
It sometimes works OK and sometimes doesn't. On the simulator it works most of the time; on a physical device it is much worse.
ScrollViewReader { proxy in
    HStack(spacing: spacing) {
        ForEach(0 ..< elements.count, id: \.self) { i in
            Text(elements[i])
                .fixedSize()
                .contentShape(Rectangle())
                .onTapGesture {
                    withAnimation {
                        selectedElement = i
                    }
                }
        }
    }
}
I couldn't reproduce the behavior that you describe with that example code, but maybe you could try the following modifier in case another gesture is operating at the same time:
.simultaneousGesture(TapGesture().onEnded {
    selectedElement = i
})
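Applied to the snippet from the question, a complete sketch could look like this (the view name and the placeholder data are assumptions; selectedElement, elements and spacing mirror the properties from the question):

import SwiftUI

struct TagPicker: View {
    // Stand-ins for the properties used in the question
    let elements = ["One", "Two", "Three"]
    let spacing: CGFloat = 12
    @State private var selectedElement = 0

    var body: some View {
        ScrollView(.horizontal) {
            HStack(spacing: spacing) {
                ForEach(0 ..< elements.count, id: \.self) { i in
                    Text(elements[i])
                        .fixedSize()
                        .contentShape(Rectangle())
                        // simultaneousGesture runs the tap alongside the
                        // ScrollView's drag instead of competing with it
                        .simultaneousGesture(TapGesture().onEnded {
                            withAnimation {
                                selectedElement = i
                            }
                        })
                }
            }
        }
    }
}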
I am quite new to SwiftUI. I have created a grid view, and on tapping a cell I want to go to the next screen. But somehow I cannot manage to push to the next screen. I am doing it like this:
var body: some View {
NavigationView {
ScrollView {
LazyVGrid(columns: gridItems, spacing: 16) {
ForEach(viewModel.pokemon) { pokemon in
PokemonCell(pokemon: pokemon, viewModel: viewModel)
.onTapGesture {
NavigationLink(destination: PokemonDetailView(pokemon: pokemon)) {
Text(pokemon.name)
}
}
}
}
}
.navigationTitle("Pokedex")
}
}
Upon doing this, I get a warning stating:
Result of 'NavigationLink<Label, Destination>' initializer is unused
Can someone please guide me on how to do this?
.onTapGesture adds an action to perform when the view recognizes a tap gesture. In your case you don't need .onTapGesture: if you want to go to another view when the cell is tapped, you need to write the NavigationLink as below.
NavigationLink(destination: PokemonDetailView(pokemon: pokemon)) {
PokemonCell(pokemon: pokemon, viewModel: viewModel)
}
If you want to use .onTapGesture, another approach is to create a @State for the tapped cell's pokemon and use NavigationLink's isActive binding. When the user taps the cell, the .onTapGesture handler updates the @State and toggles isActive. You may need to add another stack (a ZStack, for example) for this.
NavigationView {
    ZStack {
        NavigationLink("", destination: PokemonDetailView(pokemon: pokemon), isActive: $isNavigationActive).hidden()
        ScrollView {
            // ...
        }
    }
}
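A fuller sketch of that second approach might look like this (PokemonCell, PokemonDetailView and the view model come from the question; the selectedPokemon state, the PokemonViewModel type name and the grid columns are assumptions for illustration):

import SwiftUI

struct PokemonGridView: View {
    @ObservedObject var viewModel: PokemonViewModel
    @State private var selectedPokemon: Pokemon?
    @State private var isNavigationActive = false

    private let gridItems = [GridItem(.adaptive(minimum: 120))]

    var body: some View {
        NavigationView {
            ZStack {
                // Hidden link that pushes when isNavigationActive becomes true
                if let pokemon = selectedPokemon {
                    NavigationLink("", destination: PokemonDetailView(pokemon: pokemon), isActive: $isNavigationActive)
                        .hidden()
                }
                ScrollView {
                    LazyVGrid(columns: gridItems, spacing: 16) {
                        ForEach(viewModel.pokemon) { pokemon in
                            PokemonCell(pokemon: pokemon, viewModel: viewModel)
                                .onTapGesture {
                                    selectedPokemon = pokemon
                                    isNavigationActive = true
                                }
                        }
                    }
                }
            }
            .navigationTitle("Pokedex")
        }
    }
}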
I am trying to use the whole iPhone screen area for my app.
I have this HStack at the top, used to create a custom toolbar.
var body: some View {
    VStack(spacing: 0) {
        MyTopbar()
        // other controls
        Spacer()
    }
    .edgesIgnoringSafeArea(.top)
}
It appears differently on new devices with a notch and on old devices without one: the notch cuts off my menu.
I can work around this by adding a spacer with a fixed frame height before MyTopbar() in the vertical stack, but that seems like an awful solution: I have to guess a height for the spacer, and I also have to detect whether the device has a notch or not.
Is there a better way?
You can think of it as layers (content that respects safe area and content that doesn't).
Something like this perhaps:
struct ContentView: View {
var body: some View {
ZStack {
Color.blue.ignoresSafeArea() // Whatever view fills the whole screen
VStack (spacing:0) {
MyTopbar()
// other controls
Spacer()
}
}
}
}
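The key point is that only the background layer ignores the safe area here; the VStack with MyTopbar() still respects it, so the bar starts just below the notch without any hard-coded height or device check.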
A possible solution is to add a clear color with the safe-area height. No complicated calculation is needed.
var body: some View {
    VStack(spacing: 0) {
        Color.clear
            .frame(height: UIApplication.shared.windows.first?.safeAreaInsets.top ?? 0)
        MyTopbar()
        // other controls
        Spacer()
    }
    .edgesIgnoringSafeArea(.top)
}
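Note that UIApplication.shared.windows is deprecated as of iOS 15, so on newer systems you would look the key window up through the connected scenes instead. A sketch of that lookup (keyWindow and topInset are just illustrative names):

// Reading the top safe-area inset without the deprecated `windows` API
let keyWindow = UIApplication.shared.connectedScenes
    .compactMap { $0 as? UIWindowScene }
    .flatMap { $0.windows }
    .first(where: { $0.isKeyWindow })
let topInset = keyWindow?.safeAreaInsets.top ?? 0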
I want to implement a SwiftUI button that accepts onTapGesture and onLongPressGesture, with an overridable handler.
I have been unable to get the native Button to work with both, so I have resorted to an HStack. The buttons use the CommandController pattern and dispatch the command from onTapGesture. The basic code was:
struct CmdButton : View {
@EnvironmentObject var appController : AppController
var command : AppCommand
init(command : AppCommand) {
self.command = command
self.onTapGesture {
self.doCommand()
}
}
var body: some View {
HStack {
Image(self.command.icon).resizable().scaledToFit().frame(width: 35)
}
.padding(6)
.cornerRadius(5)
.contentShape(Rectangle())
.animation(.default)
}
func doCommand(){
appController.dispatchCommand(command: self.command)
}
}
The goal was that CmdButton should call doCommand() from onTapGesture by default unless it was overridden by the implementer, thus:
CmdButton(command: SomeCommand)
.onTapGesture {
doSomethingFirst()
instance.doCommand()
}
.onLongPressGesture {
doLongPressAction()
}
I have two core issues.
There seems to be no way to capture the correct self context in the CmdButton implementation to add the default onTapGesture: if I add it during init I get a mutating-self error, and if I add the gesture to the HStack it cannot be overridden. I also can't see how to cast some View to a concrete type so that I could assign it directly on the var body.
When I override onTapGesture, I am unable to get the self context of the instance inside the overridden handler to call doCommand() on.
I'm aware I could pass around callbacks and attach them in the implementation, but it might pose other issues capturing context and just seems hacky for something so basic and trivial.
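For reference, the callback approach I mean would look roughly like this, reusing the AppController and AppCommand types from above (the optional onTap parameter is hypothetical, not something in my current code):

struct CmdButton: View {
    @EnvironmentObject var appController: AppController
    var command: AppCommand
    // Hypothetical override point: nil means "use the default behaviour"
    var onTap: (() -> Void)? = nil

    var body: some View {
        HStack {
            Image(command.icon).resizable().scaledToFit().frame(width: 35)
        }
        .padding(6)
        .contentShape(Rectangle())
        .onTapGesture {
            if let onTap = onTap {
                onTap()          // caller-supplied behaviour
            } else {
                doCommand()      // default behaviour
            }
        }
    }

    func doCommand() {
        appController.dispatchCommand(command: command)
    }
}

// Usage: CmdButton(command: someCommand) { doSomethingFirst() }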
EDIT
To clarify, according to the Apple docs it is valid to write:
struct MyView : View {
let img : String = "asset"
let name : String = "Some Name"
var body: some View {
Image(img).onTapGesture {
onTap()
}
}
func onTap(){
print(name)
}
}
Here the correct context is captured and the gesture handler is attached to the Image. My question is: instead of the gesture being attached in the inner implementation, how can I assign it to the outer View body so that it can be overridden?
MyView()
.onTapGesture {
doSomethingElse()
self_instance.onTap()
}
Again, I can call:
MyView().onTap()
How do I get the View instance so that I can call onTap() inside the handler?