How to customize the gesture recognizer automatically added by SwiftUI .onDrag?

Is there any way to customize the gesture recognizer used by .onDrag() in a SwiftUI View? The developer documentation states that "applying the onDrag(_:) modifier adds the appropriate gestures for drag and drop to this view" but is silent, so far as I can see, on how to alter the behavior of those gestures. Those gestures wait for a long press before initiating the drag; I would like to reduce that delay to zero.
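For context, a minimal sketch of the API in question: .onDrag(_:) takes only a closure that returns an NSItemProvider for the dragged data, and exposes no parameter for the underlying gesture (such as the long-press delay). The view and property names here are hypothetical.

import SwiftUI
import Foundation

// .onDrag(_:) only accepts a closure supplying the drag payload;
// there is no hook for configuring the recognizer it installs.
struct DraggableTile: View {
    let label: String   // hypothetical model value

    var body: some View {
        Text(label)
            .padding()
            .background(Color.blue.opacity(0.2))
            .onDrag {
                NSItemProvider(object: label as NSString)
            }
    }
}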
Why Required
The app currently uses a custom DragGesture and .offset(value) to implement dragging, as in the sketch below. This strategy requires that the view in which a drag starts have a higher .zIndex than any view over which an item might be dragged. Since drags can begin in different views, each view's .zIndex is managed programmatically through ternary operators.
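A rough sketch of that approach, with hypothetical state names: the item follows the DragGesture via .offset, and a ternary raises the source view's .zIndex while a drag is in flight so it draws above sibling views.

import SwiftUI

struct DraggableCard: View {
    @State private var dragOffset: CGSize = .zero
    @State private var isDragging = false

    var body: some View {
        RoundedRectangle(cornerRadius: 8)
            .fill(Color.orange)
            .frame(width: 80, height: 80)
            .offset(dragOffset)
            .gesture(
                DragGesture()
                    .onChanged { value in
                        isDragging = true
                        dragOffset = value.translation
                    }
                    .onEnded { _ in
                        isDragging = false
                        dragOffset = .zero
                    }
            )
            .zIndex(isDragging ? 1 : 0)   // ternary-managed stacking order
    }
}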
The .onDrag() functionality puts the dragged item on top of all views regardless of .zIndex. That behavior is now required because of a magnification gesture, which requires that the magnified view have a .zIndex below that of the other views or it will cover them as it expands. If the magnified view is then the source of a drag, the required .zIndex behaviors (high for drag, low for magnification) are incompatible.
I tried using .clipped() on the magnified view, but that prevents the dragged item from appearing outside of that view.

Apple developer support responds that there is no way to customize the gesture recognizer automatically added by SwiftUI's .onDrag().

Related

Declare bottom zone in a SwiftUI navigation sidebar below a list

In the screenshots below (taken from the Apple Developer app), the Account button sticks to the bottom of the sidebar.
When the window is tall enough (left), the list doesn't scroll and the Account button's background matches the sidebar. When the window is not tall enough (right) and the list scrolls, the Account button changes its background color to make that relationship visible.
The list's scroll position cannot be probed. How can I declare the Account button in SwiftUI?
That app is a UIKit Catalyst app, and its sidebar uses scrollViewDidScroll together with the scroll view's contentSize to set a bottomButtonState, which is passed into a child UIHostingController (so the Account button can be SwiftUI) that, I assume, switches between a clear and a solid background.
We can't get that scroll information in SwiftUI. A possible workaround is to add a dummy 1-pixel-high cell at the bottom of the list and use its onAppear to set a binding that a bottom view reads to enable or disable a background color; that should achieve the same effect. A sketch of this workaround follows.
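Here is one way that sketch could look, with hypothetical names: a 1-point-high sentinel row at the end of the list toggles `bottomRowVisible` via onAppear/onDisappear, and the pinned Account button draws a solid background only while list content is still hidden below it.

import SwiftUI

struct Sidebar: View {
    @State private var bottomRowVisible = true

    var body: some View {
        VStack(spacing: 0) {
            List {
                ForEach(0..<50) { i in
                    Text("Item \(i)")
                }
                // Sentinel row: on screen only when the list is scrolled to the bottom
                Color.clear
                    .frame(height: 1)
                    .onAppear { bottomRowVisible = true }
                    .onDisappear { bottomRowVisible = false }
            }

            Button("Account") { }
                .frame(maxWidth: .infinity)
                .padding()
                // Solid background only when there is hidden content above it
                .background(bottomRowVisible ? Color.clear : Color.gray.opacity(0.2))
        }
    }
}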

Why does a swipe gesture stop advancing paged view controller after opening and closing side menu?

My app, written in SwiftUI 2.0, is based on a UIPageViewController which displays four panels, in the common onboarding style. The user can swipe through them, or advance them using an arrow button at the bottom.
I also have a button in the upper left that opens a slide-out menu, which can be closed with a tap gesture anywhere on the interface. This is driven by changing the geometry in the SwiftUI GeometryReader in which the page controller is embedded.
All this works correctly, except that after the user closes the side menu, the swipe gesture no longer triggers the transition between pages. The active page slides with the swipe but then snaps back into place. However, the arrow button still advances the UIPageViewController to the appropriate next page.
What causes the swipe gesture to become inactive after the slide-out menu is triggered?
I've posted a small sample app isolating the issue here: https://github.com/stevepvc/Slide-and-Swipe-Concept. The code controlling the slide out menu is in "Top Content View".

SwiftUI TextField does not work after adding gesture

After adding a combined gesture to a view, a TextField inside the view no longer responded when I tapped it to change the text. I discovered this after adding a custom combined gesture, where I used a long press to start things before dragging. (Note: things still worked if only a drag gesture was added; I'm not sure what is different between the two cases.)
The combined gesture:
let combined = longPressGesture.simultaneously(with: dragGesture)
The gesture was added to the view with:
.gesture(combined)
I got things to work by adding an onTapGesture {} to the TextField. I didn't have to put anything into the action. It seems like a side effect whose behavior could change in the future. I'd appreciate any comments on whether this makes sense, or on other ways to handle it.
TextField("Enter Text", text: $myText)
    .textFieldStyle(RoundedBorderTextFieldStyle())
    .onTapGesture {}
If you hit this issue with a plain drag gesture, you can set the gesture's minimumDistance; a tap on the TextField will still register so it can be edited. For example:
DragGesture(minimumDistance: 30, coordinateSpace: .global)
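Putting the pieces together, a sketch of the situation described above, with hypothetical state names: the long press combined simultaneously with a drag is attached to the container, and the empty onTapGesture on the TextField is the workaround that lets the field become active again when tapped.

import SwiftUI

struct GestureWithTextField: View {
    @State private var myText = ""
    @State private var offset: CGSize = .zero

    var dragGesture: some Gesture {
        DragGesture(minimumDistance: 30, coordinateSpace: .global)
            .onChanged { value in offset = value.translation }
            .onEnded { _ in offset = .zero }
    }

    var longPressGesture: some Gesture {
        LongPressGesture(minimumDuration: 0.3)
    }

    var body: some View {
        VStack {
            TextField("Enter Text", text: $myText)
                .textFieldStyle(RoundedBorderTextFieldStyle())
                .onTapGesture {}   // workaround: taps reach the text field again

            RoundedRectangle(cornerRadius: 8)
                .fill(Color.blue.opacity(0.3))
                .frame(width: 120, height: 120)
                .offset(offset)
        }
        .gesture(longPressGesture.simultaneously(with: dragGesture))
    }
}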

Adding a drag gesture in SwiftUI to a View inside a ScrollView blocks the scrolling

SwiftUI fullscreen horizontal swipe with dot indicator

Many apps have an intro view with fullscreen pages and a dot indicator at the bottom. Sometimes it is used to gather some basic information, sometimes to introduce the app's features.
How can I achieve that?
I tried a ScrollView with a horizontal axis. The first issue is making the content fit the screen and having the pages snap on scroll. The second issue is having a dot indicator that highlights the current page.
You need to wrap a UIPageViewController with UIViewControllerRepresentable and a UIPageControl (which is a plain UIView) with UIViewRepresentable.
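A sketch of the dot-indicator half, assuming the pages themselves are provided elsewhere (for example a wrapped UIPageViewController or a custom paging ScrollView); the type and property names are placeholders. The coordinator feeds taps on the dots back into the `currentPage` binding.

import SwiftUI
import UIKit

// UIPageControl is a UIView, so it is wrapped with UIViewRepresentable.
struct PageControl: UIViewRepresentable {
    var numberOfPages: Int
    @Binding var currentPage: Int

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    func makeUIView(context: Context) -> UIPageControl {
        let control = UIPageControl()
        control.numberOfPages = numberOfPages
        control.addTarget(context.coordinator,
                          action: #selector(Coordinator.updateCurrentPage(sender:)),
                          for: .valueChanged)
        return control
    }

    func updateUIView(_ uiView: UIPageControl, context: Context) {
        // Keep the highlighted dot in sync with the SwiftUI state
        uiView.currentPage = currentPage
    }

    class Coordinator: NSObject {
        var control: PageControl

        init(_ control: PageControl) {
            self.control = control
        }

        @objc func updateCurrentPage(sender: UIPageControl) {
            // Propagate dot taps back to the binding
            control.currentPage = sender.currentPage
        }
    }
}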

How to prevent QScroller gesture from interfering with QGraphicsView drag mode?

I am developing a touchscreen-compatible app that has some widgets containing QScrollAreas. I am using
QScroller::grabGesture(ui->scrollArea->viewport(), QScroller::LeftMouseButtonGesture);
to allow the user to easily scroll these widgets by swiping.
However, some of the scroll areas contain subclassed QGraphicsViews. I am adding QGraphicsItems to these and would like the user to be able to select items using rubberbanding. I have set the drag mode using
setDragMode(QGraphicsView::RubberBandDrag)
This works as desired if I don't also use grabGesture on the scroll area containing the view.
However, grabbing the gesture for the swipe scrolling interferes with the rubberbanding action of the graphics view.
How can I scroll widgets containing these views while keeping the rubberbanding functionality intact? Essentially, I want the widget to scroll unless the user is swiping inside a QGraphicsView.