quickcursor's Issues

Split screen action doesn't work anymore in Android 13+

Issue description

Global system action not supported

The split screen action doesn't work or can't be selected. It worked on Android 12 and older versions.

Unfortunately, there is no solution for this issue ☹️. The global split screen action was removed in Android 13 by Google, and they won't add it back.


GLOBAL_ACTION_TOGGLE_SPLIT_SCREEN doesn't work anymore in Android 13+.
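
For reference, a minimal sketch of how an accessibility service dispatches this action, assuming it runs inside an enabled AccessibilityService; on Android 12 and older it toggles split screen, while on Android 13+ the same call no longer has any effect:

import android.accessibilityservice.AccessibilityService

// Sketch only: dispatches the global split screen action.
// On Android 13+ the action was removed, so this call no longer does anything.
fun toggleSplitScreen(service: AccessibilityService): Boolean {
    return service.performGlobalAction(
        AccessibilityService.GLOBAL_ACTION_TOGGLE_SPLIT_SCREEN
    )
}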

Tracker move delay - it doesn't move in real time with the finger

Issue description

The issue can be seen in this video:
tracker_move_delay_issue.mp4

This was reported by a user via email, and I haven't been able to reproduce the issue on my devices so far.

If you have this issue, please reply with your device model and Android version. I'm trying to gather more info to understand when and why it happens, so I can reproduce it myself and try to fix it.

Thanks!

Change brightness action doesn't work correctly on Samsung devices with Adaptive brightness turned on

Issue description

The "Change brightness" action doesn't work correctly on Samsung devices with Adaptive brightness turned on.

The action changes the brightness setting to the correct value, but the actual screen brightness does not change, or changes very slowly.

This is a Samsung bug and can be reproduced with ADB:

  1. Turn on Adaptive brightness
  2. Set the screen brightness with an ADB command:
    adb shell settings put system screen_brightness 255
    adb shell settings put system screen_brightness 0
  3. The brightness setting changes, but the actual screen brightness does not change, or changes very, very slowly.

Workaround

The only workaround found so far is to disable "Adaptive brightness".

There are possible fixes that could be implemented in Quick Cursor, but all of them are hacks that are hard to support:

  1. Make Quick Cursor disable adaptive brightness first, then change the brightness, and re-enable adaptive brightness when the device is locked (see the sketch below). This hack would work, but it is hard to support and can create confusion.
  2. Make Quick Cursor change the brightness slowly and repeatedly from value X to Y, increasing/decreasing it by 1 at a time. I've implemented this, but it is a bad user experience that is hard to support (explaining to users why it works so badly).
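
A minimal sketch of the first hack, assuming the app has been granted the WRITE_SETTINGS permission; this only illustrates the idea, it is not Quick Cursor's actual implementation:

import android.content.ContentResolver
import android.provider.Settings

// Sketch of hack 1 (assumption: WRITE_SETTINGS is granted).
// Temporarily switch to manual brightness mode, write the brightness value,
// and remember to restore automatic mode later (e.g. when the device locks).
fun setBrightnessBypassingAdaptive(resolver: ContentResolver, value: Int) {
    Settings.System.putInt(
        resolver,
        Settings.System.SCREEN_BRIGHTNESS_MODE,
        Settings.System.SCREEN_BRIGHTNESS_MODE_MANUAL
    )
    Settings.System.putInt(resolver, Settings.System.SCREEN_BRIGHTNESS, value)
    // ...later, restore SCREEN_BRIGHTNESS_MODE_AUTOMATIC, e.g. on screen lock.
}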

So currently there is no acceptable solution to implement in Quick Cursor.

Keys on the side of the keyboard have a small delay with Quick Cursor

Issue was reported on Reddit here:
https://www.reddit.com/r/androidapps/comments/105hc14/which_are_the_best_apps_to_try_in_2023/j3eztmu/

and the response is also available on Reddit here:
https://www.reddit.com/r/androidapps/comments/105hc14/which_are_the_best_apps_to_try_in_2023/j3g0os0/

Issue

With QC off, I get an "instant" response, in particular on the A key and the backspace key (and probably numlock and enter as well). With QC active, there's 0.1-0.2s of lag -- in other words, hitting backspace has a noticeable (but minor) delay before activating.

Solution

TL;DR

Yes, the triggers over the keyboard add a minor delay on the letters they cover ("Q", "A", "P", "Backspace", etc). There are workarounds and solutions for this, depending on personal preference:

  • automatically raise the triggers above the keyboard when it is open
  • automatically disable the triggers when the keyboard is open
  • use the floating tracker mode to not overlap the keyboard
  • carefully adjust the triggers or keyboard layout so they don't overlap

Long answer

As you can see in this screenshot, the triggers overlap some of the keyboard keys, but this depends A LOT on many variables: keyboard layout (keyboard app, Android version, etc.), screen size, the "screen zoom" or "font size" adjusted in settings, whether the display is curved, etc.

So no two setups are identical; each user has a different trigger size that works for them, which may or may not overlap the keyboard. I tested it on many devices: a small trigger size works excellently on some devices but not so well on others, where you need to make it bigger, overlapping more of the keyboard.

So, when the app was first released 2-3 years ago, it didn't have many features, and one of the first pieces of feedback (example 1, example 2, example 3) was that users couldn't use the keyboard correctly because the triggers fired while they typed.

I was surprised because on my device I could make the triggers small enough not to overlap the keyboard yet still be usable. Then I discovered that this does not apply to all setups, so I had to implement some workarounds:

  • Keyboard detection mechanism - the first workaround implemented: the triggers can be automatically disabled or raised above the keyboard when the keyboard is open. This should be good enough for the majority of users, but it has the downside that the app can't be used, or is harder to use, while the keyboard is open, so I also implemented other workarounds.
  • Floating tracker mode - another workaround for the keyboard problem, which also has other advantages for some users.
  • Pass trigger clicks to what is behind them - especially useful for the keyboard issue, BUT also for other cases where you want to tap the screen area where the triggers are located without actually triggering the cursors.

Pass trigger clicks delay

As you discovered yourself, the "pass trigger clicks" workaround is a hack, because this can't be achieved without a hack. It is an Android limitation or a security feature.

In Android, only one app can "capture" a touch event, and that app can't change for the lifespan of the touch event, so I have two options in Quick Cursor:

  • capture any touch events (tap, long tap, swipe, etc) on triggers
  • don't capture anything on triggers

It is clear that the triggers need to listen to touch events, which means that when the user touches a trigger, that touch is completely reserved for the Quick Cursor app and can't be passed to another app.
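
To illustrate the two options, here is a hedged sketch of the standard overlay window flags involved, assuming the triggers are overlay windows added through WindowManager (the flag names are the public Android ones, not necessarily Quick Cursor's actual code):

import android.view.WindowManager

// Option 1: the trigger captures touches, so taps never reach the app below it.
val touchableParams = WindowManager.LayoutParams().apply {
    type = WindowManager.LayoutParams.TYPE_ACCESSIBILITY_OVERLAY
    flags = WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
}

// Option 2: the trigger is not touchable, so touches go to the app below,
// but then the trigger can never react to them.
val passThroughParams = WindowManager.LayoutParams().apply {
    type = WindowManager.LayoutParams.TYPE_ACCESSIBILITY_OVERLAY
    flags = WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE or
            WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE
}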

So, when a user touches the trigger (the first 0-100ms), it is not yet clear what they want to do, because the touch could turn out to be a simple tap, a swipe, or a long tap. Screenshot of some touch examples: https://i.imgur.com/9kXVcBL.png. This also depends A LOT on the touchscreen sampling rate and sensitivity; I personally experienced a huge difference in sensitivity between devices. On some devices I can't tap the screen without also triggering a 'move' event: it's always 'down', 'move', 'up'. On other devices a tap is always just 'down' and 'up'.

First delay: the user touches the trigger, and until the 'up' event (they released the screen) or a certain amount of time has passed (300ms or 500ms, to be considered a long tap rather than a simple tap), I can't decide whether they wanted to press what's under the trigger (a letter) or to grab a cursor.

Second delay: after the event has been decided (tap, long tap, swipe, etc.), which adds a 0-299ms delay depending on how fast the tap was (it can even be under 20ms; this depends on how the user taps and on the touchscreen/device "performance"), I have to pass the click through the trigger to the letter behind it. This is not possible in Android in a normal way, but since the app is an accessibility service, I can simulate a tap on the screen at the position the user clicked behind the trigger. This is a kind of hack and adds another delay (pretty hard to measure, and extremely hard to compare between devices), but it shouldn't be much.
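
A minimal sketch of that simulated pass-through tap, assuming an AccessibilityService with the canPerformGestures capability (the coordinates are the position of the original touch):

import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path

// Sketch only: simulate a short tap at (x, y) behind the trigger.
fun passClickThrough(service: AccessibilityService, x: Float, y: Float) {
    val tap = Path().apply { moveTo(x, y) }
    val gesture = GestureDescription.Builder()
        // A very short stroke at the touch position behaves like a tap.
        .addStroke(GestureDescription.StrokeDescription(tap, 0, 20))
        .build()
    // Must be dispatched only after the real touch has ended, otherwise the
    // system cancels one of the two gestures (see the limitation below).
    service.dispatchGesture(gesture, null, null)
}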

The keyboard app receives the new fake touch event after (first delay + second delay)ms and processes the touch event itself.

There is also another Android limitation: there can't be two simultaneous touch events from different input devices. Examples:

  • two simultaneous touch events from your touchscreen (touching the screen with 2 fingers)
  • two simultaneous touch events simulated from Quick Cursor
  • two simultaneous touch events from different input devices: one from the real touchscreen (your finger) + one simulated from the app ❌. This doesn't work; it is specified in the Android documentation

So I can't simulate the touch under the trigger until the real touch event has finished.

I hope it is now clear why this happens and what workarounds are available.

Of course, choosing the best workaround for this depends on what you prefer. I personally prefer to make the triggers as small as possible but still be usable AND to pass the clicks through the triggers. There are a lot of users with the triggers raised above the keyboard, and a lot with them completely disabled. It depends 😁

Change volume action doesn't work on OnePlus/Oppo/Realme

Issue description

The "Change volume" action doesn't work outside of Quick Cursor app on OnePlus/Oppo/Realme device.
This issue happens only on Oxygen OS, Color OS and Realme UI (all 3 are based on the same system).

This is currently being investigated. I don't have more details on it yet.

Chrome tab switcher changes to list. Chrome tab grouping is not working.

Issue description

Google Chrome on Android, and other Chromium-based browsers (Brave, Vivaldi, etc.), automatically change the tab switcher to a list when Quick Cursor is enabled.

Tab preview and grouping do not work when the tab switcher is rendered as a list.

Screenshot: tabs grouping with preview (before installing Quick Cursor)

Screenshot: tabs list (after installing Quick Cursor)

How to fix

Disable "Simplified view for open tabs" from Chrome accessibility settings.

Steps:

  1. Open Google Chrome
  2. Click on the 3 dots in the top right corner
  3. Choose "Settings"
  4. Choose "Accessibility"
  5. Uncheck "Simplified view for open tabs"
Steps Video
quick_cursor_disable_chrome_accessibility_tab.mp4
Steps GIF

chrome_disable_accessibility_tab_switcher

More details

Chromium (Google Chrome) has a feature that automatically detects when touch exploration or an accessibility service that can perform gestures is enabled, and it then turns on "Simplified view for open tabs" automatically.

Chromium source code

You can find the method that detects an accessibility service with the "canPerformGesture" permission, or touch exploration, on the Chromium source code website: file ui/android/java/src/org/chromium/ui/util/AccessibilityUtil.java, method isAccessibilityEnabled().
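
As a rough illustration, the check boils down to something like the following sketch built on the public AccessibilityManager API (a simplification, not the exact Chromium code):

import android.accessibilityservice.AccessibilityServiceInfo
import android.content.Context
import android.view.accessibility.AccessibilityManager

// Sketch: is touch exploration on, or is any enabled accessibility service
// able to perform gestures? Quick Cursor matches the second condition.
fun isGestureCapableServiceEnabled(context: Context): Boolean {
    val am = context.getSystemService(Context.ACCESSIBILITY_SERVICE) as AccessibilityManager
    if (am.isTouchExplorationEnabled) return true
    return am.getEnabledAccessibilityServiceList(AccessibilityServiceInfo.FEEDBACK_ALL_MASK)
        .any { it.capabilities and AccessibilityServiceInfo.CAPABILITY_CAN_PERFORM_GESTURES != 0 }
}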


The application is stopped after some time

Issue description

Some manufacturers (Xiaomi, Huawei, Samsung and others) are stopping background apps (like Quick Cursor) even when the user doesn't want this.

Solution for users

Unfortunately, there is no solution for me as a developer to fix this. I don't have enough permissions to bypass their mechanisms.

The solution is to check what tricks/hacks/whitelists/locks you have to apply specifically to your device in order to keep an app running for a long period of time.

Usually there are some tricks that apply to all manufacturers, but each manufacturer has different settings, and these can also change with each Android version.

I can't maintain a list of all the possible ways because each manufacturer, on each Android version, has a different, complex way of stopping background apps.

Huge congrats to Google Pixel and other close-to-AOSP devices that don't stop a background accessibility app.

Restricted setting - Quick Cursor accessibility service greyed out

Issue description

Screenshot 1
Screenshot 2

The Quick Cursor accessibility service is greyed out and can't be enabled. Clicking the greyed out item shows a popup:

Restricted setting
For your security, this setting is currently unavailable.
Learn more

This issue happens on Android 13+ because "ACCESS_RESTRICTED_SETTINGS" is set to "deny" or "ignore" when the app is installed from outside an app store.

More details:

Workarounds

There are two options in order to fix this:

Gestures are not replicated in real time

Issue description

Gestures are not replicated in real time. It would be much easier and nicer if gestures (swipe, drag & drop, etc.) were simulated at the same time the tracker moves.

Response

Unfortunately, this is not possible in Android right now. I would love to implement that, but Android doesn't support this feature yet.

I understand that it would be much easier to swipe or drag & drop if the gesture were mirrored in real time with the cursor, but Android doesn't support two simultaneous touches from two different sources. Examples:

  • two or more simultaneous real touches on the touchscreen ✅
  • two or more simultaneous simulated touches by Quick Cursor ✅
  • one real touch (finger on the tracker) simultaneous with one simulated touch at the cursor position by Quick Cursor ❌

This is described in Android Documentation - dispatchGesture: ⚠️ Any gestures currently in progress, whether from the user, this service, or another service, will be cancelled.

Android documentation screenshot


I tried implementing this, and the gesture is automatically cancelled if Quick Cursor simulates a touch on the screen while the user is still touching it. A user touch cancels a simulated touch in progress, and vice versa.

Hopefully this limitation will be fixed in future Android versions.

The only solution right now is to use recorded gestures and then replay them.
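
For context, a recorded gesture can be replayed afterwards as a single stroke, roughly like the following sketch (the recorded points and timing are placeholders, not Quick Cursor's actual recording format):

import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path

// Sketch: replay a previously recorded gesture as one stroke.
// Assumption: the points were captured while the user moved the tracker.
fun replayRecordedGesture(
    service: AccessibilityService,
    points: List<Pair<Float, Float>>,
    durationMs: Long
) {
    if (points.isEmpty()) return
    val path = Path().apply {
        moveTo(points.first().first, points.first().second)
        points.drop(1).forEach { (x, y) -> lineTo(x, y) }
    }
    val gesture = GestureDescription.Builder()
        .addStroke(GestureDescription.StrokeDescription(path, 0, durationMs))
        .build()
    // Replayed only after the user's real touch has ended.
    service.dispatchGesture(gesture, null, null)
}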

The click doesn't work on Samsung

Issue description

The click doesn't work on Samsung devices. The following error message appears:

Quick Cursor couldn't send the gesture because the Android Accessibility Service stopped to dispatch gestures. This is a known bug in some devices and can be triggered by other apps that use accessibility services or by some integrated ones, like the One-Handed mode on Samsung devices. To resolve this issue, disable and enable the Quick Cursor accessibility service or restart your device. If you keep having this problem for unknown reasons, please try the limited mode.

Solution

  1. Disable Samsung "One-Handed mode": Settings -> Advanced features -> One-handed mode turned off

  2. Uninstall Samsung "One Hand Operation +" app.

  3. Restart the device

Root cause

Samsung "One-Handed mode" and "One Hand Operation +" have a bug that breaks the gesture replication in Android. There is no explanation or reason why this happens, it just stops working. It looks like the 2022 December Android 13 update fixed this issue, at least on some devices that I tried. I still need more feedback if this is resolved for all their devices.

The dispatchGesture accessibility method does not work if either of the two is enabled.
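
In code, the breakage shows up as dispatchGesture either returning false or reporting a cancellation through its callback. A hedged sketch of that kind of detection (one plausible approach, not necessarily Quick Cursor's exact error handling):

import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription

// Sketch: dispatch a gesture and surface the failure to the caller.
fun dispatchWithErrorCheck(
    service: AccessibilityService,
    gesture: GestureDescription,
    onBroken: () -> Unit
) {
    val callback = object : AccessibilityService.GestureResultCallback() {
        override fun onCancelled(gestureDescription: GestureDescription?) {
            // The system cancelled the gesture instead of completing it.
            onBroken()
        }
    }
    val dispatched = service.dispatchGesture(gesture, callback, null)
    if (!dispatched) {
        // The service is not currently allowed to dispatch gestures at all.
        onBroken()
    }
}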

Permanent QuickCursor concept / suggestion

I found your app and bought the pro license. It somewhat fulfills my needs, but I was actually looking for something that Quick Cursor only partially does, and I would like to suggest a mode that fits my niche use case.

Concept

The size of everything would be adjustable, but the general idea is for Quick Cursor to be permanently "mounted" to the bottom of the phone, where your thumb can easily move around. Instead of mapping movements 1:1, it would basically become a mouse cursor, with the space QC takes up acting like a laptop trackpad. For example, if you need to reach near the bottom, you just "scroll" the mouse there and use the available tap and swipe functions to navigate.

My idea comes from The Big Phone Problem, where the concept is to remove all direct interactions with the device so it can be fully navigated one-handed. This takes away the majority of issues with one-handed use on modern phones. QC currently partially fulfills this, but, for example, if I have to tap something at the far left or right, QC kind of gives up, since you're not really moving a "mouse" but just extending your normal movements by X pixels.

Tracker Actions could be mapped to open on a long press in the "trackpad" area. Maybe there could be an option to disable the navbar completely and have all functions mapped in Tracker Actions for a more seamless experience.

I don't know much about how Android development works, so maybe there are no APIs that support what I'm asking for, but another example that somewhat showcases it is the Samsung Z Flip 4 touchpad (please mute the audio, it's terrible).

Sorry for the wall of text, looking forward to hearing how feasible this is.

"Change brightness" action doesn't set the brightness to lowest or highest value

Issue description

The "Change brightness" action doesn't work correctly because the minimum brightness or maximum brightness is not enough

The root cause

According to the Android documentation, the "screen_brightness" system setting must be an integer value between 0 and 255.

This can be tested using ADB:

adb shell settings put system screen_brightness 0
adb shell settings put system screen_brightness 1
adb shell settings put system screen_brightness 25
adb shell settings put system screen_brightness 150
adb shell settings put system screen_brightness 255

The Android documentation is simple and clear, but it is not respected: in reality, the screen_brightness range differs between Android versions and manufacturers.

After some user feedback and debugging, these are the exceptions and complications that are not specified anywhere (a minimal read/write sketch follows the list):

  • Before Android P: screen_brightness is linear
  • After Android P: screen_brightness is logarithmic
  • Changing "screen_brightness" while Auto brightness or Adaptive brightness is turned on:
    • it might work on some manufacturers
    • it might not work at all on others
    • it might work as a relative value (increase/decrease), not as an absolute value
    • it might work only in small steps with a long interval between each step (5 to 10 seconds to lower the brightness to 0)
  • on some devices the minimum value is not 0 but 1, and setting it to 0 does nothing
  • on some devices the maximum value is not 255, but 127, or 2047, or 4095, or who knows what
  • there is no official API to get the device's minimum or maximum value
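
A minimal sketch of the write path and why the range matters, assuming WRITE_SETTINGS is granted; MAX_BRIGHTNESS is a hypothetical per-device constant, since there is no official API to query the real limit:

import android.content.ContentResolver
import android.provider.Settings

// Hypothetical per-device maximum; may actually be 127, 2047, 4095, etc.
const val MAX_BRIGHTNESS = 255

// Sketch: map a 0-100 percentage to the raw screen_brightness setting.
// Note: on Android P+ the setting is interpreted on a logarithmic scale,
// so a linear percent mapping like this is only an approximation.
fun setBrightnessPercent(resolver: ContentResolver, percent: Int) {
    // Floor at 1 because some devices ignore a value of 0 entirely.
    val value = (percent.coerceIn(0, 100) * MAX_BRIGHTNESS / 100).coerceAtLeast(1)
    Settings.System.putInt(resolver, Settings.System.SCREEN_BRIGHTNESS, value)
}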

Cursor App

It tends to go wonky when you rotate to portrait mode. Moto - - Edge. It's good other than that

Triggers don't work with gesture navigation on Oxygen OS, Color OS, Realme UI (OnePlus, Oppo, Realme)

Issue description

The back gesture navigation overlaps the Quick Cursor triggers and I can't grab a cursor.

This issue happens only on Oxygen OS, Color OS and Realme UI (all 3 are based on the same system) because they have a bug in their gesture navigation system. They don't respect the Android documentation about system gesture exclusion. Screenshot

OnePlus, Oppo and Realme are the only ones that don't respect this documentation, and I can't fix the issue from the app side. I don't have enough permissions to overlap the system gesture navigation areas.
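
For reference, this is the standard way an app asks the system to keep its edge views out of the back-gesture zone (a sketch of the AOSP API available since Android 10; Oxygen OS, Color OS and Realme UI ignore this request, which is the bug described above):

import android.graphics.Rect
import android.os.Build
import android.view.View

// Sketch: request system gesture exclusion for a trigger view so the back
// gesture does not steal edge swipes over it.
fun excludeTriggerFromBackGesture(trigger: View) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
        trigger.systemGestureExclusionRects =
            listOf(Rect(0, 0, trigger.width, trigger.height))
    }
}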

There is an open bug raised in the OnePlus forum, but they haven't fixed it yet: https://community.oneplus.com/thread/1246522

Quick Cursor is not the only app that suffers from this; it is a bug in the OnePlus implementation and can't be fixed by app developers.

Workarounds

  1. Swipe vertically on the screen margin (bottom to top) to grab a cursor:
    • Demo video
    • Swiping from bottom to top will trigger the cursor
    • Swiping from right to left will trigger the back gesture
  2. Use 3 button navigation instead of gesture navigation
  3. Make the triggers wider
  4. Use the floating tracker mode (PRO users only)

Google Chrome UI changes when Quick Cursor is enabled

Issue description

Google Chrome UI changes when Quick Cursor is enabled:

  • #2 - Fix available
  • Address bar doesn't hide on page scroll
  • Tab switcher animations are disabled
  • Swipe down on the address bar to open the tab switcher is not working - this is actually a Chrome bug, because it works randomly (approx. once in ten attempts)

What happens

Chromium-based browsers (Google Chrome, Brave, Vivaldi, etc.) automatically detect when touch exploration or an accessibility service that can perform gestures (like Quick Cursor) is enabled, and automatically change the UI to adapt to accessibility needs.

It doesn't let the user decide whether they want the UI adapted for accessibility; Chrome makes this decision automatically based on what accessibility services are running.

There is no way for me as a developer to exclude Quick Cursor from that check; Chrome doesn't have any exceptions and checks for any accessibility service that can perform gestures on screen.

Chromium source code

You can find the method that detects an accessibility service with the "canPerformGesture" permission, or touch exploration, on the Chromium source code website: file ui/android/java/src/org/chromium/ui/util/AccessibilityUtil.java, method isAccessibilityEnabled().


But why don't other accessibility services have the same effect?

Because Chrome checks for any accessibility service that has permission to "simulate a touch on the screen". Not all accessibility services need or have this permission, but Quick Cursor needs it to simulate the tap at the cursor position.

Gesture recorder length

Can you make the gesture recorder longer, like before? Now it's hard to move some apps or drag something. It takes less than 3 seconds now when using the gesture recorder. Hope you can fix this.
