We discussed this today in the Needs: Decision meeting. It's an interesting proposal: while sighted users need just one extra tap, visually impaired/blind users probably need 4 extra swipes/taps. However, implementing shake monitoring adds some code complexity for a feature that users ideally shouldn't need to worry about (i.e. automatic + smart refresh). If someone prepares the code, we'd have to review & accept it, and in the longer run, test & maintain it.
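For context on the complexity point: even a minimal shake detector needs a sensor listener plus threshold and debounce logic that then has to be tuned and maintained. A rough sketch of just the detection logic (the threshold and debounce values here are illustrative guesses, not from any existing implementation; on Android this would be fed from a `SensorEventListener` on `Sensor.TYPE_ACCELEROMETER`):

```java
// Hypothetical sketch of shake-detection logic. An Android SensorManager
// listener would call onReading() with accelerometer values in m/s^2.
public class ShakeDetector {
    private final double gForceThreshold; // shake threshold, in units of g
    private final long debounceMs;        // minimum gap between shakes
    private long lastShakeAt = 0;

    public ShakeDetector(double gForceThreshold, long debounceMs) {
        this.gForceThreshold = gForceThreshold;
        this.debounceMs = debounceMs;
    }

    // Returns true when the reading crosses the threshold and the
    // debounce window has passed since the last detected shake.
    public boolean onReading(double x, double y, double z, long nowMs) {
        // Magnitude of acceleration, normalized to g (9.81 m/s^2).
        double gForce = Math.sqrt(x * x + y * y + z * z) / 9.81;
        if (gForce > gForceThreshold && nowMs - lastShakeAt > debounceMs) {
            lastShakeAt = nowMs;
            return true;
        }
        return false;
    }
}
```

And that still leaves out listener registration/unregistration tied to the app lifecycle, battery considerations, and per-device sensitivity differences, which is where most of the maintenance burden would sit.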
Also, if we were to have shake monitoring, it might be more logical to use this action for play/pause. (Or would there be another quick way for blind users to play/pause that we might not be aware of?)
@keunes, that makes a lot of sense.
> Also, if we were to have shake monitoring, it might be more logical to use this action for play/pause. (Or would there be another quick way for blind users to play/pause that we might not be aware of?)
Indeed, there is a quicker way to play/pause currently playing media using TalkBack. Double tapping with two fingers plays/pauses media and does a myriad of other things, like answering phone calls, dismissing alarms, etc. So implementing shake to play/pause would be redundant from a TalkBack user's perspective.
In this case, I think using Tasker or MacroDroid is a far better option. But I wish there was an easier way for a TalkBack user who's not a power user to refresh podcasts in AntennaPod.