[Ref][Kernel][Gestures]Triggering actions with touch gestures - Galaxy S II Original Android Development

Hi folks!
This thread explains a feature I first introduced in the Siyah kernel (available in 4.1beta5) that allows defining finger movement detection and triggering actions when certain gestures are made.
There are also apps available on the market to do it but this approach happens on the kernel level.
I welcome your feedback on any advantages and drawbacks you find.
Index
This post - feature explanation and samples
Post #2 - Configuring gestures
Post #3 - Actions
Change log
01.12.2012
Added instructions on how to use camera from the lockscreen (see post #3).
Added link to Flint2's Kernel Gesture Builder (see post #2).
Added index.
27.10.2012
Added 3 additional actions (see items 9, 10 and 11 at the end of this post): v1.2 sample script.
Fixed mDNIe negative toggle for newer JB kernels.
23.08.2012
Added action commands and explanations to the 3rd post, with all that has been identified so far.
18.08.2012
Added sample CWM file for the S3 (different coordinates) - thanks to Gokhanmoral
16.08.2012
CWM-flashable zip with ready to use examples
8 gestures
Actions: invert mDNIe; launch camera (3 different apps detected, including JB); direct dial (must edit script); toggle bluetooth; toggle WiFi; play/pause; simulate power button (save the physical button); simulate home key
fixed JB / CM10 hanging on boot when script is present
13.08.2012
Initial post
There are 2 steps required to use this feature:
1. Defining the gestures - in other words, the path that the fingers are expected to make for the gesture to be detected
2. Reacting to detected gestures
Defining gestures
The sysfs entry /sys/devices/virtual/misc/touch_gestures/gesture_patterns provides access to the gesture definitions - the hot spots for the path that each finger must travel for a gesture to be triggered.
"cat /sys/devices/virtual/misc/touch_gestures/gesture_patterns" will show you the current definitions, and some comments on the expected structure:
Code:
# Touch gestures
#
# Syntax
# <gesture_no>:<finger_no>:(x_min|x_max,y_min|y_max)
# ...
# gesture_no: 1 to 10
# finger_no : 1 to 10
# max steps per gesture and finger: 10
# Gesture 1:
...
Choosing the coordinates
Your S2 screen has the following X,Y coordinates:
Code:
+---------------+
|0,0 479,0|
| |
| |
| |
| |
| |
| |
|0,799 479,799|
+---------------+
Each hotspot is a rectangle from X1 to X2 and Y1 to Y2. For example, a hotspot for just the top half of the screen would be X between 0 and 479 and Y between 0 and 399 (~ half of 800).
A maximum of 10 gestures can be defined, each using 1 or more fingers (up to a maximum of 10, though in practice more than 4 may not be very feasible), and for each finger a maximum of 10 consecutive hotspots, which make up a path.
All gestures must be defined in one go by writing multiple lines to /sys/devices/virtual/misc/touch_gestures/gesture_patterns, in the following form:
Code:
gesture_no:finger_no:(min_x|max_x,min_y|max_y)
gesture_no:finger_no:(min_x|max_x,min_y|max_y)
... additional hotspots for the same finger, or additional fingers, or additional gestures ...
Writing to "gesture_patterns" will erase all previous definitions and replace them with what you write.
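As an illustration, a small helper function can keep these definition lines readable before writing them in one go (the `hotspot` function name is my own sketch, not part of the kernel interface):

```shell
# Hypothetical helper: format one hotspot line in the
# gesture_no:finger_no:(min_x|max_x,min_y|max_y) syntax.
hotspot() {
  printf '%s:%s:(%s|%s,%s|%s)\n' "$1" "$2" "$3" "$4" "$5" "$6"
}

# Gesture 1, finger 1: start in the top-left box, end in the top-right box.
DEFS="$(hotspot 1 1 0 150 0 150; hotspot 1 1 330 480 0 150)"
echo "$DEFS"
# To apply (this erases any previous definitions):
#   echo "$DEFS" > /sys/devices/virtual/misc/touch_gestures/gesture_patterns
```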
Some examples that can be used in practice (or define your own gestures)
1. swipe one finger near the top and another near the bottom from left to right
Code:
+----+-----------+----+
| | | |
| +-|-----------|-> |
| | | |
+----+ +----+
| |
| |
| |
| |
| |
+----+ +----+
| | | |
| +-|-----------|-> |
| | | |
+----+-----------+----+
Definition (bound to gesture 1; uses fingers 1 and 2):
Code:
1:1:(0|150,0|150)
1:1:(330|480,0|150)
1:2:(0|150,650|800)
1:2:(330|480,650|800)
2. swipe 3 fingers from near the top to near the bottom
Code:
+---------------------+
| |
| + + + |
| | | | |
+---------------------+
| | | | |
| | | | |
| | | | |
| | | | |
| | | | |
+---------------------+
| | | | |
| v v v |
| |
+---------------------+
Definition (bound to gesture 2; uses fingers 1, 2 and 3):
Code:
2:1:(0|480,0|200)2:1:(0|480,600|800)
2:2:(0|480,0|200)2:2:(0|480,600|800)
2:3:(0|480,0|200)2:3:(0|480,600|800)
3. draw a Z with one finger while another is pressed on the middle left of the screen
Code:
+----+-----------+----+
| | | |
| +--|-----------|-> |
+----+ +----+
| +--+ |
+----+ | |
| | +--+ |
| + | | |
| | +--+ |
+----+ | |
| +--+ |
+----+-+ +----+
| <-| | |
| +-|-----------|-> |
+----+-----------+----+
Definition (bound to gesture 3; uses fingers 1 and 2):
Code:
3:1:(0|150,0|150)
3:1:(330|480,0|150)
3:1:(0|150,650|800)
3:1:(330|480,650|800)
3:2:(0|150,300|500)
(notice that I mixed the way the lines are written, in order to show how you can organize the entries)
To wrap it all up, you can use the following in an init.d script - as the definitions aren't persisted across reboots - in order to define all these gestures whenever the device starts:
Code:
echo "
# Gesture 1 - swipe 1 finger near the top and one near the bottom from left to right
1:1:(0|150,0|150)
1:1:(330|480,0|150)
1:2:(0|150,650|800)
1:2:(330|480,650|800)
# Gesture 2 - swipe 3 fingers from near the top to near the bottom
2:1:(0|480,0|200)2:1:(0|480,600|800)
2:2:(0|480,0|200)2:2:(0|480,600|800)
2:3:(0|480,0|200)2:3:(0|480,600|800)
# Gesture 3 - draw a Z with one finger while another is pressed on the middle left
3:1:(0|150,0|150)
3:1:(330|480,0|150)
3:1:(0|150,650|800)
3:1:(330|480,650|800)
3:2:(0|150,300|500)
" > /sys/devices/virtual/misc/touch_gestures/gesture_patterns
There are 2 important things to keep in mind when defining gestures:
* The touches are still delivered to whatever application is active. If a certain gesture proves to interfere with your actual apps, change it to something different or use it only in certain situations;
* Whenever you press or move 2 fingers close together, at some point the screen will start detecting only one of them. For some gesture definitions this can make detection fail or only work very rarely. Enable the "Show pointer location" option in Settings / Developer options so you can track what the device detects while you're setting things up the way you want.
Triggering actions
Defining gestures won't do anything by itself. Now you need to check the /sys/devices/virtual/misc/touch_gestures/wait_for_gesture entry to see which gesture is detected and do whatever you want.
Here's an example, also to be run from an init.d script:
Code:
( while [ 1 ]
do
GESTURE=`cat /sys/devices/virtual/misc/touch_gestures/wait_for_gesture`
if [ "$GESTURE" -eq "1" ]; then
mdnie_status=`cat /sys/class/mdnie/mdnie/negative | head -n 1`
if [ "$mdnie_status" -eq "0" ]; then
echo 1 > /sys/class/mdnie/mdnie/negative
else
echo 0 > /sys/class/mdnie/mdnie/negative
fi
elif [ "$GESTURE" -eq "2" ]; then
# Start the camera app
am start --activity-exclude-from-recents com.sec.android.app.camera
elif [ "$GESTURE" -eq "3" ]; then
# Edit and uncomment the next line to automatically start a call to the target number
### EDIT ### service call phone 2 s16 "133"
fi
done ) > /dev/null 2>&1 &
What this will do is:
- for the 1st gesture, toggle mDNIe inverted / normal
- for the 2nd gesture, launch the Camera app no matter what app is active (quick, that chick is almost out of view!)
- for the 3rd gesture - after you edit and uncomment the appropriate line - a call will be established to that number (the wife is impatient, I don't even have time to enter my PIN!!!)
It loops eternally looking for the next detected gesture and triggering the appropriate action.
NOTE - this has been edited to no longer cause hangs on CM10 startup. The problem was with comments inside the script that contained chars like ' ( ) etc.; be careful when changing the script not to introduce these problems.
Reading from "wait_for_gesture" blocks until one of them is detected, and therefore no CPU is consumed nor deep sleep prevented because of the infinite loop.
On some rare occasions (e.g. multiple scripts waiting for gestures, which can be awakened at the same time although only one of them will get each gesture) the script can wake up with a value of 0, which should simply be ignored.
If no script is reading "wait_for_gesture", multiple gestures can be detected and buffered (at most one instance of each) and be sent immediately as soon as something starts reading the entry.
Doing an "echo reset > ..../wait_for_gesture" will flush that buffer so no pending gestures are reported, only future ones.
Sample script
The attached file is a CWM installable package that contains a sample script with all this and more.
It has both the definition of 8 gestures and actions to be performed for each of those.
Remember to edit and uncomment the line with the intended phone number, otherwise it won't do anything when you draw the Z on the screen.
Just flash it on your primary or your secondary ROM and you're good to go, with the behavior described below.
Gestures:
1. one finger on the top left, another on the bottom left; swipe both horizontally to the right edge
triggered action - invert mDNIe
2. swipe 3 fingers from the top of the screen to the bottom
triggered action - launch the camera app
(currently recognizes the apps from stock Sammy 4.0.*, AOKP 4.0.4 and JellyBean / CM10)
3. press one finger on the middle left of the screen; with another finger draw a Z starting on the top left edge
triggered action - immediately dial a number predefined in the script (you must edit the script to put in the number you want, or it won't do anything as-is)
WARNING: This has a nice bonus but you need to be aware of it - it will work even on a locked screen. Anyone that knows the gesture will be able to dial that destination even without knowing your PIN or Unlock Pattern. They won't however be able to press any of the other phone buttons like Contacts, etc.
4. hold one finger on the bottom right while another goes from top-left to the middle of the screen and back
triggered action - toggle Bluetooth on/off (will also vibrate for 100ms to provide feedback)
5. hold one finger on the bottom left while another goes from top-right to the middle of the screen and back
triggered action - toggle WiFi on/off (will also vibrate for 100ms to provide feedback)
6. hold one finger on the top left and another on the bottom left, move both to the middle right
triggered action - Media play / pause
7. draw an X on the screen - top-left, bottom-right, top-right, bottom-left - while holding another finger on the middle left
triggered action - Power button (to spare the physical button)
8. swipe one finger from the bottom left to the bottom right, then again bottom left (5 times)
triggered action - Home button (to spare the physical button)
9. hold one finger on the bottom left and with another swipe from the top right to top left and back to top right
triggered action - Toggle between the last 2 activities, excluding the TW Launcher (edit the script if you use another launcher)
10. hold one finger on the middle left and with another swipe top-right, bottom-right, top-right (3 times)
triggered action - force closes the current activity
11. press 3 fingers in the positions: top-left, top-right, bottom-left
triggered action - temporarily disables finger detection by the apps (or re-enables) so you can then swipe other gestures without causing effects in the apps
All other gestures automatically re-enable detection after it has been disabled by this gesture.
These gestures and actions are already an evolution over the original sample I shared, as a result of people posting their suggestions and ideas on the thread.
It's your turn now - think of what is useful to you and make sure to share it with others

Configuring gestures
Refer to [GUIDE] Defining/Creating Triggering Actions Gestures easier by janreiviardo for a great visual explanation of how to set up gesture coordinates.
Also check out Flint2's Kernel Gesture Builder app if you're not so fond of editing script files.

Actions
Here's a collection of the several types of actions that have been identified so far. They're mostly ready to use as-is, but do read the script code and edit where necessary to suit your needs.
Please refer to the previous posts for instructions on how to include this in the gesture detection loop.
For test purposes you can simply execute these from the ADB shell, but to make them part of your daily usage they must be included in your personal script.
Key presses
With these your gestures can simulate that certain keys were pressed, usually hard-keys that you may wish to avoid wearing out, or special keys that the device may even not have but the ROM can react to, depending on the ROM.
Examples: HOME, Power, Volume up/down, Media play/pause, Media stop, Media next/previous, Volume mute/unmute, Recent Apps, etc.
Script code:
Code:
input keyevent 26
This has the same effect as pressing the Power key.
For other key codes, check here for ICS or here for JB. Some examples:
3 - HOME
24/25 - Volume up/down
26 - Power
84 - Search
85 - Media play/pause
86 - Media stop
87/88 - Media next/previous
164 - Toggle volume mute
187 or 214 - Recent apps
220 - Voice search
212/213 - Brightness up/down
215 - App drawer
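As a sketch of how this plugs into the detection loop, a small function (the name and the gesture-to-key mapping are made up for this example) can translate a gesture number into one of the key codes listed above:

```shell
# Hypothetical mapping from a detected gesture number to a key code
# from the list above. Returns an empty string for unmapped gestures.
key_for_gesture() {
  case "$1" in
    1) echo 26 ;;   # Power
    2) echo 85 ;;   # Media play/pause
    3) echo 3  ;;   # HOME
    *) echo ""  ;;  # unknown gesture: no key
  esac
}

# Inside the wait_for_gesture loop you would then run:
#   CODE="$(key_for_gesture "$GESTURE")"
#   [ -n "$CODE" ] && input keyevent "$CODE"
```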
As an alternative to executing "input keyevent <code>", it is also possible to inject key press events and even choose the delay between the press and the release to simulate long presses.
Example for a HOME key long press:
Code:
sendevent /dev/input/event1 1 102 1
sendevent /dev/input/event1 0 0 0
usleep 500000
sendevent /dev/input/event1 1 102 0
sendevent /dev/input/event1 0 0 0
102 is the scan code for the HOME key and it will have a delay of 500ms between pressing and releasing.
Possible scan codes (for the physical buttons):
102 - Home
116 - Power
115 / 114 - Volume up / down
For the touchkeys "menu" and "back", instead of using event1 (gpio-keys as stated by "getevent"), send the scan codes to event7 (sec_touchkey):
139 - Menu
158 - Back
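The press/release sequence above can be wrapped in a small reusable function (a sketch of my own; the device nodes and scan codes are the ones discussed above and remain device-specific):

```shell
# Sketch: press and hold a key via its input device node and scan code.
# Usage: long_press <device> <scancode> <hold_microseconds>
long_press() {
  sendevent "$1" 1 "$2" 1   # key down
  sendevent "$1" 0 0 0      # sync event
  usleep "$3"               # hold for the requested time
  sendevent "$1" 1 "$2" 0   # key up
  sendevent "$1" 0 0 0      # sync event
}

# Examples (uncomment to use on the device):
#   long_press /dev/input/event1 102 500000   # HOME long press (gpio-keys)
#   long_press /dev/input/event7 158 50000    # short Back tap (sec_touchkey)
```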
Invoking services
There are quite a few services running on the device, which expose interfaces that can be invoked using the "service call <name> <transaction> <params>..." syntax.
How to explore existing services
To list running services:
Code:
# service list
Found 95 services:
0 sip: [android.net.sip.ISipService]
1 phoneext: [com.android.internal.telephony.ITelephonyExt]
2 phone: [com.android.internal.telephony.ITelephony]
3 iphonesubinfo: [com.android.internal.telephony.IPhoneSubInfo]
4 simphonebook: [com.android.internal.telephony.IIccPhoneBook]
5 isms: [com.android.internal.telephony.ISms]
6 voip: [android.os.IVoIPInterface]
7 FMPlayer: [com.samsung.media.fmradio.internal.IFMPlayer]
8 mini_mode_app_manager: [com.sec.android.app.minimode.manager.IMiniModeAppManager]
9 tvoutservice: [android.os.ITvoutService]
10 motion_recognition: [android.hardware.motion.IMotionRecognitionService]
11 samplingprofiler: []
...
Search sources (using for instance grepcode.com) for the interfaces, such as ITelephony in this example.
Here you can find the existing transactions for the "phone" service on 4.0.3 (in some cases it may be slightly different in JB).
If we want to invoke the TRANSACTION_call operation on this service, we'll need to indicate transaction code 2 (1+1) and check what parameters it expects. The code for that is in this line, which shows that this particular call needs a string (with the number to call).
So, in conclusion, to make the device call a certain number one only has to issue this command:
Code:
service call phone 2 s16 "123456789"
replacing the destination number with the one you want.
Note that "service call" accepts arguments of type "i32 <number>" and "s16 <string>", which can be joined together as many times as needed. For transactions expecting "long", you'll need to pass in 2 i32's to make a long value.
In some of them - such as when asking for the current state of something like bluetooth - you'll need to analyze the output (using grep, for instance) to find if the result is "00000000 00000000" vs "00000000 00000001", or some other value.
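That query-then-act pattern can be factored into a small helper (a sketch of mine, using the same grep match as the examples below; the transaction numbers vary per service and ROM):

```shell
# Sketch: succeed (return 0) when a service's boolean query prints an
# all-zero result row, i.e. the queried feature is currently off/false.
# Usage: is_off <service> <transaction> [args...]
is_off() {
  service call "$@" | grep -q "0 00000000"
}

# Example (ICS bluetooth transactions: 1=isEnabled, 3=enable, 4=disable):
#   if is_off bluetooth 1; then
#     service call bluetooth 3
#   else
#     service call bluetooth 4
#   fi
```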
Collected service calls so far
Calling a phone
Code:
service call phone 2 s16 "123456789"
(replace 123456789 by the destination number you want to call)
Toggling bluetooth enabled/disabled
This involves 3 transactions on the bluetooth service: isEnabled, enable, disable (this one changes from ICS to JB); the output of isEnabled must also be analyzed in order to know what to do next.
Code:
service call bluetooth 1 | grep "0 00000000"
if [ "$?" -eq "0" ]; then
service call bluetooth 3
else
[ "$is_jb" -eq "1" ] && service call bluetooth 5
[ "$is_jb" -ne "1" ] && service call bluetooth 4
fi
The "is_jb" variable should have been set to 1 prior to this in the case of a JB ROM (the script in the OP includes it)
Toggling data connection
Similar to bluetooth, but on the connectivity service.
Code:
service call connectivity 18 | grep "0 00000000"
if [ "$?" -eq "0" ]; then
service call connectivity 19 i32 1
else
service call connectivity 19 i32 0
fi
Toggling WiFi on/off
Similar to bluetooth, but on the wifi service.
Code:
service call wifi 14 | grep "0 00000001"
if [ "$?" -eq "0" ]; then
service call wifi 13 i32 1
else
service call wifi 13 i32 0
fi
Vibration
Transaction "vibrate" can be called in the vibrator service (it requires a long parameter, which maps to 2 i32 entries)
It is asynchronous: the command returns immediately while the device keeps vibrating for the requested duration. This is important if you wish to insert pauses between multiple vibrations; in that case you'll need to call "usleep" (for delays smaller than 1s) for the duration of the first vibration plus the non-vibration time you want, before invoking it again.
Code:
service call vibrator 2 i32 300 i32 0
usleep 600000
service call vibrator 2 i32 300 i32 0
This starts the vibration with a timeout of 300ms, pauses for 600ms (enough for it to stop and stay off for another 300ms) and vibrates a second time.
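The same idea generalizes to a pulse pattern (a sketch with a hypothetical helper name, built on the vibrator transaction shown above):

```shell
# Sketch: emit <count> vibration pulses of <on_ms> each, separated by
# <off_ms> pauses. Since the vibrator call is asynchronous, we sleep for
# the pulse duration plus the pause before firing the next pulse.
# Usage: vibe_pattern <count> <on_ms> <off_ms>
vibe_pattern() {
  i=0
  while [ "$i" -lt "$1" ]; do
    service call vibrator 2 i32 "$2" i32 0
    usleep $(( ($2 + $3) * 1000 ))
    i=$((i + 1))
  done
}

# Two 300ms buzzes separated by a 300ms pause:
#   vibe_pattern 2 300 300
```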
Expand / collapse the status bar
Not particularly useful, but the transactions "expand" and "collapse" can be called on the statusbar service.
Code:
service call statusbar 1
Code:
service call statusbar 2
Enable / disable the touch screen
This can be very useful to prevent the finger movements used for gestures from triggering side-effects in the active app, which also receives those events (moving icons on the launcher, etc.)
For the best experience, map these to a simple gesture such as pressing 2 or 3 fingers on the screen corners, preferably without movement so the active apps don't react.
For ICS:
Code:
# Disable
service call window 18 i32 0
# Enable
service call window 18 i32 1
For JB use transaction code 15 instead of 18.
Force-stopping an activity
One of the ways to do this is invoking the FORCE_STOP_PACKAGE_TRANSACTION on the activity service.
Code:
service call activity 79 s16 com.swype.android.inputmethod
This will stop the swype package if it's running.
For a more dynamic script that stops whatever app is in the foreground, the output of "dumpsys activity" can be combined:
Code:
service call activity 79 s16 `dumpsys activity top | grep '^TASK.*' | cut -d ' ' -f2`
Toggling between the last 2 applications / windows
For this, invoke MOVE_TASK_TO_FRONT_TRANSACTION on the activity service with the task id to activate and with the MOVE_TASK_NO_USER_ACTION flag.
Again, "dumpsys activity" can be used to identify the next-to-last activity which will be brought to front:
Code:
service call activity 24 i32 `dumpsys activity a | grep "Recent #1:" | grep -o -E "#[0-9]+ " | cut -c2-` i32 2
Since the launcher is just an app like any other, if the previous app was the launcher that's where you'll switch to. If you'd like to exclude it from this logic, a slightly more elaborate script is required:
Code:
dumpsys activity a | grep "Recent #1:.* com.sec.android.app.twlauncher"
if [ "$?" -eq "0" ]; then
service call activity 24 i32 `dumpsys activity a | grep "Recent #2:" | grep -o -E "#[0-9]+ " | cut -c2-` i32 2
else
service call activity 24 i32 `dumpsys activity a | grep "Recent #1:" | grep -o -E "#[0-9]+ " | cut -c2-` i32 2
fi
Basically, if the last app was ...twlauncher, switch to the one before that (#2) instead of the last (#1). You'll obviously need to edit the package name to match your launcher.
Launching applications (and other intents)
The "am" command can be used to launch applications, much like what happens when their icons are pressed in the launcher. In fact, this command is so powerful that it can be a challenge to know what to do with it.
The most usual scenario is to merely execute "am start <packagename>/<activity>".
To find out which values to pass as package and activity, launch the apps you're interested in and then execute "dumpsys activity a" to see what's running and what their associated activities are:
Code:
# dumpsys activity a
ACTIVITY MANAGER ACTIVITIES (dumpsys activity activities)
Main stack:
* TaskRecord{41d938c8 #48 A com.android.email}
numActivities=1 rootWasReset=false
...
Running activities (most recent first):
TaskRecord{41d938c8 #48 A com.android.email}
Run #3: ActivityRecord{41bbeda0 com.android.email/.activity.MessageListXL}
TaskRecord{41dfff40 #46 A com.seasmind.android.gmappmgr}
Run #2: ActivityRecord{41950668 com.seasmind.android.gmappmgr/.GmUserAppMgr}
TaskRecord{41d9a498 #2 A com.sec.android.app.twlauncher}
Run #1: ActivityRecord{41dd14c0 com.sec.android.app.twlauncher/.Launcher}
TaskRecord{41f12788 #40 A com.cooliris.media}
Run #0: ActivityRecord{4157e130 com.cooliris.media/.Gallery}
...
Recent tasks:
* Recent #0: TaskRecord{41d938c8 #48 A com.android.email}
...
intent={act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10400000 cmp=com.android.email/.activity.Welcome}
realActivity=com.android.email/.activity.Welcome
...
* Recent #1: TaskRecord{41dfff40 #46 A com.seasmind.android.gmappmgr}
...
intent={act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10200000 cmp=com.seasmind.android.gmappmgr/.GmUserAppMgr}
realActivity=com.seasmind.android.gmappmgr/.GmUserAppMgr
...
From here you can see that good candidates for the "am start" command would be "com.cooliris.media/.Gallery", "com.android.email/.activity.Welcome", etc.
Invoking the Gallery app, for instance, will output this:
Code:
# am start com.cooliris.media/.Gallery
Starting: Intent { act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] cmp=com.cooliris.media/.Gallery }
Warning: Activity not started, its current task has been brought to the front
In this case, the application was already running so it was merely brought to the foreground. Otherwise, it would have been launched.
On the script in the OP you can find the following for the gesture that launches the Camera app:
Code:
result=`am start com.sec.android.app.camera/.Camera 2>&1 | grep Error`
[ "$result" != "" ] && result=`am start com.android.camera/.Camera 2>&1 | grep Error`
[ "$result" != "" ] && result=`am start com.android.gallery3d/com.android.camera.CameraLauncher 2>&1 | grep Error`
Since different ROMs ship different camera apps, this code tries 3 different activities, moving on to the next whenever the previous one failed. This still doesn't cover all possibilities, but it works not only on stock 4.0.3 ROMs but also e.g. on some JB ROMs.
An interesting use can be obtained if you're using the Android 4.2 Camera. It supports taking pictures even from a locked phone, in a secure manner, i.e. without requiring you to unlock the phone and without allowing browsing of the gallery contents (while still letting you see the pictures taken in that session). Here's the am command:
Code:
am start -a android.media.action.STILL_IMAGE_CAMERA_SECURE
Check this post for more details.
Finally, there are some topics that you can explore on the "am" command:
1. additional intent details for "am start": instead of merely passing <package>/<activity>, there are many more options to use if you know the action to launch, the category, any data it uses, extra parameters, etc.
2. optional arguments such as "--activity-exclude-from-recents" (which prevents the task from being added to the recents list in the task manager). Just explore the available options by running "am" alone.
3. actions other than "start": force-stop, kill, broadcast, etc.
Again, the "am" command allows lots of things to be done related to Intents, but a lot of investigation is required into what intents exist, what parameters they take, etc.
In pretty much all cases the standard "am start <package>/<activity>" will be the syntax to use.
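The fallback chain from the camera example can also be written as a loop (a sketch; the `try_start` name is my own, and the activity names in the comment are the ones from the OP's script):

```shell
# Sketch: try each activity in turn with "am start", stopping at the
# first one that launches without printing an Error line.
try_start() {
  for act in "$@"; do
    if [ -z "$(am start "$act" 2>&1 | grep Error)" ]; then
      return 0   # launched (or brought to front) successfully
    fi
  done
  return 1       # none of the candidates could be started
}

#   try_start com.sec.android.app.camera/.Camera \
#             com.android.camera/.Camera \
#             com.android.gallery3d/com.android.camera.CameraLauncher
```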

Sweet jesus. Nice going man

Incredible amount of work, well done

### EDIT ### service call phone 2 s16 "133"
To call "111-111-1111", should "133" alone be replaced and the line uncommented, like below?
service call phone 2 s16 "111-111-1111"

rav4kar said:
### EDIT ### service call phone 2 s16 "133"
To call "111-111-1111", Should "133" alone be replaced and un-commented like given below
service call phone 2 s16 "111-111-1111"
Something like that, yes. I'm not sure whether you need to remove the dashes but I guess not. Just open a shell and run that command directly to see what happens.

Tungstwenty said:
Something like that, yes. I'm not sure whether you need to remove the dashes but I guess not. Just open a shell and run that command directly to see what happens.
Thanks so much, it worked as-is (diff. number though) with dashes from the shell; greatly appreciate your efforts.
service call phone 2 s16 "111-111-1111"

This is awesome. For a sample, can I ask for a pinch screen with a five finger gesture? And to really p off Apple, let me pull fingers away from the screen and use my fingers, now holding a virtual joystick, to move the gallery in three dimensions. Lol
Seriously though, interested in how you'd go about the five finger pinch.
Thank you
Edit : something like this...?
Code:
1:1:(219|259,0|40)
1:1:(219|259,359|399)
1:2:(0|40,379|420)
1:2:(199|239,379|420)
1:3:(440|480,379|420)
1:3:(240|280,379|420)
1:4:(219|259,860|800)
1:4:(219|259,400|440)
Is there a margin allowed?

Awesome feature and tutorial! Much faster than any app as it's handled at kernel level; a UI for this would be perfect though. Keep up the good work!

on my wife's phone drawing a heart on screen calls my number
Code:
4:1:(200|280,699|799)
4:1:(0|150,300|500)
4:1:(200|280,300|500)
4:1:(330|480,300|500)
4:1:(200|280,699|799)
it is actually a triangle in the lower part of the screen, but it works with almost any kind of heart figure as long as the action starts from the lowest middle part of the screen.
supercool feature to demonstrate to friends

loved your idea and creativity, you should patent this, some enterprise will want it for sure

Can installed (non-system) user apps in /data/data be linked to the application called by the gesture?
I'm trying to link the Phandroid app but it doesn't seem to want to run.
I obtained the package name (com.blau.android.phandroidnews) and am trying to run it, but nothing happens.
System apps (camera and browser) are opening OK.
One thing I've noticed: if you define, for example, a 3-fingers-down gesture from top to bottom, and also an identical gesture for 2 fingers down, then performing 3 fingers down on screen triggers both gestures (as both the 2-finger and 3-finger patterns get detected)
Thanks mate

rav4kar said:
Thanks so much, it worked as was ( diff. number though) with dashes from shell, greatly appreciate your efforts.
service call phone 2 s16 "111-111-1111"
You don't need the dashes; this will also work:
service call phone 2 s16 "1111111111"
You can also use an international dialing code, like this:
service call phone 2 s16 "+271111111111"
Sent from my Galaxy S2

great work... it's amazing

Heroeagle said:
@gokhanmoral
how can i trigger this gesture ?? how can i add the phone number in the init.d file?
Here's a script that you need to put in your init.d folder (remember, this is an example):
Code:
#!/system/bin/sh
echo "
# Gesture 1 - Heart gesture. Dial favorite number
1:1:(200|280,699|799)
1:1:(0|150,300|500)
1:1:(200|280,300|500)
1:1:(330|480,300|500)
1:1:(200|280,699|799)
" > /sys/devices/virtual/misc/touch_gestures/gesture_patterns
(while [ 1 ];
do
GESTURE=`cat /sys/devices/virtual/misc/touch_gestures/wait_for_gesture`
if [ "$GESTURE" -eq "1" ]; then
#Replace 133 with your favorite number you want to call
service call phone 2 s16 "133"
elif [ "$GESTURE" -eq "0" ]; then
sleep 2
fi
done &);

what a brilliant idea master tungstwenty.. bowing m(-_-)m
PS: if someone has a good template please share

Installed 4.1Beta6 on Galaxy S2 / 9100 and flashed this sample script to try it.
It won't get past the Siyah logo if this script is loaded... I have to wipe the init.d folder from recovery to make my phone boot.
Any help?
I just want this script badly (I want to be able to pull the notification bar down even from the locked screen).
BTW, very great work man, this script looks very promising.

Tungstwenty
Wow...what a great idea. Impressive work! :thumbup:
I am still learning and have been reading (intensively) the Siyah thread to learn as much as I can about kernels, which is how I came to this thread. I have two quick questions, to help me understand a bit better how kernels work - I apologize if the questions sound dumb.
- since this is done at the kernel level, I assume these gestures take precedence over gestures defined in a launcher or app? For instance, if I define the same gesture to trigger an action in, say, Apex Launcher, and the same gesture to trigger a different action via your kernel feature, what would happen? Would the kernel action occur and block the launcher action, or would the kernel action happen, and then the launcher action happen?
-can these actions be triggered from the lockscreen without having to unlock the phone the "normal" way?
thanks!

crypticc said:
This is awesome. For a sample can. I ask for a pinch screen with five finger gesture. On and to really p off Apple let me pull fingers away from the screen and use my fingers now holding a virtual joystick to move the gallery in three dimensions. Lol
Seriously though, interested in how you'd go about the five finger pinch.
Thank you
Edit : something like this...?
Code:
1:1:(219|259,0|40)
1:1:(219|259,359|399)
1:2:(0|40,379|420)
1:2:(199|239,379|420)
1:3:(440|480,379|420)
1:3:(240|280,379|420)
1:4:(219|259,860|800)
1:4:(219|259,400|440)
Is there a margin allowed?
You're on the right track but still not quite there.
All these lines are for the definition of gesture number 1 (first token) - good
You're defining the expected movement / hotspots of 4 fingers, not 5 as you mentioned (second token) - good if 4-finger pinch is ok instead of 5-finger.
For each of them, the first line defines a BOX where it must start (or pass through for the tracking to start) and another one where it must reach. Once all of the 4 fingers have passed from the initial to the final box - even if they then proceed moving - the gesture will fire up.
Example: the 1st finger (any of them, really) must pass through the box with X between 219 and 259 (about the middle width), and Y between 0 and 40 (top of the screen). Perhaps a box of 40px by 40px might be too small or restrictive; you'll need to test it out. Just consider that this means about 1/12 of the total width of the screen (480), and 1/20 of the height (800).
The finger that started there must then proceed to a box near the middle of the screen - same horizontal limits / tolerance, but Y now just above half (400).
2nd finger follows the same logic - from the west side to the middle.
3rd as well, from east to the middle.
The 4th is not so good - you have a typo in the minimum Y for the starting position. You meant 760|800 instead of 860|800.
Other than that it should work, although depending on your tests you might want to widen the detection boxes a bit more.
I think your final question is also answered - there is a margin allowed, which is the one you defined (in this case 40px wide and tall in every box).
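Putting the corrections together, the fixed definition could be fed to the sysfs entry like this. This is only a sketch: the coordinates assume the 480x800 screen discussed above, the finger-4 start box uses the corrected 760|800, and the actual sysfs write is commented out since it only makes sense on the device.

```shell
# Corrected 4-finger pinch (gesture 1), with the 860|800 typo fixed to 760|800.
PATTERN='1:1:(219|259,0|40)
1:1:(219|259,359|399)
1:2:(0|40,379|420)
1:2:(199|239,379|420)
1:3:(440|480,379|420)
1:3:(240|280,379|420)
1:4:(219|259,760|800)
1:4:(219|259,400|440)'
printf '%s\n' "$PATTERN"
# On the device, as root:
# printf '%s\n' "$PATTERN" > /sys/devices/virtual/misc/touch_gestures/gesture_patterns
```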

Related

[Q] IconBIT Toucan Nano hard keys remapping

Hello.
I am new to this forum, so please bear with me, thanks. I did watch the noob video.
I am building a carPC system based on Android 2.3 device IconBIT Toucan Nano. It's an ARM9 standalone box w/ HDMI and 2 USB ports, no GPS, no WiFi, no screen. Device comes rooted.
I've built a HID USB keyboard emulator based on a PIC18F2550 to use car radio keys to control the Android box, they basically map to F1-F10 keys, but I can change the setting to any HID USB scan code of course.
What I am trying to achieve is to be able to start Music player, Navigation and some other apps of my choice by pressing a single hardkey. When you drive, you don't really want to browse menus and look through Applications; you just want separate keys for Music player, Navigation, Camera, etc. I have found a keylayout file (/system/usr/keylayout/*) which maps hardkeys to keys and apps, however it only runs pre-chosen default apps (like Music.apk or VideoPlayer.apk), and no user-specified apps.
I would like e.g. to run PowerAMP instead of Music.apk when 'MUSIC' key is pressed, also I would like to run Navigation (e.g. Google Maps) when some other key (say 'CAMERA') is pressed.
How do I specify my own apps for hard keys? Some posters suggest I should use some Application->Launch menu, but my device does not have this menu. Also, another suggested way is again go to Applications, All and change defaults, but no defaults are set for music player.
Also, for some reason, when I installed the latest firmware provided by IconBIT, HOME button stopped working. I have a mapping in keylayout file like key 62 HOME # F4, but just nothing happens when F4 is pressed.
I think there's gotta be some config file responsible for the apps launched on hardkey press, but where is it?
Thanks.
Anyone please? Can't believe no-one out of 4 million users can help me?
At least, can you please point out which system file (or app) is responsible for launching applications when a hard-key is pressed? I understand, *.kl files are only map files, which map Linux keycodes to Android keycodes (e.g. 'HOME' is a key code)...
Is it compiled into the core? Can't be so! Is the launcher app responsible for this?
I have found a working solution and would like to share it with the forum readers.
Using a .sh script, it is possible to read keyboard events using getevent and then launch required app using am start command. Here is my sample script:
#!/system/bin/sh
while true # infinite loop
do
s=$(getevent -v0 -c1) # get a single event from input stream
# -v0 switch utilized to sort out unneeded garbage data
s=$(echo $s | awk '{print $4}') # sort out key scan code
case $s in # based on scan code, launch required application
0007003f) am start -n com.maxmpz.audioplayer/.PlayListActivity # launch PowerAMP
;;
00070040) am start -n ru.yandex.yandexmaps/.MapActivity # launch nav app
;;
0007003d) echo "HOME" # reserved for HOME button
;;
00070045) am start -a android.intent.action.MAIN -n com.speedsoftware.rootexplorer/.RootExplorer # launch Root Explorer
;;
0007003b) echo "CALL" # etc etc
;;
esac
sleep 1 # delay to prevent double events
done
Put the script into a skeys.sh file and chmod 0755 skeys.sh - that's it.

[Q] Samsung Galaxy S3 - Slow "input tap" from Script

Hello,
This is my first post on the forum and I am relatively new to programming on Android so please forgive my ignorance.
I own a Samsung Galaxy S3 and am trying to write a shell script that will simulate screen presses in a very rapid fashion. To do so, I stumbled across the "input tap" function from /system/bin/input and my script looks like this:
Code:
x=1
while [ $x -lt 10 ]
do
input tap 640 110
(( x++ ))
done
Anyway, the code works great in that it will tap the screen at location 640,110. However, each call to "input tap" takes almost a second to return, so my script takes around 10 seconds to complete. I was hoping to find a way to speed up the call to "input tap". I am not sure if this is a limitation of my phone. Any suggestions would be greatly appreciated.
Thanks!
You can speed it up by changing:
Code:
input tap 640 110
Into:
Code:
input tap 640 110 &
And if you must wait until all touch events have been sent before proceeding you can add this at the end of the loop:
Code:
wait
So the final result would be this:
Code:
x=1
while [ $x -lt 10 ]
do
input tap 640 110 &
(( x++ ))
done
wait
But I honestly don't understand why it takes so long for a touch event to be sent. This limitation makes it impossible to automate testing of applications that require low latency of user-based input.
Edit: I probably should have explained. When you add
Code:
&
at the end of a command, you are telling the shell to run it "in the background" so your script continues even though the tap command has not returned yet. When you type
Code:
wait
at the end, you are asking the shell to pause until all the commands that you spawned in another thread have returned.

HOWTO: Tasker Padfone S/X-easily detect phone/tablet switch-swap GMD Gesture Control

I've had my Padfone S for a few months now but never really found the time to figure this out till I woke up early this morning.
Background: the Padfone series of phones have a tablet 'dock' which acts as an external larger screen when it's plugged in. It does not appear as a dock in the Android sense of the word, so the dock state for Tasker isn't useful. The HDMI state activates for a short while when plugged into the tablet but deactivates a short while later, so it's no use either.
Earlier posts in Tasker's google group suggested using `dumpsys display` with grep, but that's quite ROM-specific, and besides, it's hard to debug without a terminal due to the sheer volume of output.
**Check Tablet mode part**
I used to just grep wm size or wm density for Override, but that had a bug.
So it turns out 'Override' simply means 'not the same as the mode I booted in'. My task failed when I rebooted as a tablet: it would give the opposite answer. The solution, after I played around a bit, is two calls to wm size/density.
The first call is `wm size | grep "Physical size: 1920x1200" | wc -l` - I save this in %tabletboot, it has the value of 1 when booted as tablet, 0 if booted as phone.
The second call is `wm size | wc -l` - I save this as %samemode, it has the value of 1 if we're in the mode we booted in (tablet/phone) and of 2 if we're not (because there's an additional Override line).
The test is simply "If %tabletboot + %samemode is even" or is equal to 2, if you prefer. This would be true if we booted as tablet AND we're still in that mode or if we booted as phone AND we're in the opposite mode.
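Outside Tasker, the same even/odd check can be sketched as one shell helper (hypothetical function name; "Physical size: 1920x1200" is the Padfone tablet resolution as above):

```shell
# Sketch of the detection logic. $1 is the output of `wm size`.
detect_mode() {
  # 1 if we booted as tablet, 0 if we booted as phone
  tabletboot=$(printf '%s\n' "$1" | grep -c "Physical size: 1920x1200")
  # 1 line if we're in the boot mode, 2 if an Override line was added
  lines=$(printf '%s\n' "$1" | wc -l)
  # even sum: booted tablet and unchanged, or booted phone and overridden
  if [ $(( (tabletboot + lines) % 2 )) -eq 0 ]; then
    echo tablet
  else
    echo phone
  fi
}
# on the device: detect_mode "$(wm size)"
```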
**GMD Gesture Control part**
The reason I wanted to do this was mainly because GMD Gesture Control could only work either in phone or tablet mode for me. Auto-detection of the touchscreen input didn't work because of course there are 2 touch screens, so for tablet mode I needed to select 'asus_dds_sis_touch' and 'HC_DEFAULT' in GMD's advanced settings while for phone mode I needed to select 'Himax-touchscreen' and 'HC_FT5X'. Of course, you may also want to have different gestures for the different screens.
First, from a Padfone perspective, you need to go to ASUS Customized Settings -> Padfone Settings -> Dynamic Display List and add GMD Gesture Control to the list not to be killed when switching modes. Tasker and any Tasker plugins which need to constantly run (such as Whatstasker) should also be here.
GMD Gesture Control puts its settings in two files (both in /data/data/com.goodmooddroid.gesturecontrol/shared_prefs) named 'GestureControl.xml' and 'gestures.xml'. You'll need to create two additional copies of these files and delete the originals. So I ended up with 'tabGestureControl.xml', 'phoneGestureControl.xml', 'tabgestures.xml', and 'phonegestures.xml'.
Now, back to Tasker. Set up a state profile based on the value of the %TABLET variable. You'll need an entry task and exit task, which are almost identical. Basically, both tasks need to do these steps in order:
1. Delete both GestureControl.xml and gestures.xml (use Root, continue after error)
2. Symbolically link the right copy of GestureControl.xml/gestures.xml
3. Kill GMD Gesture Control using root to restart it.
4. Pause GMD Gestures (use AutoShortcut), wait 1 second, and then unpause GMD Gestures
Of course, step 2 differs in the entry/exit task, one is for tablet configuration, one is for phone. The rest are identical.
Step 1 can be done using Run Shell and calling `rm /data/data/com.goodmooddroid.gesturecontrol/shared_prefs/GestureControl.xml` or using Tasker's Delete File. Remember to delete both files.
Step 2 needs Run Shell to call `ln -s /data/data/com.goodmooddroid.gesturecontrol/shared_prefs/tabGestureControl.xml /data/data/com.goodmooddroid.gesturecontrol/shared_prefs/GestureControl.xml`. Again, remember to do this for both files.
Step 4 needs autoshortcut, so download that and use it, under Plugins.
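Steps 1 and 2 could be sketched as a single shell helper (hypothetical function name; the shared_prefs path and the tab/phone file-name prefixes are the ones from this post):

```shell
# Delete the live GMD config files and symlink the copies for the given mode.
swap_gesture_profiles() {
  mode=$1    # "tab" or "phone"
  prefs=$2   # e.g. /data/data/com.goodmooddroid.gesturecontrol/shared_prefs
  for f in GestureControl.xml gestures.xml; do
    rm -f "$prefs/$f"
    ln -s "$prefs/${mode}${f}" "$prefs/$f"
  done
}
# usage (root shell on the device):
# swap_gesture_profiles tab /data/data/com.goodmooddroid.gesturecontrol/shared_prefs
```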
That's about it. The same technique can be used for anything you want to change in this awesome Padfone device. Perhaps you'd want to activate an overlay or change an app's font settings. With Tasker, no limits =).

Wearshell+Tasker Recipes (& Wearshell +Tasker+Autovoice Recipes)

It's time to post some of recipes I find useful. I drive for a living so operating my mobile gadgets HandsFree+EyesFree is how I have to do it.
[Rooted Watch + Rooted Phone Required ]
Most of these recipes incorporate the Autovoice app, which uses the Google App to grab spoken commands. HOWEVER, just switching out a variable in these recipes/tasks for your specific input will get the job done. If you're interested in Autovoice off your watch, the final pages of this thread are informative: http://forum.xda-developers.com/xposed/modules/mod-google-search-api-t2554173 ... sadly though, it looks like no-one who has Marshmallow ON THEIR PHONE has a working flow at this time.
ANNOUNCEMENT: This new 5.0 version of wearshell comes included with some better versions of some recipes so far posted here by me, especially tts. here is the thread: http://forum.xda-developers.com/android-wear/development/app-wearshell-t2902242
FROM THAT THREAD:
WearShell version 0.6.0 has just been released.
I also put together some BeanShell commands for download:
https://goo.gl/2mA1uU
These commands are included:
Wearshell Command Collection 1
brightness
filepull
filepush
killws
mediaplayer
openuri
tts
unzip
vibrate
volume
WearShell Command Collection 2:
https://goo.gl/2mA1uU
battery
md5
notify
record
showimage
startapp
sysload
toast
uptime
zip
Wearshell Commands Collection 3
WearShell 0.6.0 has just been released. It includes support for Result Intents, sensors support and bug fixes.
alarm
charging
freestorage
heartrate*
light*
stepcount
stt
timer
wifioff*
wifion*
Commands marked with * I could not test because my G Watch is lacking the sensors and WiFi support. If you find any errors please let me know, I'll try to fix them.
Hope it's useful
...................
I'll leave this thread up and keep adding recipes, because these recipes usually don't require a file to be loaded onto the watch; they just run from an intent, so I can add things to the list and others can too.
All I know is these work on my Huawei Watch with 1.4 update and Galaxy Note 3 on KK4.4.2. Please post recipes you find useful (and earn my undying gratitude) and I'll repost what I find and post ones I've made.
[Quick Summary on How These Work: Wearshell installs on phone and watch. The Wearshell app ON PHONE doesn't even need to be running to function, because it sleeps until a Tasker intent sent to it wakes it up; then Wearshell connects to its companion app ON WATCH (the companion app is tiny and uses next to zero battery) and passes code to the companion app, which then executes it. Wearshell is an awesome bridge to pass shell commands and java code via Tasker]
Thanks and hopefully enjoy.
Reboot Watch By Voice
"Ok Google Reboot"
PROFILE: "i named it watch reboot"
Event> Plugin>Autovoice>Recognized> Command: "reboot"
TASK: "i named it watch reboot"
Add Action> System>Send Intent
ACTION
de.fun2code.android.wear.shell.EXEC
EXTRA
bsh: Runtime.getRuntime().exec("su -c reboot");
LEAVE ALL OTHER FIELDS AS IS
Set Watch Brightness by Voice
"Ok Google Brightness xxx"
(xxx's are a number between 0 - 254 to set brightness level)
PROFILE: "i named it watch brightness"
Event> Plugin>Autovoice>Recognized> Command: "brightness" (then I speak a number between 0-254)
Advanced> Replacements: to =2,+=, =,:=,-=,one=1,two=2,three=3,four=4,five=5,six=6,seven=7,eight=8,nine=9,ten=10,eleven=11,twelve=12,thirteen=13,fourteen=14,fifteen=15,sixteen=16,seventeen=17,eighteen=18,nineteen=19,twenty=20
TASK: "i named it watch brightness"
Add Action> Variables> Variable Set
NAME
%spokenbrightness
TO
%avcommnofilter
Add Action>Task> If
CONDITION
%avcommnofilter > 254
Add Action> Variable> Variable Set
NAME
%spokenbrightness
TO
254
Add Action>Task>Endif
Add Action>Task> If
CONDITION
%avcommnofilter < 20
Add Action> Variable> Variable Set
NAME
%spokenbrightness
TO
20
Add Action>Task>Endif
Add Action> System>Send Intent
ACTION
de.fun2code.android.wear.shell.EXEC
EXTRA:
bsh: Runtime.getRuntime().exec("su -c settings put system screen_brightness %spokenbrightness; su -c input keyevent 26");
LEAVE ALL OTHER FIELDS AS IS
Notes: you might have noticed that the second If condition prevents the watch from going below brightness level 20. Also, the "input keyevent 26" that you see there just simulates a power button click, which clears the Google search results from the screen of my watch so that I have my watch face back when the task completes.
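The two If blocks amount to clamping the spoken number into the 20-254 range; as a plain-shell sketch (hypothetical function name, limits taken from the recipe):

```shell
# Clamp a spoken brightness value into the safe 20-254 range.
clamp_brightness() {
  v=$1
  [ "$v" -gt 254 ] && v=254   # cap at maximum brightness
  [ "$v" -lt 20 ] && v=20     # never go below 20
  echo "$v"
}
# usage on the watch (root):
# settings put system screen_brightness "$(clamp_brightness 300)"
```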
Voice Search Results in UC Mini Browser on Watch (VERY fast+smooth over Bluetooth)
"Ok Google search lions tigers and bears"
[Instructions on easy installing UC Mini Browser after the recipe]
PROFILE: "i named it watch uc mini"
Event> Plugin>Autovoice>Recognized> Command: "search"
TASK: "i named it watch uc mini"
Add Action>Variable>Variable Set
NAME
%url2wearshell
TO
"https://www.google.com/search?site=&source=hp&q=%avcommnofilter"
Add Action> System>Send Intent
ACTION
de.fun2code.android.wear.shell.EXEC
EXTRA
bsh: import android.content.Intent; import android.net.Uri; uri = %url2wearshell; intent = new Intent(Intent.ACTION_VIEW, Uri.parse(uri)); intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK); context.startActivity(intent);
LEAVE ALL OTHER FIELDS AS IS
Install
Grab "UC Mini Browser" from the Play Store on your phone and use an app like "Apk Share" to get the apk. Then, on the watch, go to developer options and enable "adb debugging" and "adb over bluetooth", and use the app "Apps2Wear" from the Play Store to sideload it (it'll take a few tries, and make sure on the watch you clicked OK to allow connections over adb). Then grab "/system/app mover root" from the Play Store and sideload it with "Apps2Wear" (again, it'll take a few tries), then run "/system/app mover root" on the watch to install "UC Mini Browser" as a system app (so that it gets permission to use the internet). You might get a weird error saying it couldn't move UC Mini, but it moved 'enough of it' (the base apk) to get this working. So reboot for good measure and it'll be ready. (BTW: because only 'enough' of UC Mini got moved to the system folder, it won't survive a factory reset like a fully installed system app and will need a reinstall.)
Usage:
On my Huawei Watch there is no back button, so I recommend you open some links from the Google results 'in the background' (long press the links), then use the tabs button on the bottom bar to switch between pages (this speeds up browsing too). Also, I recommend you go to settings in UC Mini (bottom right corner on my watch, and hard to click), go to Customize, and check "Hide Bottom Bar" - this gives you full screen without losing the bottom bar; it's great.
Text Messages Automatically Spoken On Your Watch (Watch TTS)
[ ** This recipe needs work to be good, but it's 'good enough' to allow me to keep my eyes on the road, and much better solutions are on the way I'm sure. Also, it requires a tiny Tasker app to be made and sideloaded (it's only 5 easy actions to input). ]
The tasker app we'll make in this recipe is named "Watch Speak"
THE APP FACTORY APP:
TASK: "I named it watch Speak"
Add Action>File>Read File
FILE
/sdcard/speak.txt
TO
%watchspeak
Add Action>Variables>Variable Search and Replace
VARIABLE
%watchspeak
SEARCH
€ (<--- don't type this, but see that weird symbol?)
STORE RESULTS IN
%watchspeak
REPLACE MATCHES
...check the box
REPLACE WITH
' (<--- see that apostrophe?)
Add Action>Variables>Variable Search and Replace
VARIABLE
%watchspeak
SEARCH
¥ (<--- don't type this, but see that weird symbol?)
STORE RESULTS IN
%watchspeak
REPLACE MATCHES
...check the box
REPLACE WITH
" (<--- see that quote mark?)
Add Action> Alert>Say
TEXT
%watchspeak
LEAVE ALL OTHER FIELDS AS IS, UNLESS YOU WANT TO
CHANGE SPEED AND PITCH
Add Action>Task>Stop
IN TASKER:
PROFILE: "I named it Watch Speak"
Event> Phone>Received Text
TASK: "I named it Send Watch Speak"
Add Action>Variables>Variable Set
NAME
%watchSpeak
TO
text from, %SMSRN. it says, %SMSRB
Add Action>Variables>Variable Search and Replace
VARIABLE
%watchspeak
SEARCH
' (<--- don't type this, but see that apostrophe?)
STORE RESULTS IN
%watchspeak
REPLACE MATCHES
...check the box
REPLACE WITH
€ (<--- don't type this, but see that weird symbol?)
Add Action>Variables>Variable Search and Replace
VARIABLE
%watchspeak
SEARCH
" (<---don't type this, but see that quote mark?)
STORE RESULTS IN
%watchspeak
REPLACE MATCHES
...check the box
REPLACE WITH
¥ (<--- don't type this, but see that weird symbol?)
Add Action> System>Send Intent
ACTION
de.fun2code.android.wear.shell.EXEC
EXTRA
bsh: Runtime.getRuntime().exec("su -c echo '%watchspeak' >| /sdcard/speak.txt"); Runtime.getRuntime().exec("su -c monkey -p com.gmail.paulporter1.watchspeak -c android.intent.category.LAUNCHER 1");
LEAVE ALL OTHER FIELDS AS IS
Add Action>Task>Stop
NOTES:
(a) The weird symbols ¥ and € in the search-and-replace steps are there because I do not know how to safely carry the symbols ' and " from a variable into Linux commands that are embedded in java code: ' and " will break the command that is supposed to put the text into /sdcard/speak.txt (the file that our Tasker app on the watch reads from), but without those symbols TTS can't speak the words correctly. So we switch them, then switch them back.
(b) on my Huawei Watch the texts won't read out loud if you are actively using the watch (which I like actually) but you might get a duplicate reading once the watch goes idle.
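The symbol swap in note (a) is just a reversible substitution; here is a minimal shell demo of the round trip (a sketch of the idea, not the Tasker actions themselves):

```shell
# Swap ' and " for placeholder symbols before embedding text in a command,
# then swap them back before handing the text to TTS.
msg="it's a \"test\" message"
enc=$(printf '%s' "$msg" | sed "s/'/€/g;s/\"/¥/g")   # safe to embed now
dec=$(printf '%s' "$enc" | sed "s/€/'/g;s/¥/\"/g")   # restored for speaking
printf '%s\n' "$dec"
```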
Voice Command Your Watch To Read Last Text Message
PROFILE: "i named it repeat message"
Event> Plugin>Autovoice>Recognized> Command: "repeat message"
TASK:
(just use/reuse the task from the previous post, the task that resides on the phone)
Automate theater mode
This is a repost originally posted by ShadowEO.
In the task, add a Send Intent action with these parameters:
On:
Action: de.fun2code.android.wear.shell.EXEC
Extra: bsh: Runtime.getRuntime().exec("su -c settings put global theater_mode_on 1");
Off:
Action: de.fun2code.android.wear.shell.EXEC
Extra: bsh: Runtime.getRuntime().exec("su -c settings put global theater_mode_on 0");
Off will turn off theater mode but still leave the watch screen off; the screen will come back on with the next tap, with theater mode off.
Notifications Automatically Spoken On Your Watch (Watch TTS)
Use/Reuse the same recipe from a few posts back for speaking text messages, but just change these lines:
From this:
Add Action>Variables>Variable Set
NAME
%watchSpeak
TO
text from, %SMSRN. it says, %SMSRB
to this:
Add Action>Variables>Variable Set
NAME
%watchSpeak
TO
notification, %NTITLE
Change LCD Density By Voice
"Ok Google density xxx"
(where xxx is a number I speak between 180-340)
PROFILE: "i named it watch density"
Event> Plugin>Autovoice>Recognized> Command: "density" (then I speak a number between 180-340)
Advanced> Replacements: to =2,+=, =,:=,-=,one=1,two=2,three=3,four=4,five=5,six=6,seven=7,eight=8,nine=9,ten=10,eleven=11,twelve=12,thirteen=13,fourteen=14,fifteen=15,sixteen=16,seventeen=17,eighteen=18,nineteen=19,twenty=20
TASK: "i named it watch density"
Add Action> Variables> Variable Set
NAME
%spokendensity
TO
%avcommnofilter
Add Action>Task> If
CONDITION
%avcommnofilter > 340
Add Action> Variable> Variable Set
NAME
%spokendensity
TO
340
Add Action>Task>Endif
Add Action>Task> If
CONDITION
%avcommnofilter < 180
Add Action> Variable> Variable Set
NAME
%spokendensity
TO
180
Add Action>Task>Endif
Add Action> System>Send Intent
ACTION
de.fun2code.android.wear.shell.EXEC
EXTRA:
bsh: Runtime.getRuntime().exec("su -c wm density %spokendensity; su -c input keyevent 26");
LEAVE ALL OTHER FIELDS AS IS
Pablo, tonight I want to test it all. It's wonderful.
I'm curious how the speaker TTS works! Is there any delay with this method?
vingar said:
Pablo, tonight I want to test it all. It's wonderful.
I'm curious how the speaker TTS works! Is there any delay with this method?
You rock man, good luck. The watch will vibrate, then 7-10 seconds later it talks. (The watch reads out emojis too, like ??? makes it say 'smiley face ring penguin'.) ... android is cool
pablo71 said:
You rock man, good luck. The watch will vibrate, then 7-10 seconds later it talks. (The watch reads out emojis too, like ??? makes it say 'smiley face ring penguin'.) ... android is cool
Pablo you have a PM!
Any alternative for nonrooted phones? Don't want to root my new s7 edge yet... I miss the simple root and warranty from the nexus :/
[email protected] said:
Any alternative for nonrooted phones? Don't want to root my new s7 edge yet... I miss the simple root and warranty from the nexus :/
Most of the Tasker tasks I've exported as apps/apks and sideloaded onto my watch work just fine, and I don't think root is needed for any of that. Just use your watch's button combination to reboot into the bootloader, then select recovery mode, connect a cable to your computer and use adb commands to sideload apps. There are some good threads in the Android Wear section that I've been using to do stuff.
pablo71 said:
Most of the Tasker tasks I've exported as apps/apks and sideloaded onto my watch work just fine, and I don't think root is needed for any of that. Just use your watch's button combination to reboot into the bootloader, then select recovery mode, connect a cable to your computer and use adb commands to sideload apps. There are some good threads in the Android Wear section that I've been using to do stuff.
Thanks! I would simply use adb reboot bootloader as usual on my Nexus 4 and 7.
But I saw the root-needed disclaimer in your first post.
pablo71 said:
Most of the Tasker tasks I've exported as apps/apks and sideloaded onto my watch work just fine, and I don't think root is needed for any of that. Just use your watch's button combination to reboot into the bootloader, then select recovery mode, connect a cable to your computer and use adb commands to sideload apps. There are some good threads in the Android Wear section that I've been using to do stuff.
So this part of your statement: "most of the tasker tasks I've exported as apps/apks and sideloaded onto my watch work just fine"
I've been adding profiles in Tasker on my phone with WearShell installed and they are not working on my watch. I really wanted the brightness changes to be triggered by connecting and disconnecting from my home wifi network. I guess now I understand why the tasks don't work, since I have not sideloaded them onto my watch. I also tried the automate theater mode, and that fails as well.
I totally missed the fact that they need to be exported as apps and sideloaded into the watch via adb. I missed that in this thread and the WearShell thread. Gah!
Pkt_Lnt said:
So this part of your statement: "most of the tasker tasks I've exported as apps/apks and sideloaded onto my watch work just fine"
I've been adding profiles in Tasker on my phone with WearShell installed and they are not working on my watch. I really wanted the brightness changes to be triggered by connecting and disconnecting from my home wifi network. I guess now I understand why the tasks don't work, since I have not sideloaded them onto my watch. I also tried the automate theater mode, and that fails as well.
I totally missed the fact that they need to be exported as apps and sideloaded into the watch via adb. I missed that in this thread and the WearShell thread. Gah!
no no, I didn't explain myself very well in the previous post ... these recipes I've posted are not apks to sideload ... what I was referring to in the previous post is a few little Tasker apks I've side-loaded onto my watch, but I did not include those in this thread because this thread was for recipes that included wearshell ... pardon me for not communicating clearly in the previous post. If you wouldn't mind copying and pasting the intents in Tasker that are being sent to wearshell I would like to see them
this would set brightness to full, the number 254 does that in this intent
SEND INTENT
ACTION
de.fun2code.android.wear.shell.EXEC
EXTRA
bsh: Runtime.getRuntime().exec("su -c settings put system screen_brightness 254");
LEAVE ALL OTHER FIELDS AS IS
pablo71 said:
no no, I didn't explain myself very well in the previous post ... these recipes I've posted are not apks to sideload ... what I was referring to in the previous post is a few little Tasker apks I've side-loaded onto my watch, but I did not include those in this thread because this thread was for recipes that included wearshell ... pardon me for not communicating clearly in the previous post. If you wouldn't mind copying and pasting the intents in Tasker that are being sent to wearshell I would like to see them
Brightness:
Wifi disconnect (this actually has about 15 steps for various changes to the phone for sound and security)
Code:
Action: de.fun2code.android.wear.shell.EXEC
Extra: bsh: Runtime.getRuntime().exec("su -c settings put system screen_brightness 254 ");
Wifi connect (same, changes to sound and security)
Code:
Action: de.fun2code.android.wear.shell.EXEC
Extra: bsh: Runtime.getRuntime().exec("su -c settings put system screen_brightness 106 ");
I have also tried removing the space after the number and before the quote, since other bsh: entries I have seen don't show the space.
Code:
Extra: bsh: Runtime.getRuntime().exec("su -c settings put system screen_brightness 106");
Theater:
Start sleep tracking (Sleep as Android works as a plugin with Tasker, I set silent mode and try to start theater)
Code:
Action: de.fun2code.android.wear.shell.EXEC
Extra: bsh: Runtime.getRuntime().exec("su -c settings put global theater_mode_on 1");
End sleep tracking when alarm sounds (again, Sleep as Android as a plugin for Tasker allows the sleep tracking to terminate on alarm and run an intent)
Code:
Action: de.fun2code.android.wear.shell.EXEC
Extra: bsh: Runtime.getRuntime().exec("su -c settings put global theater_mode_on 0");
Pkt_Lnt said:
Brightness:
Wifi disconnect (this actually has about 15 steps for various changes to the phone for sound and security)
Code:
Action: de.fun2code.android.wear.shell.EXEC
Extra: bsh: Runtime.getRuntime().exec("su -c settings put system screen_brightness 254 ");
Wifi connect (same, changes to sound and security)
Code:
Action: de.fun2code.android.wear.shell.EXEC
Extra: bsh: Runtime.getRuntime().exec("su -c settings put system screen_brightness 106 ");
I have also tried removing the space after the number and before the quote, since other bsh: entries I have seen don't show the space.
Code:
Extra: bsh: Runtime.getRuntime().exec("su -c settings put system screen_brightness 106");
Theater:
Start sleep tracking (Sleep as Android works as a plugin with Tasker, I set silent mode and try to start theater)
Code:
Action: de.fun2code.android.wear.shell.EXEC
Extra: bsh: Runtime.getRuntime().exec("su -c settings put global theater_mode_on 1");
End sleep tracking when alarm sounds (again, Sleep as Android as a plugin for Tasker allows the sleep tracking to terminate on alarm and run an intent)
Code:
Action: de.fun2code.android.wear.shell.EXEC
Extra: bsh: Runtime.getRuntime().exec("su -c settings put global theater_mode_on 0");
I assume that all the other fields in this Send Intent action are left as is, so these intents are written perfectly. Now, the things that I'm about to say next are probably things you already know, but Tasker is extremely finicky with profiles, especially cloned profiles, and you have to back out of Tasker completely using the back button to activate profiles. Sometimes I have to delete profiles that should be working and then recreate them; once I do that they work flawlessly forever. Sometimes I have to turn the profiles on and off as well. Also, I am pretty sure you are using the latest version of WearShell, because I read in that forum that the previous version of WearShell did not work with the screen off on the phone. Hopefully the WearShell companion app is in memory on the watch so that it can receive communication from the app on the phone. You could try opening the app on the phone and starting and stopping the server just to wake it up and to test that both apps are running. That server, by the way, is not part of any of this; you probably already know that the server is just there to run test code through a web server interface that is on the watch. But running that server would be a great way to check that the companion app and the phone app are both operating and talking to each other.
EDIT: In Tasker, is 'allow external access' granted in the settings? I'm not that great at Tasker, but I think something like that in the settings could be it.
pablo71 said:
I assume that all the other fields in this Send Intent action are left as is, so these intents are written perfectly. Now, the things that I'm about to say next are probably things you already know, but Tasker is extremely finicky with profiles, especially cloned profiles, and you have to back out of Tasker completely using the back button to activate profiles. Sometimes I have to delete profiles that should be working and then recreate them; once I do that they work flawlessly forever. Sometimes I have to turn the profiles on and off as well. Also, I am pretty sure you are using the latest version of WearShell, because I read in that forum that the previous version of WearShell did not work with the screen off on the phone. Hopefully the WearShell companion app is in memory on the watch so that it can receive communication from the app on the phone. You could try opening the app on the phone and starting and stopping the server just to wake it up and to test that both apps are running. That server, by the way, is not part of any of this; you probably already know that the server is just there to run test code through a web server interface that is on the watch. But running that server would be a great way to check that the companion app and the phone app are both operating and talking to each other.
Click to expand...
Click to collapse
Yes, I know the idiosyncrasies of Tasker; I've been beta testing it for a couple of years, and if you saw the complexity of my myriad tasks...
When I have WearShell running, I get a screen wakelock that keeps the display on. I don't have it running because joschi70 stated in the WearShell thread (in response to you) that it does not have to run for broadcasts. I have tried running it, and still nothing changes on the watch.
Actually the server is only needed if you use the web API or the web interface itself.
If you use broadcasts to send commands it should work without having to start the WearShell server on the phone.
Click to expand...
Click to collapse
I did see the message on the watch that WearShell was installed. Of course, it does not show in the apps menu.

Lock screen help

Hello,
How can I make the phone take a selfie and send it to the cloud every time someone touches the lock screen?
Samsung Note 10, rooted.
Thanks.
IMHO that would mean unauthorised use of someone's image.
Lock screen Help
For security purposes: whenever someone so much as touches the screen while the lock screen is active, a photo should be taken by the front camera and sent to the cloud. Applications of this type all do such things, but only when the password is entered. How can I do this? Please help.
jbxr said:
For security purposes: whenever someone so much as touches the screen while the lock screen is active, a photo should be taken by the front camera and sent to the cloud. Applications of this type all do such things, but only when the password is entered. How can I do this? Please help.
Click to expand...
Click to collapse
This is close to what you are looking for, except for the cloud part.
https://play.google.com/store/apps/details?id=org.twinone.intruderselfie
As a one-liner, just for the buttons; it might need some adjusting for your device (make sure the front camera is the selected one):
while true; do
  # wait for a single touch event (adjust the event node for your device)
  toolbox getevent -c 1 /dev/input/event1 &&
  # launch the secure (lockscreen) camera
  am start -a android.media.action.STILL_IMAGE_CAMERA_SECURE &&
  sleep 1 &&
  input keyevent 27 &&  # KEYCODE_CAMERA: take the picture
  input keyevent 26     # KEYCODE_POWER: turn the screen back off
done
Click to expand...
Click to collapse
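Since the one-liner above hard-codes /dev/input/event1, here is a hedged sketch of the "adjusting to your device" step: on the device, `getevent -p` prints every input node with its name, and the touchscreen node can be picked out of that listing. The listing below is an invented sample for illustration; actual device names and paths vary.

```shell
# Invented sample of what 'getevent -p' prints on-device (names/paths vary)
cat > /tmp/devices.txt <<'EOF'
add device 1: /dev/input/event1
  name:     "sec_touchscreen"
add device 2: /dev/input/event2
  name:     "gpio-keys"
EOF

# Pick the event node whose name mentions "touch"
awk '/^add device/ {dev=$4} /name:/ && tolower($0) ~ /touch/ {print dev}' /tmp/devices.txt
```

On a real device you would pipe `getevent -p` straight into the awk filter instead of using a sample file.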
Pros: -
Cons: phone stolen/mashed
How about just keeping that thing near you...
Thanks, but I tried Intruder Selfie, and I meant a solution not necessarily from the app store; I should add that this is a rooted phone. What I want is to take a photo, or log an entry about the event, not when entering the password (right or wrong), but whenever the screen is contacted, touched, swiped, or somehow wakes up while the lock screen is active, and then send it to the cloud or store it outside the phone.
I would like to add that I have been looking for it for a long time and I must admit that it is a challenge for me, not just a need, and it is strange that there is no such thing yet.
If just logging is enough, then I'm sure you can find everything that has happened in Android's own log. How to filter for the things you're interested in? IDK...
By using getevent:
while true; do echo $(date) >> /sdcard/mylog.txt && toolbox getevent -l -q -c 1 >> /sdcard/mylog.txt; done
Click to expand...
Click to collapse
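As a sketch of the filtering step: assuming the log was produced by a `getevent -l` loop like the one above, touch coordinates show up as ABS_MT_POSITION_* lines, and a plain grep can pull out just those. The sample log content below is invented for illustration.

```shell
# Invented sample of 'toolbox getevent -l' output mixed with date stamps
cat > /tmp/mylog.txt <<'EOF'
Mon Jan 1 10:00:00 UTC 2024
/dev/input/event1: EV_ABS ABS_MT_POSITION_X 000001f4
/dev/input/event1: EV_ABS ABS_MT_POSITION_Y 00000320
/dev/input/event1: EV_SYN SYN_REPORT 00000000
EOF

# Keep only the touch-coordinate lines
grep 'ABS_MT_POSITION' /tmp/mylog.txt > /tmp/touches.txt
cat /tmp/touches.txt
```

On the phone you would run the same grep over /sdcard/mylog.txt.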
A photo would be better, but if that's a problem or impossible, then just a log entry about the specific event, i.e. touching the screen while the lock screen is enabled, sent immediately to an e-mail address or to the cloud. Sending it off the phone is a must; the SD card is not enough.
/sdcard is usually your internal memory.
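For getting each entry off the phone, one hedged option (assuming a curl binary is available on the rooted device and you have some endpoint to receive uploads; the example.com URL below is a placeholder, not a real service) is to wrap the logging in small functions so the upload step can later be swapped for e-mail, FTP, or a cloud client:

```shell
LOG=/tmp/mylog.txt

# Append a timestamped marker before each captured event
log_marker() { echo "=== $(date -u '+%Y-%m-%d %H:%M:%S') ===" >> "$LOG"; }

# Hypothetical upload step; example.com is a placeholder endpoint
upload_log() { curl -s -F "file=@$LOG" "https://example.com/upload" || true; }

# Demo: write a marker plus one fake event line
# (on-device you would append real getevent output instead)
rm -f "$LOG"
log_marker
echo "/dev/input/event1: EV_KEY BTN_TOUCH DOWN" >> "$LOG"
# upload_log   # uncomment on a device with curl and a real endpoint
```

Keeping the upload in its own function also means a failed network call (|| true) does not kill the capture loop.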
jbxr said:
Photo would be better.
Click to expand...
Click to collapse
There are all sorts of spy/hidden/background camera apps, but very few stay in the Play Store.
Here is one that says it's open source, but where's the source?...
https://forum.xda-developers.com/showthread.php?t=1934513
