[DEV] BMA150 Accelerometer Interface - Nexus One Android Development

Hello,
I would like to develop an application that makes use of the accelerometer sensor on an Android smartphone.
My app needs a measurement range of ±4g. I looked up the accelerometer hardware specifications for various models and found that the Nexus One uses the BMA150 by Bosch Sensortec, which supports three ranges (±2g/±4g/±8g).
While browsing the Nexus One's Android source code looking for the bma150 driver, I came across this issue:
code.google.com/p/android/issues/detail?id=8143
I wish to modify the bma150 driver to set a default range of ±4g and then write a Java app to read and use data from the sensor. By the way, as far as I understand, the driver is actually not implemented at all. Or is it just not exported to the Java API? If it is not implemented at all, how can the OS change the screen orientation when the user rotates the device? And if it is not exported to the Java API, what would actually happen when an app tries to read data from the sensor?
Thank you for your help,
Nhexus

Using the accelerometer is pretty easy from the SDK; just google for "android accelerometer" and you'll get loads of examples, like:
http://stuffthathappens.com/blog/2009/03/15/android-accelerometer/
Note the "onAccuracyChanged" method; I think the range/accuracy depends on the magnitude of the forces currently applied.
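A minimal listener looks roughly like this (a sketch from memory, untested, so treat it as a starting point rather than copy-paste-ready code):
Code:
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class AccelDemo extends Activity implements SensorEventListener {
    private SensorManager sm;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sm = (SensorManager) getSystemService(SENSOR_SERVICE);
        Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sm.registerListener(this, accel, SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sm.unregisterListener(this); // always release the sensor when backgrounded
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Values are in m/s^2 and include gravity.
        float x = event.values[0], y = event.values[1], z = event.values[2];
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Fired when the framework's reported accuracy for the sensor changes.
    }
}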
Hope that helps.

Thanks for the answer. I'd already had a look at the Sensor API in Android.
I was trying to understand a bit more about the driver that actually sits behind the Java API. I thought the measurement range could be set via software (maybe via I2C communication with the device) and that it is not dynamically adapted.
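For what it's worth, the SDK does report whatever range the driver currently exposes, so I can at least observe it from an app. A quick check would look roughly like this (untested sketch; this code belongs inside an Activity, with the imports at the top of the file):
Code:
import android.hardware.Sensor;
import android.hardware.SensorManager;

// Ask the framework which range the accelerometer HAL reports.
SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
// getMaximumRange() returns m/s^2; about 19.6 would correspond to +-2g.
float maxRange = accel.getMaximumRange();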
However, I'm not an expert in the Android Open Source Project, so I don't clearly understand the source code structure and design. Maybe someone with in-depth knowledge of the HAL and drivers could give me some hints.
Thank you again for your help!

Related

[Q] How to encode or decode in Android?

Hi,
First, I'm new and may have picked the wrong subforum; if so, I'm sorry. Second, I looked through 'similar' threads and found nothing that matches my questions.
Well, I'm a computer engineering student and I have to do a project on encoding and decoding video using the device hardware. So I began to investigate and found that Google released an API with classes that access low-level codecs (e.g. MediaCodec, MediaExtractor or MediaFormat). That API solves my problem, so I should just learn to develop with it and be done, but...
I have read that before that API, which was released this year, there were frameworks named OpenCore and Stagefright, and people who wanted to use the device's hardware codecs had to use those frameworks directly. But I really don't know how people did it, whether I can find some examples, and what differences there are between using those frameworks directly and using the Android 4.1 API.
As a last question, does anybody know whether this API goes through the Stagefright or OpenCore frameworks when you create an encoder or decoder?
I know these are maybe totally noob questions, but I have been googling for a while and can't find anything useful except about the 4.1 API, for which I found the Google I/O videos and documentation, so I now understand a bit better how it works.
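For reference, from the documentation the basic decoder setup seems to look roughly like this (a sketch I pieced together from the docs, untested; the file path is just an example):
Code:
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.io.IOException;

public class DecoderSetup {
    // Creates and starts a decoder for the first video track of the given
    // file (API 16+). The caller then drives the input/output buffer loop.
    public static MediaCodec openVideoDecoder(String path) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path); // e.g. "/sdcard/test.mp4"
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            String mime = format.getString(MediaFormat.KEY_MIME);
            if (mime != null && mime.startsWith("video/")) {
                extractor.selectTrack(i);
                MediaCodec decoder = MediaCodec.createDecoderByType(mime);
                // No output surface and no DRM in this sketch.
                decoder.configure(format, null, null, 0);
                decoder.start();
                return decoder;
            }
        }
        throw new IOException("no video track found in " + path);
    }
}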
Thanks!

[Q] Context Simulation: Decision Help

Hello everyone,
I'm currently writing my master's thesis on the topic "Context Simulator for Mobile Business Applications". The goal is to test how an Android application reacts to changing context conditions: How does an application react if the battery is almost empty? If the internet connection breaks down during data transmission? If an SD card is available/not available? ...
I want to simulate all of these factors on the PC and send the data to my android device. Some more examples:
- Simulating sensor data for accelerometer, gyroscope, ...
- GPS
- Camera and microphone (if an application requests a camera image, it should receive an image from my simulator)
- Fake connections for Wi-Fi, HSDPA, EDGE
- Fake time, time zone and date
- Simulate a specific battery level
- Fake calendar entries
------------------ My approaches ------------------
No 1:
Extend an existing custom ROM with my features => some calls (example: GPS) would not go to the OS but to my simulator on the PC. It would also send data (example: battery level) to the Android OS, for instance to fake a low battery level.
No 2:
Write my own sandbox application (I haven't found any information on this topic so far). In this sandbox application I would start the application under test, making it possible to catch all requests from the system under test and decide whether to forward them to the Android OS or to my simulator.
No 3:
Develop my own library, which the system under test would include. This library extends some Android classes (e.g. Activity, LocationManager, SensorManager), and my extension classes transmit requests to my simulator instead of to the OS (see the sketch after this list).
I'm afraid I would only have limited functionality with this approach.
No 4:
Take the sensor simulator from OpenIntents as a basis and extend it as far as possible.
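To make approach No 3 concrete, here is a minimal sketch of the idea. LocationManager has no public constructor, so instead of subclassing it I would use a facade with the same method signatures; SimulatorConnection is a hypothetical interface to the PC-side simulator:
Code:
import android.location.Location;

// Hypothetical link to the PC-side simulator (e.g. over a socket).
interface SimulatorConnection {
    double getDouble(String key); // reads one simulated value
}

// Facade the system under test would call instead of LocationManager.
public class SimulatedLocationProvider {
    private final SimulatorConnection sim;

    public SimulatedLocationProvider(SimulatorConnection sim) {
        this.sim = sim;
    }

    // Mirrors LocationManager.getLastKnownLocation(String).
    public Location getLastKnownLocation(String provider) {
        Location loc = new Location(provider);
        loc.setLatitude(sim.getDouble("gps.lat"));
        loc.setLongitude(sim.getDouble("gps.lon"));
        return loc;
    }
}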
-------- About Me --------
I only have a little experience in Android development, but a lot of experience in Java development. I know I should now read a lot about custom ROMs, ... Unfortunately, this thesis has to be finished by the end of March.
------- What I want from you -------
Advice. I hope you understand my problem. Which is the best way to realize this project? I would like to have as much functionality as possible. My prototype doesn't need to support all context factors, but I should consider all of them in my system design.
I wanted to attach two graphics, but unfortunately I'm not allowed to. They show two possibilities, and I'm not sure which one is better (or whether they are even possible):
http://s7.directupload.net/images/131212/bnpuo8gh.png
http://s7.directupload.net/images/131212/e7u8dv4r.png
Thanks a lot,
Michael

Compass / magnetometer potential bug fix

I own a JFLTEVZW i545, but I understand that this affects other variants as well.
I'm not a developer, but over the last 4 weeks or so I've been trying to learn more about Android, Linux, and kernels. Hopefully what I've come up with can be attempted by someone with a more advanced skill set, because although my attempted fixes appear to have had some success, I don't really know if I'm implementing the changes appropriately, and I haven't found the proper fix. I'm also wasting many, MANY hours (200-300?) learning, tinkering, and waiting on compiles, because I'm not skilled enough to make a quick change or to implement it without a complete recompile (which costs me 2 hours each time). Someone more knowledgeable about kernels and building/compiling from source could probably do everything I'm doing in 1% of the time, or less.
Core issue:
The magnetometer's X and Z axes are off by 180 degrees. This has been a consistent issue since the early/mid 2014 builds of both CM11 and CM12, as well as CM12.1 (as of a few days ago when I last tested). It causes problems with navigation and multiple user apps.
Ways to experience the issue:
If you aren't familiar with this bug, or if you're of the opinion that the compass doesn't have any problems, fire up Google Sky and you'll see that things are wonky: the view flips around crazily and none of the constellations, planets, or the moon are where they should be in the augmented reality view. This app is NOT incompatible; the data it's being fed is erroneous.
Alternatively, you can use Physics Toolbox Sensor Suite to view the true raw data (other sensor apps are either adulterated or show false or useless data). The sections I've found most worth looking at are the Linear Accelerometer, Magnetometer, and Orientation; you can compare the data with an OEM phone if you have one handy.
For the magnetometer, I've found that the absolute best way to calibrate it is to perform a figure-8 movement in three-dimensional space, rather than the two-dimensional one shown in some apps and videos, or the method mentioned in GPS Status & Toolbox. See this video for an example. I perform a larger figure-8 and do it multiple times; once can work, but a few times really settles it down.
What aren't contributing factors:
Calibration
Hardware malfunction (many others have confirmed)
App malfunction
Magnetometer driver source code (note: the code itself in the files I've looked at is the same as Samsung's source, but the way it's implemented may not be)
Please keep in mind that because I'm very new to this, I don't have the instant, intuitive feedback needed to confirm the things in the contributing factors or possible solutions. I really need to pass the reins on this one to someone much more advanced than myself, who will see this post and churn out the fix in half an hour.
Possible contributing factors (could be more than this):
Driver implementation. The source in /kernel/samsung/jf/drivers/sensors/ is new and from the OEM, but I don't see it being compiled.
Driver implementation. I'm having difficulty knowing which source files from Samsung need to be dropped in to try to compile a kernel without the CM modifications that pertain to the sensors. I also have significant difficulty knowing whether it succeeded correctly, rather than something in CM taking over and undoing my changes. This is the case for one particular thing, so I have no idea how to confirm the things I can't readily see.
Sensor(s) orientation configuration(s).
Possible solutions:
Should /kernel/samsung/jf/drivers/sensors/ actually be compiling? I don't think it is (because I don't see the output folder), and I don't even know if it's necessary for our phones, but Samsung has it in their source and I cannot successfully compile Samsung's source to compare. I also don't know if it gets merged in with other files somewhere else.
Dropping in all necessary OEM files and compiling, without CM interfering. I don't understand Linux and the filesystem well enough to know what happens when, and I've resorted to using shred/srm to try to truly delete files, but I still struggle to understand what's going on. I also don't know what a swap like this encompasses. I don't know if replacing sensorhub is all that needs to be done, or if there are 3 other files in completely different directories that are critical and must overwrite the CM modifications for things to compile appropriately.
Setting the sensor orientation correctly, with CONFIG_SENSORS_SSP_ACCELEROMETER_POSITION, CONFIG_SENSORS_SSP_GYROSCOPE_POSITION and CONFIG_SENSORS_SSP_MAGNETOMETER_POSITION set to values other than zero.
I've tried the first two and had intermittent success. Sometimes things compile, sometimes they don't. But I also don't know if what I'm changing even matters. I've been checking file hashes to see when things change, but it's becoming tedious; someone who knows the Linux commands and how the source gets compiled would know without having to check.
My favorite possible solution is the third. This is in the .config file produced by the make menuconfig process, which I believe is influenced by the various defconfig files. I've tried changing the .config directly, but CM undoes that. I've tried adding those lines to the defconfig files, but CM either undoes or ignores that. I've tried compiling a kernel outside of compiling CM as a whole, and I'm hitting roadblocks due to my lack of experience and knowledge. I've successfully compiled kernels, but I don't even know if my changes are sticking. I've taken what I thought might be an appropriately compiled kernel (Image and zImage), made by modifying the .config and then manually running make zImage, but even after dropping those in to compile with CM, with chmod 555 and chown/chgrp root, CM somehow manages to overwrite the renamed zImage-->kernel file. It would, however, leave them alone when I did all that to the Image and zImage in their normal output spot, /arch/arm/boot/kernel/ I believe.
The third possible solution sets how the sensors are physically oriented within the phone. If the readings are off by right angles, it seems that a change to one or more of these would be appropriate:
CONFIG_SENSORS_SSP_ACCELEROMETER_POSITION=0
CONFIG_SENSORS_SSP_GYROSCOPE_POSITION=0
CONFIG_SENSORS_SSP_MAGNETOMETER_POSITION=0
I've had no success in making this happen, but as I said, a genuine programmer would be able to make these changes, compile, and test quickly, rather than spending an entire day trying 20 different ways to get something to stick and then not even being able to confirm whether the change actually made its way into the final compiled files.
Hopefully someone is willing to take a stab at this, because I'm apparently the equivalent of an elderly person having their first encounter with a computer when it comes to this stuff. It seems so simple, but I'm not the one to make it happen, and I feel like this may be the route to take. Thanks, y'all!
EDIT: I've tried other methods that I didn't list; I just can't remember everything, and my mind is breaking down after going at this for about 13 hours straight today.
I suppose you own a Verizon phone, the only variant with the compass issue. I'm currently helping the jfltevzw guys find a fix, and still nothing concrete even after several tries...
I've already tried changing the magnetometer's physical angle (the correct value should be 3 or 5 according to board-jf_vzw), but consider: even in CM10.2, MAGNETOMETER_POSITION was 0.
I'm going to try some other things...
Sorry, yes, I own a JFLTEVZW
What are your thoughts on the new "sensors" source folder and it seemingly not being compiled/built? /kernel/drivers/sensors/geomagnetic/Kconfig has a completely separate orientation reference, INPUT_YAS_MAGNETOMETER_POSITION:
Code:
#
# Copyright (c) 2010 Yamaha Corporation
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301,
# USA.
#
config INPUT_YAS_MAGNETOMETER
    tristate "YAS Geomagnetic Sensor"
    depends on I2C

config YAS_MAG_DRIVER_YAS532
    tristate "YAS Geomagnetic Sensor - yas532"
    depends on I2C
    help
      Say Y here if you want support for the yas532 sensor
      device.
      To compile this driver as a module, choose M here: the
      module will be called yas532.

config INPUT_YAS_MAGNETOMETER_POSITION
    int "YAS Geomagnetic Sensor Mounting Position on Board"
    depends on INPUT_YAS_MAGNETOMETER
    default "0"
    help
      Chip mounting position (pin 1).
      0: top, upper-left
      1: top, upper-right
      2: top, lower-right
      3: top, lower-left
      4: bottom, upper-left
      5: bottom, upper-right
      6: bottom, lower-right
      7: bottom, lower-left
This is one of the points where I get stuck, because even if I can forcefully input a different number and reference, I don't know how to reverse engineer the result (or how to read it) to confirm that what I input actually made it into the kernel. I want to confirm one of two things:
1) The change made it into the kernel successfully, and there is proof of that, yet the magnetometer data is still not fixed, or
2) The change cannot be confirmed to have made it into the kernel, with proof, so options such as this are still viable.
Side note:
Samsung is also using this driver setup for their new "wear" devices: both the sensors and sensorhub source folders, for the same YAS532 (from my research, YAS532B is the same chip; it's akin to calling the phone the S4 or the Galaxy S4). This source code change was made without any hardware change to our phones, which is why I wonder if something is awry: something completely unexpected and seemingly unrelated at first glance may be expecting the sensors source folder to be compiled, but it isn't.
jfltevzw compass now works on CM
Feel free to donate to invisiblek as part of the bounty
Heh, I saw that commit and was tinkering around before sunrise today; very excited!
Hi there everyone -
I am running an AOSP / CM12.1 / Lollipop 5.1 ROM (Fusion) with KT Kernel. It's a Sprint variant and I, too, have the compass/magnetometer bug. North points south, east points west. Maddening. Everything else in the ROM is really wonderful, but without the compass / GPS / Maps, it's a deal breaker for me.
My last ROM (GPE, from this forum) had NO issues with the compass, so I am assuming that it is either the kernel, or the ROM, or some odd combo.
If anyone else has any other info, please let me know? Thanks in advance!

[Development] Discovering, reverse-engineering and using vendor HALs

Project Treble is a great help in getting access to vendor-specific HALs. I'll try to explain how, and how to exploit it.
Presentation of vendor HIDLs
Thanks to Project Treble, all HALs must be defined through HIDL.
Standard AOSP HALs going through HIDL means that a generic AOSP system works with standard AOSP features.
But vendor HALs are also going through HIDL!
APIs defined through HIDL are stable, versioned, hashed and easy to access.
Just plug the HIDL into your build system, and you get easy access from your applications to the HAL behind the HIDL!
Also, APIs defined through HIDL are supposed to be clean and mostly self-documented, so getting the HIDL can help you understand how the HAL works.
To understand how easy it is, here is a real-world client usage of a HIDL:
Code:
// Grab the generated HIDL client for the vendor fingerprint HAL, then send it
// the vendor-defined command (NAV_ON) that turns on navigation gesture events.
IExtBiometricsFingerprint service = IExtBiometricsFingerprint.getService();
service.sendCmdToHal(NAV_ON);
With the HIDL, enabling gestures on the fingerprint sensor on Huawei devices is a two-liner! [1]
Browsing vendor HIDLs
Now, the problem is that HIDLs are part of the source code, not the firmware.
So if the vendor doesn't publish them, there is some additional work to do.
The Android build system is capable of generating two client APIs for HALs defined through HIDL: either a C++ library or a Java library. Not all firmwares will contain Java libraries for all APIs, but C++ is almost always available.
Both languages make it fairly easy to reverse engineer the prototypes of the functions, which is almost all of the HIDL (the C++ symbols are missing the argument names).
So, I made a script to reverse engineer those HALs ( https://github.com/phhusson/treble_experimentations/blob/master/vendor-HAL/reverse-hal.sh ).
It is far from perfect, and it can't generate a full-blown HIDL, but it makes discovery much easier!
Here are a few examples of APIs it gives access to:
- The touch screen gives us access to glove mode and cover mode
- The display gives us access to functions like setColorTemperature or updateRgbGamma
- The infrared HAL gives us access to its learning capability
- The fingerprint sensor gives us access to sendCmdToHal
With the first three examples, we can see things can be easy, but the last one makes things trickier: there is still a magic value to pass to the sendCmdToHal function.
Anyway, that's still quite an improvement compared to before Treble.
Using vendor HALs
So, let's say we've discovered the [email protected]::hwTsSetGloveMode(bool) function, and we want to call it.
As I mentioned in the first section, if we have the HIDL, it's a two-liner.
But we don't have it, so what do we do?
The Android build system usually generates HIDL client code for C++ and Java, so we just need to piggy-back on this code to be able to call the functions!
Using vendor HALs from Java
For Java, the idea is fairly simple: we copy the code from the original firmware (either from an app or from the framework) into our own environment.
It's a bit more complicated to realize; here is how I did it (the mock step is sketched after this list):
- grep -rF <name of the function> system # to find where the Java symbols live
This gave me system/system/framework/oat/arm64/hwServices.vdex
- deodex it
- Retrieve all symbols inside the proper folder. In this case, vendor/huawei/hardware/tp
- Create a mock of the Java class (here is an example)
- Create the code that calls it (here is an example)
- Build the mock and the caller (here is an example)
- Decompile the result into smali
- Replace the mock code with the actual classes from the original firmware
- Recompile the whole thing
- Run the resulting app/dex
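To illustrate the mock step, here is roughly what such a stub can look like. Everything here is a placeholder: the interface name and the version suffix in the package path are assumptions, and only the one method we discovered is declared; the real classes from the firmware replace this at the smali stage:
Code:
// Mock of the discovered vendor HIDL client class. It only needs to compile
// so the caller can be built against it; it is never executed as-is.
package vendor.huawei.hardware.tp.V1_0; // package path assumed from the HIDL name

public class ITouchscreen {
    public static ITouchscreen getService() {
        throw new UnsupportedOperationException("mock, replaced by firmware code");
    }

    // Signature reverse-engineered; the return type is an assumption.
    public void hwTsSetGloveMode(boolean enable) {
        throw new UnsupportedOperationException("mock, replaced by firmware code");
    }
}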
Things to improve
The first problem here is that this process is annoying to automate and put into a build system.
Then, we can only partially reverse-engineer the HIDL, which leads to two problems:
- We need to piggy-back on the previous firmware's APIs
- Calling C++ is much harder (because the ABI behaviour when changing headers is tricky)
The cleanest solution would be to fully reverse-engineer the HIDLs.
However, the current reverse engineering is so rough that it doesn't even list all the available functions, so a lot of work is needed.
Conclusion
Project Treble's HALs are fun to play with; take a look!
[1] I'm lying a bit here. This is enough to enable event reporting, but some additional changes are needed to make sense of those events.
Thanks for your great work here, OP!
BTW, on the Honor 9 with stock EMUI, HwCamera2 can't take a photo (checked in both landscape and portrait mode, with both cameras), but video recording works as expected!
Also, with your solution the home button works like this: when I press the home button, it opens search with a "=" in it.
surdu_petru said:
BTW, on the Honor 9 with stock EMUI, HwCamera2 can't take a photo (checked in both landscape and portrait mode, with both cameras), but video recording works as expected!
This is fixed by https://github.com/phhusson/huawei_camera_aosp/commit/177459fdb76f0aa68fa4ffb869b633d9460a2fb0
Also, with your solution the home button works like this: when I press the home button, it opens search with a "=" in it.
Yeah, that's what my footnote basically says
Huawei defines the meaning of the fingerprint evdev in /vendor/usr/keylayout/fingerprint.kl, but the KEYCODE_FINGERPRINT_* keycodes it uses don't exist in AOSP.
What I'm planning to do is a daemon that listens exclusively to the fingerprint's /dev/input/eventX and launches commands based on the events.
This could have performance issues (input keyevent KEYCODE_HOME takes half a second), so I might switch to creating a uinput device.
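A minimal sketch of that daemon idea, assuming a 64-bit kernel (where struct input_event is 24 bytes: a 16-byte timestamp followed by type/code/value) and a hypothetical event node; it just decodes events and reacts to key presses:
Code:
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class FingerprintEvdevDaemon {
    private static final int EV_KEY = 0x01;   // from linux/input-event-codes.h
    private static final int EVENT_SIZE = 24; // sizeof(struct input_event), 64-bit

    public static void main(String[] args) throws IOException {
        // The node index varies per device, and reading it needs root.
        try (DataInputStream in = new DataInputStream(
                new FileInputStream("/dev/input/event5"))) {
            byte[] buf = new byte[EVENT_SIZE];
            while (true) {
                in.readFully(buf);
                ByteBuffer b = ByteBuffer.wrap(buf).order(ByteOrder.LITTLE_ENDIAN);
                b.position(16);         // skip the struct timeval timestamp
                int type = b.getShort() & 0xffff;
                int code = b.getShort() & 0xffff;
                int value = b.getInt(); // 1 = press, 0 = release
                if (type == EV_KEY && value == 1) {
                    // Launch the desired action directly here, instead of the
                    // slow `input keyevent` round-trip.
                    System.out.println("fingerprint key " + code + " pressed");
                }
            }
        }
    }
}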
phhusson said:
This is fixed by https://github.com/phhusson/huawei_camera_aosp/commit/177459fdb76f0aa68fa4ffb869b633d9460a2fb0
Thank you very much, and good luck
surdu_petru said:
Thank you very much, and good luck
Does your fingerprint sensor mechanically click? Or is it just some capacitive sensor?
phhusson said:
Does your fingerprint sensor mechanically click? Or is it just some capacitive sensor?
No, the only one with the click is the Honor 8 ... I already saw this: "key 28 ENTER" in fingerprint.kl ... maybe we can define it as virtual here or something like that, if I'm not wrong, but it surely needs to be tested.
EDIT:
I guess it's only used on the Honor 8, as fingerprint.kl is almost the same for all devices
surdu_petru said:
No, the only one with the click is the Honor 8 ... I already saw this: "key 28 ENTER" in fingerprint.kl ... maybe we can define it as virtual here or something like that, if I'm not wrong, but it surely needs to be tested.
EDIT:
I guess it's only used on the Honor 8, as fingerprint.kl is almost the same for all devices
Well, my device (Mate 9), which doesn't mechanically click, does trigger a click event.
The way I'm doing it doesn't require changing the vendor partition. But yes, changing vendor/usr/keylayout/fingerprint.kl would be much easier.
Digging on "my own"
Good evening out there!
First of all: thank you very much for all of your effort so far.
I own an Honor 9 Lite and I'm currently investigating how Oreo and Treble work.
So I took your reverse-hal script and tried to improve it a little bit (see attachment).
Maybe it can make things easier?
Does it make sense to read out all the classes in the .so files?
I've uploaded the script (reverse-hal-grork.sh) and an example output from vendor.huawei.hardware.biometrics.fingerprint.
Please have a look at it.
Greetings,
vsrookie
PS: If you have any ideas for improvement, just tell me. I thought about automatically creating Java classes? Or smali? Would that work?
vsrookie said:
So I took your reverse-hal script and tried to improve it a little bit (see attachment). [...]
Looks good!
Could you make a pull request to https://github.com/phhusson/treble_experimentations/ ?
The ideal target would be to generate the original .hal file, so that we can generate the C++ and Java code automatically with hidl-gen.
I don't know how big the gap is to get there, though...
Hi.
I don't think it will be before the weekend, but I will do it.
I'll also continue my investigation over the weekend.
Greetings,
vsrookie
This is interesting... Going to see if I can extract something useful from the Nabi SE, as I'm curious whether it can be Treble'd.
phhusson said:
Project Treble is a great help in getting access to vendor-specific HALs. I'll try to explain how, and how to exploit it. [...]
So my theory is: Treble is basically a wrapper around closed-source drivers?
If so, knowing the calls to the drivers gives you what you need to make any driver work through Treble...

Sensor management on Tizen

I'd like to develop an application for a smart watch that periodically turns sensors on the watch/band on and off (microphone, camera, etc.), records their activity, does minimal processing, and stores the result on the watch/band. The result is then uploaded to the mobile phone when a Bluetooth connection is established.
To the best of my understanding, this can only be done with Tizen/Wear OS/Fitbit OS, not with other watch operating systems such as Huami's (Amazfit/Xiaomi Mi/etc.). This also means that only a (big) watch, not a (small) band, is suitable for the above.
Is this correct? And if not, how can this be done otherwise?
How much of a hassle is this in Tizen for a seasoned professional programmer who is a complete noob at Tizen/smartwatches/mobile devices?
Can we develop a headless app that runs constantly?
Are there any tutorials on the subject?
