sensor (pressure) polling rate - C++ or Other Android Development Languages

hi (this is my first post),
i'm trying to get the maximum out of the pressure sensor in the galaxy s4 while coding up a variometer app (yes, i know there are already several).
the chip in the device is a bosch bmp180 with quite impressive specs.
i've tried to increase the polling rate so that i can filter the signal without introducing too much delay (which is what other apps suffer from, imho - hence the motivation).
i found that no matter what poll interval i specify (either through SensorManager.registerListener in java or ASensorEventQueue_setEventRate in native), the refresh rate is always the same (as per logcat):
Sensors(814): Pressure old sensor_state 129, new sensor_state : 137 en : 1
and from diffing event timestamps in the app (which actually shows 170-180 ms between events).
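for reference, here's roughly what the java side of my test looks like (a minimal sketch; the class name and log tag are just placeholders). registerListener also accepts a raw period in microseconds in place of the SENSOR_DELAY_* constants, and i log the delta between event timestamps:
Code:
// minimal sketch: request a 50 ms (20 Hz) period and log the actual interval
// between events. SensorEvent.timestamp is in nanoseconds.
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.util.Log;

public class PressureTest implements SensorEventListener {
    private long lastNs = 0;

    public void start(SensorManager sm) {
        Sensor pressure = sm.getDefaultSensor(Sensor.TYPE_PRESSURE);
        // 50000 us requested; on this device events still arrive every ~180 ms
        sm.registerListener(this, pressure, 50000);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (lastNs != 0) {
            Log.d("PressureTest", "dt = " + (event.timestamp - lastNs) / 1e6
                    + " ms, p = " + event.values[0] + " hPa");
        }
        lastNs = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {}
}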
so i went digging further, trying to untangle the mysteries of the pressure sensor. i checked the chip specs and the linux drivers, and still believe faster polling should be possible.
i'm on cm11-jflte (both release and built from source), rooted
the cm11 kernel is configured to use the sensorhub (ssp) drivers for the pressure sensor (not bmp085.c or bmp18x.c). i tried re-hardcoding the default polling rate for the pressure sensor in the ssp driver files (it is otherwise 200ms for all sensors across the board); that changes the poll_delay file contents in sysfs, but doesn't change the behavior in the app.
i currently don't have the time to untangle the workings of ssp.
QUESTION: where in the (java or) native android layer or the ssp kernel drivers are the default(?) polling rates for specific sensors specified?
ALT: other ideas as to how to proceed?
(i will reconfigure the kernel to use one of the other bmp drivers next, but fear how that will impact the rest of the sensorhub...)
(i realize this is more of a hacking/tweaking topic, but as i just signed up to ask this specific question, i'm not yet allowed to post in the rom/tweaks development section - if a moderator could maybe move the thread...)
thanks for all ideas in advance!

other drivers don't help.
the hardware is driven by the ssp kernel drivers; i could not locate the pressure sensor as a stand-alone device anywhere on the i2c bus.
i'm guessing the driver is accessed through the sensors.msm8960.so and sensorhub.msm8960.so hal implementations (which are proprietary blobs).
playing around in the shell, i found that the polling rate (in sysfs) is in fact set to 66.7ms (15Hz) when i register my listener at SENSOR_DELAY_FASTEST, but the events are still only reported every 180ms.
i was able to intercept pressure events from the shell using getevent /dev/input/event6, and the reported timestamps also show 180ms between events.
ideas, anyone?

[REQ] Gps refresh rate

Hi everyone, is there a way to boost the refresh rate of the internal GPS?
I think that interpolating the coordinates from 1 Hz to 5 Hz would be fairly simple. Does anyone know anything about this kind of hack?
Thanks, Giacomo
Interpolation requires knowing both the start and the end point, so you must wait for the next GPS update; your current position will be smoother but delayed, which is bad.
You could, instead, predict the next position, but this can produce wrong results under certain conditions.
When a developer uses an API, he expects it to follow the specification; feeding all of the apps possibly-incorrect data is a bad design choice. It's safer to let the app itself do the smoothing while keeping the correct positions for internal computations. Most navigation software does this, and sometimes the prediction fails (that awkward moment when your car turns but keeps going in a straight line on the GPS screen, then jumps to the right position).
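If you do want prediction on the app side, a minimal sketch could look like the following (class and field names are just illustrative; it assumes ACCESS_FINE_LOCATION is granted and something keeps the object alive). It dead-reckons from the last fix using its reported speed and bearing, which is exactly the kind of prediction that can fail in the way described above:
Code:
// Sketch: client-side ~5 Hz extrapolation between 1 Hz GPS fixes (dead reckoning).
// Assumes this object is created on the main thread (location callbacks need a Looper).
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;
import android.os.Handler;
import android.os.SystemClock;

public class PredictedPosition implements LocationListener {
    private final Handler handler = new Handler();
    private Location lastFix;

    public void start(LocationManager lm) {
        lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 0, 0, this);
        handler.post(tick);
    }

    private final Runnable tick = new Runnable() {
        @Override
        public void run() {
            if (lastFix != null && lastFix.hasSpeed() && lastFix.hasBearing()) {
                // Seconds since the last real fix (elapsedRealtimeNanos avoids clock jumps).
                double dt = (SystemClock.elapsedRealtimeNanos()
                        - lastFix.getElapsedRealtimeNanos()) / 1e9;
                double dist = lastFix.getSpeed() * dt;            // metres since the fix
                double brg = Math.toRadians(lastFix.getBearing());
                double dLat = (dist * Math.cos(brg)) / 111320.0;  // metres -> degrees latitude
                double dLon = (dist * Math.sin(brg))
                        / (111320.0 * Math.cos(Math.toRadians(lastFix.getLatitude())));
                double predictedLat = lastFix.getLatitude() + dLat;
                double predictedLon = lastFix.getLongitude() + dLon;
                // Feed predictedLat/predictedLon to the UI only; keep the real fix
                // (lastFix) for any internal computations.
            }
            handler.postDelayed(this, 200);                        // ~5 Hz
        }
    };

    @Override
    public void onLocationChanged(Location location) { lastFix = location; }
    @Override
    public void onStatusChanged(String provider, int status, Bundle extras) {}
    @Override
    public void onProviderEnabled(String provider) {}
    @Override
    public void onProviderDisabled(String provider) {}
}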
Questions or Problems Should Not Be Posted in the Development Forum
Please Post in the Correct Forums and Read THIS
Moving to Q&A

[Q] Driver or app / use GPS to spoof magnetometer

There are a lot of devices out there without a built-in magnetometer, such as the Parrot Asteroid Smart (it runs Gingerbread and is a double-DIN car stereo head unit). Without a magnetometer, lots of apps that expect one default to true north and otherwise don't function well. I've searched, and there isn't a current app/driver that will respond to magnetometer direction requests with the current (or previous) GPS heading.
So here's my plan:
Create a driver that sits between the magnetometer and user-space applications. When a request is made for the current compass direction, the driver redirects to the GPS and returns the current (or, if not moving, the previous) heading. Voila! Apps that want to talk to the compass are now happy.
This could be used for other things as well, such as bypassing the internal magnetometer (on devices that have one) - useful for playing Ingress to override the cruddy compass experience (on my HTC One, which I prefer to call my HT Cone).
So, why the post? I think I've searched pretty well, looking for "no compass", "no magnetometer" and "simulate magnetometer gps", but the only app I found was for some ancient version of Windows Mobile, and that app put predictive data between the applications and the GPS data, smoothing the GPS output. (The concept is the same, however: intercept geo requests and respond with modified (or, in my case, new) data.)
I'm getting to work creating a driver; so far it's not very easy going, but it is almost going (thank you GitHub and Stack Overflow). Still, I can't be the first person to try to use the GPS heading to simulate magnetometer direction. There are base classes that provide direction, and 99% of the direction apps out there simply request a compass heading; they're certainly not crunching raw magnetometer data. Conceptually it should be easy to hook that call (compassGetDirection) and respond with gpsGetHeading data.
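To make the idea concrete, here is what a purely app-level version could look like (a sketch only: it doesn't hook anything system-wide, the class and method names are mine, and GPS bearing is relative to true north and only valid while moving):
Code:
// Sketch: app-level fallback that reports the GPS bearing as a "heading" when the
// device has no magnetometer. This does not replace the system compass; it is only
// what a single app could do internally.
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;

public class HeadingSource implements LocationListener {
    private final boolean hasMagnetometer;
    private float lastGpsBearing = 0f;   // degrees, 0 = true north (not magnetic!)
    private boolean haveBearing = false;

    // Create on the main thread so location callbacks have a Looper to run on.
    public HeadingSource(Context ctx) {
        SensorManager sm = (SensorManager) ctx.getSystemService(Context.SENSOR_SERVICE);
        hasMagnetometer = sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD) != null;
        if (!hasMagnetometer) {
            LocationManager lm = (LocationManager) ctx.getSystemService(Context.LOCATION_SERVICE);
            lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 0, this);
        }
    }

    /** Heading in degrees, or null if unknown (no fix yet, or the real compass should be used). */
    public Float getHeading() {
        if (hasMagnetometer) return null;        // use the normal sensor path instead
        return haveBearing ? lastGpsBearing : null;
    }

    @Override
    public void onLocationChanged(Location loc) {
        // GPS bearing is only meaningful while moving, so gate on a minimum speed.
        if (loc.hasBearing() && loc.hasSpeed() && loc.getSpeed() > 1.0f) {
            lastGpsBearing = loc.getBearing();
            haveBearing = true;
        }
    }
    @Override
    public void onStatusChanged(String provider, int status, Bundle extras) {}
    @Override
    public void onProviderEnabled(String provider) {}
    @Override
    public void onProviderDisabled(String provider) {}
}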
I still haven't asked a question. So, in the interest of being annoying (yes, <==noob), please assume all the above sentences are in the form of a question, like "what do you think of ..." and "have any of you ..." or "is it possible to ..."

Which system parameters (kernel/sys/anything) rule automatic suspending of cores?

Hello.
I think my problem is a general Android question, although in my particular case it concerns a Samsung Galaxy Note 3 (SM-N9005).
I would like to learn which system parameters rule the automatic suspending of single cores at runtime. This question is NOT deep-sleep related; I have no problems with that. My problem is that cores do not go offline even when nothing is happening in the system (I wanted to post a link to a screenshot, but I'm not allowed to yet). Imagine an idle system: my phone has 4 cores, so 3 of them should go offline. And that's what happens when the system is in a "good" state. But after some time something changes (I don't know what yet), and even if completely nothing is running, all the cores stay online and operate at their lowest speed (which is 300 MHz in my case). I know that this improves responsiveness, but at the same time it increases power consumption, and that's not how it should work.
So, I'm looking for what triggers such behavior - not the initial cause (it must be one of the applications), but the "intermediate" result. I believe there are kernel parameters, probably available via the /sys/ subsystem, which can be checked and adjusted here. But which of them? What should I look for?
My phone is rooted, I'm quite fluent in bash programming and I'm not afraid of it. I also know Linux itself (I'm not a hacker, but I have a few machines that I administer). But I don't know Android-specific issues. Can anyone help me here?
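As a starting point: each core's hotplug state is exposed at /sys/devices/system/cpu/cpuN/online, so you can log transitions and correlate them with whatever is running at the time. Below is a minimal sketch of such a watcher in plain Java (the same files can just as easily be watched from a root shell loop). Note that which component actually performs the hotplugging varies by device; on many Snapdragon devices of that generation it is a userspace daemon (mpdecision) rather than a sysfs-tunable kernel governor, so treat that as something to verify for your kernel, not a given.
Code:
// Sketch: log per-core online/offline transitions by polling sysfs once a second.
// /sys/devices/system/cpu/cpuN/online is world-readable on most kernels; cpu0
// usually has no 'online' node at all because the boot CPU cannot be offlined.
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

public class CoreWatcher {
    public static void main(String[] args) throws InterruptedException {
        Map<Integer, String> last = new HashMap<Integer, String>();
        while (true) {
            for (int cpu = 0; new File("/sys/devices/system/cpu/cpu" + cpu).isDirectory(); cpu++) {
                String state = readLine("/sys/devices/system/cpu/cpu" + cpu + "/online");
                if (state != null && !state.equals(last.get(cpu))) {
                    System.out.println(System.currentTimeMillis()
                            + " cpu" + cpu + " online=" + state);
                    last.put(cpu, state);
                }
            }
            Thread.sleep(1000);
        }
    }

    private static String readLine(String path) {
        BufferedReader r = null;
        try {
            r = new BufferedReader(new FileReader(path));
            return r.readLine();
        } catch (IOException e) {
            return null;                  // node missing or not readable
        } finally {
            try { if (r != null) r.close(); } catch (IOException ignored) {}
        }
    }
}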
Thanks in advance

MCU Source / Feature Request

Has anyone had any luck finding the source code for the MTCB MCU? I'd like to mod the shutdown timer so that it leaves the device fully operational for the timer duration. Or, if anyone has info on the instruction set used, I could perhaps make the mod using assembly or bit-banging.
This may help. It uses an 8051 instruction set
KLD2-v277 "source"
I have been studying and commenting the MCU code for a few months now -- specifically the KLD2-2.77 version. I used darksimpson's mcu decryption tool on the mcu.img file to get the 8051 binary file, then used the D52 disassembler to create "source" code from that.
Studying this code is an ongoing, very slow process but still I have made significant progress.
A few observations (any of which may contain errors!):
1) The original source appears to be written in a higher level language, probably C (not sure which compiler).
2) It appears to communicate with the Android CPU using Serial Port 2 and also an SPI port. There are two I2C buses that the MCU uses to control the various peripheral chips such as the radio, sound processor, and video matrix.
3) Serial data is in the form of packets consisting of a 3 byte preamble followed by one or more bytes of data and ending with a checksum. The specific form of the packet and checksum can vary and is determined by the preamble.
4) Serial packets mostly go from MCU to Android. The MCU can receive packets too, but that capability doesn't seem to be used much, if at all.
5) The primary method of control from Android to MCU seems to be through the SPI port. This consists of 16 bit command words that are handled by a very large CASE statement. Each command word can be followed by additional data (depends on the command) and may also cause the MCU to send data back, either through SPI or more commonly by serial port packets.
6) It appears to me that the designers have nearly maxed out the code space. In other words there is very little room left for additional code, and also evidence that they did things to try to squeeze more code into the available space.
7) Finally -- and not to bash the Chinese designers in any way (tremendous effort contained in this code) -- there are some obvious bugs and also significant room for improvement.
If anyone wants more specific details on what I've learned please PM me. If anyone wants to dive in and study as well, please share what you learn!
One thing that would help me tremendously would be more information on the actual hardware environment. My HU is an 8133 with a 480x800 display (specifically a Pumpkin C0235), but it is already installed in my car and I really like it there instead of on a workbench. So if anyone is in a position to provide hardware details (such as clear photos and schematic diagrams -- either partial or complete, handmade is ok!), sharing them here will help a lot! The code does a lot with MCU I/O pins, and without knowing what is connected to those pins it is difficult to really understand their true purpose.
Zipped text file is attached
dhmsjs said:
...
Great work! Thank you! I'm not well versed in MCU programming, but I really believe it is of great importance to hack that code and mod it in order to squeeze more from these units. I applaud you and hope others will join your effort.
An open-source implementation could be interesting down the road.
I wonder if we could steal any of the tricks from the old days of DSS cards and compact the code by using lookup tables for some references, to make more room for improvement code.
webdude12 said:
...
There is a lot of "wasted" space (duplicate initialization data, for instance), so there's plenty of room available if the software is written efficiently. However, modifying what is there, other than small tweaks, is probably not likely to succeed. It looks to me like there are a lot of "band-aid" fixes in there, and (in my experience anyway) modifying that kind of code often ends up creating more subtle, unforeseen problems than it fixes.
Yes, an open-source rewrite would be the best long-term solution, but that will require a thorough understanding of both the MCU and the Android side of the interface. On the other hand, we potentially have a large group of international talent to apply to the task, and no hard deadlines either. It is certainly possible to do.
The MCU processor itself is well known. The peripheral devices are well known (or knowable). So reimplementing those interfaces can be largely independent of the Android-MCU interface. But knowing what the Android CPU expects, and accommodating that for all variants is currently the big unknown for me.
dhmsjs said:
...
Well, that is great to hear. What I think needs to be done first is a completely commented disassembly. This has been extremely useful in other "hacking" attempts, as it allows you to understand, step by step, what the original coders were doing. It also allows debuggers / emulators to be written.
From there it can be determined whether it makes sense to rewrite sections, re-route code, or rewrite from scratch while providing the same functions.
webdude12 said:
...
While it is by no means complete, I've worked through enough of this MCU code that I think I have a pretty good high-level understanding of how and what the MCU does. As I said above, what I'm missing now is much of the hardware context (for example which MCU pins connect to what, some of the peripheral chip #s, etc), and also the Android software context -- specifically the code that sends commands through SPI to the MCU. I have looked briefly at some of the Android code (for example MTCManager.apk) but didn't find anything useful there. It seems like the SPI interface is lower level -- perhaps as in a device driver??
This is probably a dumb question because you guys are still so far away from modifying MCU code, yet I'm going to ask it.
Could some of the hardware hacks (using a different sound processor for the radio, mic mods) be accomplished through software, if you got a good handle on things?
DRidilla said:
...
Sound processor: yes, since that is just a matter of sending different commands to the sound processor. Mic mod: no, since that is a hardware-only mod (as far as I can see, anyway).
Isn't the problem with the mic, though, that even when you use an external mic, the internal one does not shut off? Could you send a message stating that an external mic exists and close off the internal one? Hypothetically, of course.
It would be amazing if, down the road, those sorts of mods could be done on the software side.
Also, add "Control radio before Android boots" to my dream list.
DRidilla said:
...
My understanding of the mic hardware interconnection is not good enough to say one way or the other. It probably varies from model to model as well (I don't seem to have a big problem with my external mic).
Controlling the radio before Android boots might be tough, since almost all of the radio user interface comes through Android anyway. It might be possible to restore the previous radio config before Android is up, but that might also not be such a great thing for many users. I'm not sure I'd want my radio to come up blasting away if I don't also have a way to silence it.
Then again, with an open-source solution, all you would really need is the will to make it so. I guess we all have our own dream lists, no?
DRidilla said:
Isn't the problem with the mic, though, that even when you use an external mic, the internal one does not shut off? Could you send a message stating that an external mic exists and close off the internal one?
Both microphones are hardwired together. It is not possible to "shut off" one with software. There's also a modification to improve the radio's frequency response by replacing some capacitors. Again, not possible with software.
dhmsjs said:
...
Wow, nice work!! I typically have a few KGL units on the bench. Attached are a few images to get you started. These are of the 1024x600px resolution units.
Next time I'm at the office, I'll check out which I/O expansion chip(s) are being used and see if I can identify some of the lines.
Aaaron16 said:
...
Thanks! Really appreciate the photos. Difficult to read the chip #s though.
I know the radio device is similar to a TEF6624 - maybe a TEF6686?
Pretty sure the sound processor is a BD37531 or BD37534.
Not sure what the video matrix switch is -- FMS6502 maybe?
These all connect to the MCU via I2C. Interested in identifying the other devices that connect via I2C too. I can see the I2C addresses they use to communicate with the chips, but I can't generally look up a chip # by its I2C address. The goal is to find the data sheets for each device so that I can understand exactly what the I2C comm is doing. I have data sheets for the devices listed above.
Also, as an update: I stated above that the MCU is controlled by Android through an SPI channel. I've been studying this more closely, and it is clear that it is not actually SPI. It looks more like a custom 3-wire interface, using pins 1, 2 and 3 of the MCU (the IAP15F2K61S2 device).
Pin 1 (P0.5) is data (or ACK) to Android from MCU, driven by MCU
Pin 2 (P0.6) is the clock, which can be driven by either side (open collector with pullup?)
Pin 3 (P0.7) is data (or ACK) to MCU from Android, driven by the Android CPU
It would be helpful to know which pins these connect to on the Android CPU.
Each bit sent must be acknowledged by the receiver before the next bit is clocked out. Debouncing and timeouts are applied to the signals, and the transfer is aborted if either fails. The maximum bit rate is around 1 MHz, I think, and it's probably slower in actual use. If anyone recognizes this as a particular comm standard, let me know! Otherwise it must be custom for these HUs.
Component Listing (KGL Mainboard)
I noticed the KGL devices use different radio modules even across the same model. I have some units with TEF6624 modules and others with TDA7786 modules. The following are the IC part numbers for the newer 1024x600px KGL main board (date code 2014-07-20):
1. AU6258J61-JES-GR (USB 2.0 Controller)
2. 15L2K61S2 (MCU)
3. FMS6502MTC (Video Sw Matrix)
4. BD37033FV (Sound Processor)
5. GM8283C (? - 10+ traces going to 50-something pin IDC display connector)
6. T132BT (TFT Video Controller)
7. WM8751L (Stereo DAC) - markings rubbed off. See attached photo.
I couldn't find an I/O expander chip on the main board, so I'm guessing the MCU drives most of the I/O lines directly (aside from those handled by their respective audio/video controllers).
Possibly related, I found this document which seems to include many of the same components:
http://p.globalsources.com/IMAGES/PDT/SPEC/216/K1127159216.pdf
MCU Memory Map/Resource Use
So attached below is a document containing a partial memory map, resources used, and I/O pin assignments as I currently understand it from studying the MCU code. The document changes every day as I learn more.
Again this is a KLD2 unit and the V2.77 version of MCU code I'm studying. For Aaaron16 or anyone else who has the ability to explore the hardware of a KLD2 head unit, what will help me a lot in this effort is understanding what the I/O pins connect to. They are listed at the top of the document.
If you explore your hardware, post what you find in this thread and I'll add it to the doc. TIA!
Could Malaysk make a modified MCU with a redesigned sound processor and sleep mod for MTCB-MD-V2.67? :angel: I can't post a download link:
drive.google.com/file/d/0B9-2UI8L0wScel92bUFlaExkWnc/view?usp=sharing
dhmsjs said:
...
How is this progressing @dhmsjs ?

Help changing the maximum sample rate for an accelerometer in the kernel.

Just as a warning, I have very little idea about what I am doing, so keep that in mind
Basically, I am trying to increase the sample rate of an accelerometer, and figured the best way to do this was to create a custom kernel.
To give specifics: I am doing this on a Moto 360 v1; however, I'm not sure how important that is, as I expect the general solution will be fairly transferable.
The GitHub repo for my kernel source is here: Mrcl1450/android_kernel_motorola_minnow
Any help to go about doing this is greatly appreciated.
Hey,
I've spent the last three weeks writing sensor drivers and a Sensor HAL for an OEM, so I'll try to provide some insight into how the sensor framework works and what limits the sensor frequency.
First of all, there are three components to the sensor system:
1) The sensor itself
2) The Linux kernel
3) The Sensor HAL
Only points 1 and 2 affect the limits of the sensor frequency; Android is informed about the limit via the Sensor HAL. If you run:
Code:
adb shell dumpsys sensorservice
you'll see what the current limits of the sensors are. You'll also get information about what types of sensors are on the watch.
Sensor - MPU6051
Now, since I don't have the device, I took a look at iFixit, and according to the teardown ( https://www.ifixit.com/Teardown/Motorola+Moto+360+Teardown/28891 ) the Moto 360 v1 uses an MPU6051. You're in luck: the MPU6500 I've been writing drivers for all week is from the same family, so I know a thing or two about this chip series.
Now we know the chip (MPU6051). The 6051 is PROBABLY just a 6050, but with a partly customized Digital Motion Processor binary for step detection (pedometer). So we check the datasheet for the 6050 (since the 6051's isn't public): https://www.invensense.com/wp-content/uploads/2015/02/MPU-6000-Datasheet1.pdf
Since it's an MPU chip, you also need the register map (InvenSense's "MPU-6000 and MPU-6050 Register Map and Descriptions" document).
According to section 6.2 (Accelerometer Specifications), the output data rate is 1000 Hz, and the low-pass filter bandwidth is programmable up to 260 Hz. So that is your limit: no matter what, you won't get more than 1000 Hz. Now we know limit #1 (1000 Hz).
Kernel
Next up is the kernel. The kernel asks the MPU for data at programmable intervals, e.g. every 10 ms. This is limited AND controlled by the CONFIG_HZ parameter. My guess is that your kernel runs at CONFIG_HZ=128 (since it's an OMAP device). If you poll one piece of data every scheduled kernel tick, that's 128 Hz of data. So that is your kernel limit... except...
The MPU series has an on-board FIFO, so it can store results for you. This 1024-byte FIFO can store up to 170 accelerometer samples (section 4.18 in the register map states that one accelerometer measurement uses 6 registers => 6 bytes), and by reading it out in batches you can unlock the full 1000 Hz of data.
But Blystad....what do I do with all this information?
Right....
1) Find the current limits (dumpsys sensorservice).
2) Ensure that you can build a version of the 6050 HAL for your device.
3) Start some accelerometer test and find the current frequency limit of your accelerometer (a minimal rate check is sketched at the end of this post).
4) Try decreasing minDelay in the Sensor HAL (minDelay is the minimum interval between events in microseconds, so a smaller value means a higher maximum frequency reported to Android).
Also, be aware that I only have experience with normal Android OS, not Android Wear, so I don't know if Google introduced any limits on Android Wear's sensors.
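For step 3, a minimal rate check could look like this (a sketch, assuming a plain Activity supplies the SensorManager; the class and tag names are illustrative). It registers at SENSOR_DELAY_FASTEST and averages the gap between event timestamps, which are in nanoseconds:
Code:
// Sketch for step 3: measure the actual accelerometer delivery rate.
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.util.Log;

public class RateCheck implements SensorEventListener {
    private long lastTimestampNs = 0;
    private long sumDeltaNs = 0;
    private int samples = 0;

    public void start(SensorManager sm) {
        Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        // SENSOR_DELAY_FASTEST asks for the minimum delay the HAL reports (minDelay).
        sm.registerListener(this, accel, SensorManager.SENSOR_DELAY_FASTEST);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (lastTimestampNs != 0) {
            sumDeltaNs += event.timestamp - lastTimestampNs;  // timestamps are nanoseconds
            samples++;
            if (samples % 500 == 0) {
                double avgMs = (sumDeltaNs / (double) samples) / 1e6;
                Log.d("RateCheck", String.format("avg interval %.2f ms (~%.0f Hz)",
                        avgMs, 1000.0 / avgMs));
            }
        }
        lastTimestampNs = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {}
}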
blystad said:
...
Awesome, thanks man, I just skimmed through this and it seems extremely helpful!
I'll try working on this ASAP!
kevinnout said:
...
Hi kevinnout,
I am currently working on a Moto 360 2nd gen kernel - and getting stuck :'( . I'm curious to know whether you've successfully created a kernel for overclocking the accelerometer. I'd be very happy if you could tell me what to do, if you have done that.
