NPU and Semantic Image Segmentation Technology in the Honor 10 - Honor 10 Questions & Answers

When people talk about the Kirin 970, fans always bring up its AI capabilities, powered by a dedicated built-in NPU. But what does the NPU actually do for the camera? I've got two sample photos to explain.
These two pictures, taken from the Honor Forum China, illustrate the Semantic Image Segmentation Technology of the Honor 10. The top picture is the original, and the bottom one was processed by the AI. Comparing the two, the trees and grass are much richer in color in the latter, while the lady and the horse are sharper and clearer. The reason is that SIST can identify multiple objects within a single image. In other words, it pinpoints the outlines of the trees, the grass, the lady, and the horse separately, then processes each of those regions on its own and automatically adjusts the camera settings to help you shoot like a professional photographer. So in this case, the NPU did its job and made the photo more attractive and professional.
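For anyone curious what per-pixel labeling actually looks like, here is a minimal sketch using a pretrained DeepLabV3 model from torchvision. To be clear, this is not Honor's NPU pipeline, just a generic illustration of how semantic segmentation assigns a class to every pixel so each region (person, horse, and so on) could then be tuned separately; the photo file name is a placeholder.

```python
# Minimal semantic-segmentation sketch (illustration only, not Honor's pipeline).
import torch
from torchvision import models, transforms
from PIL import Image

# "photo.jpg" is a placeholder input file.
image = Image.open("photo.jpg").convert("RGB")

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
batch = preprocess(image).unsqueeze(0)

model = models.segmentation.deeplabv3_resnet50(weights="DEFAULT").eval()
with torch.no_grad():
    output = model(batch)["out"][0]   # shape: (num_classes, H, W)

# Per-pixel class index; in the VOC label set these weights use,
# class 15 is "person" and class 13 is "horse".
mask = output.argmax(0)
print(mask.unique())                  # class ids present in the photo
```

Once every pixel has a label like this, each labeled region can get its own color, sharpness, and exposure treatment, which is the behaviour described above.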

Related

Message to HTC or Sprint

Quick thought for anyone working on the EVO 3D project: bundling the hard case that protects the camera lens and doubles as a kickstand would eliminate both issues (as even the EVO 4G testers noted). It could also be an excellent tool when marketing the device.
I picture a commercial with a naked 3D showing off some power in portrait mode, then switching to landscape to take a photo, then some CGI of the case coming in, snapping onto the phone, the kickstand deploying, and then video/picture viewing.
With the glut of aftermarket shells that will be on retailers' shelves and online at discount prices shortly after release, it would be most profitable to market your own shell along with the device. Don't include the belt clip if you would like to make some residuals off an accessory.
This would certainly help project the "hot new device" appeal. There were many second-generation adopters who didn't own a 6700 but picked up a Mogul partly because of how impressive the whole package was (not a silly egg crate, btw).
Thanks

[Q] How to increase frequency of input events?

Hello!
I've started toying around with Android app development and decided to code a drawing app that can use a stylus (I have a Galaxy Note 3). The first thing I did was implement a naive drawing mechanism: I build lines from the motion events I get, including the event history (the getHistory*() functions), expand them into rectangles, and render lines of varying width according to pressure with the help of a fragment shader and a framebuffer. I've noticed that when using other drawing software (Sketchbook for Galaxy, S Note), I get smoother curves. By that I don't mean the anti-aliasing, but a higher number of "key points". The first thing that came to mind was that those programs interpolate the events to generate smoother lines - the points appear to be about 2x as dense. But then again, it's possible that they somehow get more input events because their processing is faster.
I don't have much experience with Android, so I thought I'd ask around: is it possible to increase the number of input events I get, and if so, how? Or does the number of events depend on processing time, and do they just land in the history if the process was busy?
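To illustrate the kind of interpolation I mean, here is a rough sketch of running a Catmull-Rom spline through the sampled points. The sample points below are made up, and it's written in Python just to show the math (which is platform-independent); in the app it would operate on the MotionEvent coordinates and pressures instead.

```python
# Catmull-Rom densification sketch over (x, y, pressure) samples.
# The samples are synthetic; a real app would feed in touch-event values.
import numpy as np

def catmull_rom(p0, p1, p2, p3, steps=8):
    """Return `steps` points on the Catmull-Rom segment between p1 and p2."""
    t = np.linspace(0.0, 1.0, steps, endpoint=False)[:, None]
    return 0.5 * ((2 * p1) +
                  (-p0 + p2) * t +
                  (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2 +
                  (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)

# (x, y, pressure) samples as they might arrive from touch events.
samples = np.array([
    [10.0, 10.0, 0.3],
    [14.0, 18.0, 0.5],
    [22.0, 24.0, 0.7],
    [33.0, 26.0, 0.6],
    [45.0, 22.0, 0.4],
])

dense = []
for i in range(1, len(samples) - 2):
    dense.append(catmull_rom(samples[i - 1], samples[i],
                             samples[i + 1], samples[i + 2]))
dense = np.concatenate(dense)   # roughly 8x as many points for the renderer
print(dense.shape)
```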
Here's what lines in Sketchbook and my code look like:

Request: photos of small objects.

Hello guys, as the title says, I'm asking for photos of small objects or fine print taken from very close up, ideally of electronic components with labels on them, like this example:
Why? Because I'm thinking of buying this device and I'm looking for one (if it even exists in this price range) that can take pictures of said objects. The reason I won't buy a camera or a more expensive phone is that I can't take a camera with me, and I don't want to risk smashing an expensive phone while it's in my pocket (manual labor).
Right now I have a P8 Lite 2017. It can't focus on small objects, but it has a good battery, it's not expensive, and it's a fairly good phone to have on me while I work. Sadly, I have to send it in for RMA because the screen is starting to go yellow in one corner, so I need to have it replaced while the warranty lasts.
Recommendations for other phones are also welcome.
Thanks
Hey man, just buy a macro lens.
I don't know how to scale the photo here, so it's hidden...
I feel ashamed I hadn't thought of that... thanks for the advice, really.

Capture 2 cameras (front and back) at the same time

Hi developers,
I need to capture the front camera and the back camera at the same time, using two fisheye lenses, to get 360-degree shots like the Insta360 One X.
How can you help me?
Try HMD Camera / Nokia Camera 8.0260.50; not sure about 360°.
Nobody can help?
The Nokia camera APK (for all Androids). See the photos taken with a Pocophone:
It works like this with both videos and photos.
Short answer: most likely no.
From what I understand, you want a joint picture from the two cameras. That doesn't exist, AFAIK. The only thing you can get with *some* camera apps is a small window over the back camera's image that houses the image from the front camera. I think Sony had that feature in one extension... but it's pretty much impossible to port. There may be other apps that support this, but certainly not full 360°. And how would you use two *different* cameras for this? You'd have dramatic differences in quality, locked white balance and exposure (if you want a viable image at all), etc.

Transparent Galaxy Note 3 (Skin Template)

Well, as you all know, the Galaxy Note 3 is quite old and barely gets supported by most accessory manufacturers nowadays, so I decided to take some pictures and scans of the internals of my Galaxy Note 3 and stitch them all together, so I can use the result as a transparent skin mod on my clear case. I wanted to share it with you guys, so enjoy printing, lol.
(You might be wondering about the "take some pictures" part. Yeah, I did: I took multiple images and stitched them all together. The normal methods don't produce the desired clarity. Scanning the whole opened phone produced blurred, out-of-focus components; only the parts touching the glass surface of the scanner, such as the battery, the pen, and the camera, were in focus. Taking a single picture of the whole phone, on the other hand, didn't have the blurring issue found with scanning, but the resolution is limited by my other phone's camera (which is 12 MP). So, to work around this, I took multiple pictures of individual parts of the phone instead. That's why you can see some unbalanced lighting here and there throughout the image.)
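For anyone who wants to try the same thing, a rough sketch of the stitching step with OpenCV could look something like this. It's a general approach rather than the exact tool and settings used for this template, and the file names are placeholders.

```python
# Rough sketch: stitch overlapping close-up shots of a flat subject with OpenCV.
import cv2

files = ["part_01.jpg", "part_02.jpg", "part_03.jpg"]   # overlapping shots (placeholders)
images = [cv2.imread(f) for f in files]

# SCANS mode assumes a flat, roughly planar subject (like a phone laid open),
# which tends to work better than the default panorama mode for this use case.
stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
status, result = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("note3_internals_stitched.jpg", result)
else:
    print("Stitching failed, status code:", status)
```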
BTW USE THE LINK BELOW TO DOWNLOAD THE ORIGINAL IMAGE
https://www.androidfilehost.com/?fid=12420606652095401892
Preview:
