HUAWEI P10 and P10 Plus: Three Tricks to Compose Captivating Photos - Huawei P10 Guides, News, & Discussion

A good photo needs an interesting subject, but the photo's composition is equally important. This principle can easily be observed when two people take a photo of the same subject at the same location: their photos will often look completely different. This tutorial walks you through some of the key composition rules used by professional photographers to take striking photos. Armed with these tricks, you'll soon be taking jaw-dropping shots with your smartphone.
The Basics
The Rule of Thirds
If you do a web search for the keywords "photography" and "composition", you'll churn up page after page of results featuring titles such as 9 Top Photography Composition Rules You Need To Know, 20 Composition Techniques That Will Improve Your Photos, or 18 Composition Rules For Photos That Shine. However, many of these articles are riddled with unfamiliar concepts and technical jargon that can prove overwhelming for beginners. This article takes a different approach, focusing on some more fundamental rules of composition that are easily mastered, such as the rule of thirds, the golden ratio, and the Fibonacci spiral.
Novice photographers tend to position their subjects in the center of the frame, which makes them stand out, but results in a less visually-appealing photo. If this applies to your photos, you may want to try applying the rule of thirds. To use the rule of thirds, take your frame and overlay a nine-block grid (dividing the vertical and horizontal space into three parts). Important elements in the photo should be aligned with one (or more) of the four intersection points in the grid.
Studies show that these intersection points are where the human eye tends to go first, so placing important elements on these points can help emphasize the subject and produce photos that are more visually appealing.
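If you like to think in pixels, the idea is easy to express in code. Here is a minimal sketch (in Kotlin, using an illustrative 12 MP, 4:3 frame) that computes where the four rule-of-thirds intersection points fall for a given resolution; the function and values are purely for demonstration.

```kotlin
// Minimal sketch: compute the four rule-of-thirds intersection points
// for a frame of the given size (values here are illustrative).
fun ruleOfThirdsPoints(width: Int, height: Int): List<Pair<Int, Int>> {
    val xs = listOf(width / 3, 2 * width / 3)    // vertical grid lines
    val ys = listOf(height / 3, 2 * height / 3)  // horizontal grid lines
    return xs.flatMap { x -> ys.map { y -> x to y } }
}

fun main() {
    // Example: a 4000 x 3000 photo (typical 12 MP, 4:3 sensor)
    ruleOfThirdsPoints(4000, 3000).forEach { (x, y) -> println("($x, $y)") }
}
```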
The Golden Ratio
The golden ratio is a similar technique that can be used to draw the viewer's glance. This technique uses a famous mathematical ratio to draw a nine-block grid with the unique, eye-pleasing proportions of 1:0.618:1. This type of grid is known as a phi grid. As with the rule of thirds, points of interest in the shot are lined up with intersection points in the grid to produce visually-harmonious, captivating images.
By now, you are probably wondering which of these two grids you should be using. In fact, the difference between the two is quite subtle. Research shows that our eyes are drawn intuitively towards the four intersecting points in the rule of thirds grid, while the points of intersection in a phi grid can be harnessed to create more harmonious, aesthetically-pleasing photos. However, every picture is different, so there is no hard-and-fast rule about which guideline you should use.
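To see just how subtle the difference is, the following sketch (same illustrative frame size as above) computes where the phi grid lines fall when each axis is split in the proportions 1:0.618:1, so you can compare them with the thirds lines at 33.3% and 66.7% of the frame.

```kotlin
// Minimal sketch: positions of the phi grid lines for one axis of the frame.
// The axis is split in the proportions 1 : 0.618 : 1 (total 2.618).
fun phiGridLines(length: Int): Pair<Int, Int> {
    val total = 1.0 + 0.618 + 1.0
    val first = (length * (1.0 / total)).toInt()            // ~38.2% of the way across
    val second = (length * ((1.0 + 0.618) / total)).toInt() // ~61.8% of the way across
    return first to second
}

fun main() {
    val (x1, x2) = phiGridLines(4000)   // vertical lines for a 4000 px wide photo
    val (y1, y2) = phiGridLines(3000)   // horizontal lines for a 3000 px tall photo
    println("Vertical lines at x = $x1, $x2; horizontal lines at y = $y1, $y2")
    // Compare with the thirds lines at 1333/2667 and 1000/2000:
    // the phi lines sit slightly closer to the center of the frame.
}
```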
Advanced Composition
Fibonacci Spiral
The Fibonacci spiral is another camera overlay which is derived from the golden ratio. The subject is usually placed at the smallest part of the spiral, which guides the viewer around the image in a natural flow. The Fibonacci spiral is a useful tool for organizing and framing visual elements that expand outwards from the center of a photo.
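The spiral overlay is built from quarter-circle arcs whose radii follow the Fibonacci sequence. The short sketch below (illustrative scaling only, not a drawing routine) lists those radii for a frame and shows that the ratio between consecutive arcs approaches the golden ratio of roughly 1.618, which is what pulls the eye smoothly from the edge of the frame in toward the subject.

```kotlin
// Minimal sketch: radii of the quarter-circle arcs that make up a Fibonacci
// spiral overlay, scaled so the largest arc spans the short side of the frame.
fun fibonacciSpiralRadii(shortSide: Int, arcs: Int = 8): List<Double> {
    val fib = mutableListOf(1, 1)
    repeat(arcs - 2) { fib += fib[fib.size - 1] + fib[fib.size - 2] }
    val scale = shortSide.toDouble() / fib.last()
    return fib.map { it * scale }
}

fun main() {
    // For a 3000 px short side, print each arc radius and the ratio between
    // consecutive arcs, which converges toward the golden ratio (~1.618).
    val radii = fibonacciSpiralRadii(3000)
    radii.zipWithNext { a, b -> println("radius %.0f -> %.0f (ratio %.3f)".format(a, b, b / a)) }
}
```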
Configuring Assistive Grids
Most experienced photographers know exactly how they will compose a shot before they press the shutter. Many of their photos conform to the rule of thirds or the golden ratio even when they don't use a viewfinder overlay. Moreover, true masters of the art will deliberately break these rules of composition to achieve even more spectacular results. However, less experienced snappers may find that an assistive grid allows them to get more out of their camera. The HUAWEI P10 and P10 Plus feature built-in assistive grid support, allowing budding photographers to put these rules of composition to the test.
To enable an assistive grid on the HUAWEI P10/P10 Plus, open Camera, swipe left on the screen to open the camera settings, and then touch Assistive grid. Select the type of grid you want to use, return to the viewfinder screen, and then start snapping away!


Related

Four steps for taking portraits with blurred backgrounds

Black and white photos have a degree of detail and contrast that gives them a unique, moody intensity. However, a carefully composed, artistic photo is easily ruined by background objects, which can distract the viewer. Good photographers sometimes manage to use creative camera angles to keep some of this "background noise" out of shot, but such techniques only get you so far.
For example, I originally intended for the photo below to center on the removal men at work, but they were drowned out by other objects in the foreground and background.
I took the following photo at a low angle to try to give the teddy bear a "larger-than-life" look, but once again background objects stole the show and detracted from the desired effect.
When I place my photos side by side with some of the slick, glossy photos my friends share with me, I'm too ashamed to even contemplate posting them on social media services such as Instagram and Facebook.
However, more recently, I discovered a clever trick on the HUAWEI P10/P10 Plus that can be used to blur out background objects and make the subject more prominent. This technique produces absorbing, arty shots that are guaranteed to garner you more "likes" on social media. Moreover, no fancy camera angles are necessary; simply take your phone, find an interesting subject, and point and shoot.
When you take ordinary black and white photos, usually both the foreground and background are in focus, so there is no obvious subject or theme. However, by combining the black and white and wide aperture shooting modes on the HUAWEI P10/P10 Plus, you can blur out the background and place emphasis on a particular object or person.
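Conceptually, the effect boils down to combining a depth estimate with selective blurring. The sketch below is a simplified illustration of that idea (a grayscale box blur driven by a hypothetical depth map), not the P10's actual processing pipeline.

```kotlin
// Conceptual sketch (not the P10's real pipeline): simulate "monochrome +
// wide aperture" by keeping pixels near the focus depth sharp and box-blurring
// everything else. `image` is luminance in [0, 255]; `depth` is in metres.
fun monoWideAperture(
    image: Array<IntArray>,
    depth: Array<FloatArray>,
    focusDepth: Float,
    focusTolerance: Float = 0.3f,
    radius: Int = 4
): Array<IntArray> {
    val h = image.size
    val w = image[0].size
    val out = Array(h) { IntArray(w) }
    for (y in 0 until h) {
        for (x in 0 until w) {
            if (kotlin.math.abs(depth[y][x] - focusDepth) <= focusTolerance) {
                out[y][x] = image[y][x]             // subject stays sharp
            } else {
                var sum = 0; var count = 0          // simple box blur for the background
                for (dy in -radius..radius) for (dx in -radius..radius) {
                    val ny = y + dy; val nx = x + dx
                    if (ny in 0 until h && nx in 0 until w) { sum += image[ny][nx]; count++ }
                }
                out[y][x] = sum / count
            }
        }
    }
    return out
}

fun main() {
    // Tiny 1 x 5 example: the middle pixel (depth 1.0 m) is in focus, the rest is blurred.
    val image = arrayOf(intArrayOf(10, 200, 120, 200, 10))
    val depth = arrayOf(floatArrayOf(3f, 3f, 1f, 3f, 3f))
    println(monoWideAperture(image, depth, focusDepth = 1f, radius = 1)[0].joinToString())
}
```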
If you look closely at the images below, you will observe that the photo on the left is overexposed and has a cluttered background. The photo on the right, on the other hand, was taken with the HUAWEI P10/P10 Plus and effectively combines the black and white and wide aperture shooting modes to reduce background interference and create a more dramatic contrast. This is particularly noticeable in the "Cloud Park" lettering, which has a much clearer outline.
After learning and applying this technique, and with a bit of practice, my black and white photos now look infinitely better, to the point that I can proudly post them on social media for my friends to see. To achieve similar results yourself, simply follow the four steps that are set out in the animated graphic below.
By combining these two shooting modes on the HUAWEI P10/P10 Plus, you can produce photos with that timeless black and white look, while enjoying all of the speed and convenience that modern technology can offer.

Making photos way better on the Galaxy A10

Hey there geeks! Today I wanna share with you the results of my long, so-called research into camera applications for the Galaxy. I have tested SO MANY options besides the built-in app, and after several months of comparisons, I can say with confidence that I've found the best of them. Of course, it's not a GCam port, since those aren't supported on our device. However, it is so far the only application with correct HDR, and I use it on an everyday basis. So, let's start the comparison.
The standard camera is always on the left and SnapCamera on the right. In both cases, HDR is turned on, and there is no post-processing.
Photo 1. Backlighting from the sun.
It seems that the standard camera wins here; however, pay attention to the leaves of the tree. The standard camera tries to recover detail from the underexposed areas, which gives the picture an artificial, flat look. SnapCamera maintains the balance and leaves room for further processing.
Photo 2. Crops, focus point on the carpet.
Pay attention to the light behind the tulle. Comments aren't needed, I guess.
Photo 3. Exposure of reflections, focus point on the phone keys.
A good example of the standard application failing to preserve the dynamic range and overexposing the sources of reflected light. SnapCamera, however, handled this just fine.
Photo 4. Random object.
In general, there are almost no differences, but SnapCamera produces more balanced colors with a larger dynamic range (look at the illuminated area of the carpet).
Photo 5. Shot on the front camera.
Here it is totally up to your taste. The standard application uses built-in algorithms for selfies: it gives photos warm tones, smooths the skin, and tries to compensate for the lack of detail by highlighting some areas. If you need a quick photo for Instagram, use the standard application. But personally, I like the result from SnapCamera better: the photo is clearer, contains much more information, and is closer to life.
My HDR settings:
And guys (!), I wasn't trying to take these pictures as works of art; they're just for demonstration.
Verdict
Until Samsung improves its algorithms for budget cameras, SnapCamera is the best alternative for daily use.
(But we haven't given up hope that someone will still port GCam.)
Sorry, I'm new here, so I can't insert download links.
Unfortunately, in my experience SnapCamera is not so good in low-light conditions; the shots are grainy. It would be nice to get 64-bit ROMs so we can use GCam, which won hands down on my past devices.

Explore the World in a Whole New Way with Visual Searches

It's probably happened to you before: you're scrolling through Instagram and see a friend's post about their recent vacation. You notice they're wearing a beanie but can't figure out the brand. This is where visual search can step in.
What is a visual search?
Visual searches are backed by a power-efficient hardware architecture, a suite of advanced AI technologies, and some very sophisticated algorithms. For instance, convolutional neural networks (CNNs), a type of deep learning model, are commonly used for image recognition and processing. These technologies help visual search engines understand and identify objects in photos and even videos, in order to generate related search results.
A visual search engine mimics the human brain's ability to recognize objects in images, using data collected from cameras and sensors. Whether you're trying to identify a plant in your yard or a cute little puppy on the street, you can easily do so just by taking a picture, without having to type a description of what you're looking for into a search box.
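A common way such engines match a query photo against a catalog is to compare feature vectors (embeddings) produced by a CNN. The sketch below shows only the matching step, with made-up three-dimensional embeddings; the CNN that produces real embeddings is assumed to exist elsewhere and is not part of any particular search engine's API.

```kotlin
import kotlin.math.sqrt

// Conceptual sketch of the matching step in a visual search engine:
// a CNN (not shown) turns each image into a feature vector, and the query
// is matched against the catalog by cosine similarity.
fun cosineSimilarity(a: FloatArray, b: FloatArray): Float {
    var dot = 0f; var normA = 0f; var normB = 0f
    for (i in a.indices) { dot += a[i] * b[i]; normA += a[i] * a[i]; normB += b[i] * b[i] }
    return dot / (sqrt(normA) * sqrt(normB))
}

fun bestMatch(query: FloatArray, catalog: Map<String, FloatArray>): String? =
    catalog.maxByOrNull { (_, embedding) -> cosineSimilarity(query, embedding) }?.key

fun main() {
    // Toy catalog with 3-dimensional "embeddings" (real ones have hundreds of dimensions).
    val catalog = mapOf(
        "monstera plant" to floatArrayOf(0.9f, 0.1f, 0.0f),
        "wool beanie" to floatArrayOf(0.1f, 0.8f, 0.2f),
        "leather sofa" to floatArrayOf(0.0f, 0.2f, 0.9f)
    )
    val query = floatArrayOf(0.85f, 0.15f, 0.05f)  // embedding of the user's photo
    println(bestMatch(query, catalog))             // -> monstera plant
}
```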
A few things a visual search can do:
- Perform a search faster than a typed search
- Allow you to easily compare prices between different stores
- Support searching for multiple objects in a single image simultaneously
Performing visual searches in Petal Search
Huawei has introduced visual search technology into Petal Search, to provide a better searching and shopping experience. All you have to do is snap a photo, and the app will identify the object and find information on it. You can search for a wide range of content across different categories, such as animals, food, clothes, and furniture.
To search by image, simply open Petal Search and tap the camera icon in the main search bar. You can take a photo with your camera or choose one from your gallery. When you search for a product, let's say a houseplant, Petal Search picks out details of the plant from the photo you upload and then processes the image data. It can then tell you how much the plant costs and where to buy it, and can even return images of other plants that are visually similar.
Better yet, Petal Search is capable of searching for and detecting multiple targets in an image at a time. For instance, when you take a photo of a sofa with a blanket and cushion on it in a furniture showroom, Petal Search can identify all three items at the same time and deliver comprehensive search results in a single list. This saves you the hassle of having to take tons of photos and search them one by one. Visual search technology has made shopping so much easier.
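As a rough illustration of the idea (hypothetical names only, not the Petal Search API), searching several detected objects and merging the results into one list could look like this:

```kotlin
// Conceptual sketch: run a search for every detected object in the photo
// and merge the results into a single list. Names are hypothetical.
data class Detection(val label: String, val embedding: FloatArray)  // embedding of the cropped region

fun searchAll(
    detections: List<Detection>,
    searchOne: (FloatArray) -> List<String>   // e.g. top shopping results for one object
): List<String> =
    detections.flatMap { detection ->
        searchOne(detection.embedding).map { result -> "${detection.label}: $result" }
    }

fun main() {
    // Toy example: the "search" just echoes a store name per detected object.
    val detections = listOf(
        Detection("sofa", floatArrayOf(0.1f)),
        Detection("blanket", floatArrayOf(0.5f)),
        Detection("cushion", floatArrayOf(0.9f))
    )
    searchAll(detections) { listOf("Store A", "Store B") }.forEach(::println)
}
```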
So, the next time you're trying to identify a plant in your yard or a bag you've seen your coworker carrying, or even trying to solve a math problem, simply take a photo, and Petal Search will find the answer in a matter of seconds!

General Product Review | OPPO Reno8 5G | OPPO Ambassador (Part 4)

Please find quick links to:
Part 1
Part 2
Part 3
[Shooting for night photography - indoor]
I am impressed by the result, considering the lighting situation at night at the Rain Vortex. Details and colors are retained in most areas. Noise is present in darker areas but not clearly visible.
In wide-angle shots, the quality around the edges is reduced (chromatic aberration), but overall it's still a great result.
Camera mode: Night
Location: Jewel, Changi Airport
normal zoom | wide-angle
[Shooting for night photography - outdoor]
I think these photos speak for themselves. I am impressed: detailed, with good exposure and color reproduction.
Camera mode: Night
Location: Esplanade & Merlion Park, Singapore
normal zoom | 2x zoom
normal zoom
[Shooting in Pro mode]
I use Pro mode when I take photos in very low light or for a specific type of photography, e.g., light painting. For info, the Reno8 does not have a RAW format option, but it does provide a histogram.
Pro mode allows me to adjust several parameters to achieve a particular result.
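For readers new to Pro mode, the trade-off between these settings can be summarized as an exposure value (EV). The sketch below computes EV referenced to ISO 100 for the settings used in the shots that follow; the f/1.8 aperture is assumed purely for illustration and may not match the Reno8's actual lens.

```kotlin
import kotlin.math.ln

// Minimal sketch: exposure value (referenced to ISO 100) for a manual setting.
// EV100 = log2(N^2 / t) - log2(ISO / 100), where N = f-number and t = shutter time (s).
// A lower EV100 means more light is gathered, i.e. suited to darker scenes.
fun log2(x: Double) = ln(x) / ln(2.0)

fun ev100(fNumber: Double, shutterSeconds: Double, iso: Double) =
    log2(fNumber * fNumber / shutterSeconds) - log2(iso / 100.0)

fun main() {
    // The light-painting settings from this post (ISO 100, 3 s); f/1.8 is assumed.
    println("Light painting: EV100 = %.1f".format(ev100(1.8, 3.0, 100.0)))   // ~0.1
    println("0.5 s shot:     EV100 = %.1f".format(ev100(1.8, 0.5, 100.0)))   // ~2.7
}
```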
Histogram at top left | Light painting: ISO 100, exposure 3 s
ISO 100, exposure 0.5 s
[Shooting in Portrait mode]
What I love most about this mode is the camera's ability to blur the background nicely, which brings more focus to the subject. The bokeh effect from the lights in the background is beautiful. The Reno8 is truly 'The Portrait Expert'.
[Video mode in 4K at 30fps - night]
I haven't had a chance to take video during the daytime, as it was raining almost every day in Singapore. But I took this video of a light show at Jewel Changi Airport to show the quality of night video taken by the phone in 4K.
https://youtu.be/kuBV7NUim3A
CONCLUSION
OPPO clearly cares about building in the features users might need, and it shows in how detailed and personal the customization options in the Reno8 5G are. It depends on what's important to you, but for me, OPPO has gone above and beyond to meet users' security and privacy needs, and that's something to think about. In terms of productivity, the phone includes features that help users work faster and more efficiently.
For the price, you get a camera that allows you to create professional-looking photographs. It's great for beginners and hobbyists, not to mention the stylish look and smooth, lag-free performance.
However, there is still room for improvement, such as the Air Gesture feature and, most importantly, an SD card slot. As a phone photographer who works with photos and videos and does most of that work on the phone itself, I definitely need extra storage. I hope OPPO will include an SD card slot in future models.
All in all, with the Reno8 5G, you get more than you pay for. Thank you, OPPO!!
[end]

Lighting Estimate: Lifelike Virtual Objects in Real Environments

Augmented reality (AR) is a technology that facilitates immersive interactions by blending virtual objects with the real world in a visually intuitive way. To ensure that virtual objects are naturally incorporated into the real environment, AR needs to estimate the environmental lighting conditions and apply them to the virtual world as well.
What we see around us is the result of interactions between light and objects. When light shines on an object, it is absorbed, reflected, or transmitted before reaching our eyes. The light then tells us the object's color, brightness, and shadows, giving us a sense of how the object looks. Therefore, to integrate 3D virtual objects into the real world in a natural manner, AR apps need to provide lighting conditions that mirror those in the real world.
Feature Overview
HMS Core AR Engine includes a lighting estimate capability that supplies real-world lighting conditions for virtual objects. With this capability, AR apps are able to track light in the device's vicinity and calculate the average light intensity of images captured by the camera. This information is fed back in real time to facilitate the rendering of virtual objects, ensuring that the colors of virtual objects change as the environmental light changes, just as the colors of real objects do.
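As a rough illustration of how an app might consume this information, the sketch below scales a virtual object's base color by the estimated ambient intensity each frame. The types here are stand-ins so the snippet compiles on its own; the actual class and method names are documented in the AR Engine Development Guide referenced at the end of this article.

```kotlin
// Conceptual sketch only. The real AR Engine types are stubbed out below so the
// snippet compiles on its own; consult the AR Engine Development Guide for the
// actual class and method names.
class LightEstimate(val pixelIntensity: Float)   // stand-in for the SDK's light estimate
class Frame(val lightEstimate: LightEstimate)    // stand-in for the per-frame camera data

// Scale a virtual object's base color by the estimated ambient intensity,
// so it darkens and brightens with the real environment.
fun shade(baseColor: FloatArray, frame: Frame): FloatArray {
    val k = frame.lightEstimate.pixelIntensity   // average intensity, roughly in [0, 1]
    return FloatArray(3) { i -> baseColor[i] * k }
}

fun main() {
    val warmWhite = floatArrayOf(1.0f, 0.95f, 0.85f)
    println(shade(warmWhite, Frame(LightEstimate(0.3f))).joinToString())  // dim room
    println(shade(warmWhite, Frame(LightEstimate(0.9f))).joinToString())  // bright room
}
```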
How It Works
In real environments, the same material looks different depending on the lighting conditions. To ensure rendering that is as close to reality as possible, the lighting estimate capability needs to implement the following:
Tracking where the main light comes from
When the position of the virtual object and the viewpoint of the camera are fixed, the brightness, shadow, and highlights of objects will change dramatically when the main light comes from different directions.
Ambient light coloring and rendering
When the color and material of a virtual object remain the same, the object can be brighter or less bright depending on the ambient lighting conditions.
Less bright lighting vs. brighter lighting
The same is true for color. The lighting estimate capability allows virtual objects to reflect different colors in real time.
Color
Environment mapping
If the surface of a virtual object is specular, the lighting estimate capability will simulate the mirroring effect, applying the texture of different environments to the specular surface.
Texture
Making virtual objects look vivid in real environments requires a 3D model and a high-level rendering process. The lighting estimate capability in AR Engine builds true-to-life AR interactions, with precise light tracking, real-time information feedback, and realistic rendering.
References
HUAWEI Developers
AR Engine Development Guide
