LightBlog

Wednesday, February 28, 2018

LG G7 Spotted with a 6″ 3120×1440 Display with a Notch and a Snapdragon 845

LG’s announcements at MWC 2018 have been, to put it bluntly, underwhelming. While a refresh of their main flagship line, the LG G series, is usually a fixture of LG’s press conferences, the company opted to skip the LG G7 announcement and instead went with a refresh of the LG V30, dubbed the LG V30S ThinQ. This is likely related to LG reportedly shelving development of the LG G7 and starting over from scratch. But a leaked video showcasing the unreleased phone has now surfaced, so it’s possible they’re pretty far along in the process.

Israeli technology site ynet apparently got into a private LG event for some hands-on time with a demo unit of the unreleased device, which seems to be heavily influenced this time around by Apple’s latest smartphone, the iPhone X. If the video is accurate, the LG G7 will feature a notch at the top of the display, housing the earpiece and the front-facing camera. These “notched” displays seem to be the trend going forward among most major players, as screen bezels keep getting smaller and smaller.

Going into the specifications, the LG G7 would feature a 6-inch POLED display with a 3120×1440 resolution, translating into a 19.5:9 aspect ratio—the same aspect ratio found in the latest Apple flagship. Internally, we’re dealing with a typical 2018 flagship: a Snapdragon 845 processor paired with either 4 GB of RAM and 64 GB of storage or 6 GB of RAM and 128 GB of storage. Other specifications include a 3,000 mAh battery, a dual rear camera setup with a regular and a wide-angle lens, and quite possibly the same AI features found in the recently announced LG V30S ThinQ.

It’s not clear yet whether this is the previously scrapped model of the LG G7 or the new, rebuilt one that’s possibly going to be announced, but given that it’s being shown in a private event we’re leaning towards the latter. Of course, this could be an early prototype and doesn’t have to necessarily resemble the final LG G7, but only time will tell whether this particular phone will make it to store shelves.


Source: ynet (Israeli) Via: @evleaks



from xda-developers http://ift.tt/2F0ySOF
via IFTTT

Native Support for Iris Scanners is finally coming to Android

Biometric authentication may not be as secure as PINs or passwords, but its convenience is a big selling point for many consumers. The extremely quick fingerprint scanner on the OnePlus flagships has been praised almost universally, but lately companies have been gravitating toward facial recognition technology as an alternative. For instance, there are the OnePlus 5T and the Honor 7X with their respective takes on a Face Unlock feature. Samsung phones also offer facial recognition for unlocking, but the biometric authentication technology the company is most proud of is its iris scanner. Now, it appears that iris scanners may be coming to more Android phones in the future, as official support for them is being added to Android.


Iris Scanners on Existing Android Hardware

The first mainstream Android smartphone with an iris scanner was the ill-fated Samsung Galaxy Note 7. That technology later made its way over to the Samsung Galaxy S8/S8+ and the Galaxy Note 8. We also know it will be present on the Samsung Galaxy S9/S9+, where it offers incremental hardware improvements; combined with facial recognition, the overall experience should improve. (There’s also a possibility an iris scanner may make its way to an unannounced Samsung Galaxy phone, but that’s up in the air as of now.)

For those of us without a Samsung Galaxy flagship, there aren’t very many options when it comes to a smartphone with an iris scanner. In fact, there’s only a single option, and the phone isn’t even available for sale yet: an obscure smartphone called the BitVault, aimed at cryptocurrency enthusiasts.

BitVault: the self-proclaimed “World’s First Blockchain Phone”. Source: Swiss Bank In Your Pocket.

This smartphone and an unannounced device from a Japanese OEM are the only non-Samsung Galaxy devices I’m aware of that offer iris scanning. The chip that powers these phones’ iris scanners is the FPC ActiveIRIS by Fingerprints.

FPC ActiveIRIS

FPC ActiveIRIS. Iris Recognition for Smartphones. Source: FPC.

You may have never heard of this company, but you have most likely used a smartphone that incorporates their technology. Some of the smartphones that use fingerprint scanners from FPC include the Google Pixel, the Honor 8, and the Huawei Mate 9 Pro. Their fingerprint sensors are found on many other devices, including several from Xiaomi, so it’s safe to say that FPC is one of the leading vendors of the biometric authentication technology found in smartphones.

FPC Fingerprint Scanners on the Home Button, Rear, and Side of the Device. Source: FPC.

So why is this company important? It’s because several of their engineers have been working on incorporating native support for biometric iris scanners in Android. There are several commits here, all of which should be looked at together to get a good picture of what’s going on.

Iris Scanners in a Future Version of Android

Let’s start with the most important commit: the Biometrics Iris HAL interface.

Android P Iris Scanner

The inclusion of a HAL interface standardizes how the Android framework communicates with iris scanners. This means that products from multiple vendors, not just from FPC, will be able to work with Android. Most importantly, this also opens up the ability for AOSP-based ROMs to work generically with iris scanning hardware. For instance, Project Treble GSIs rely on the equivalent fingerprint HAL for basic fingerprint scanner functionality to work out of the box; without an iris HAL, the new Exynos Samsung Galaxy S9 and Galaxy S9+ would be unable to use their iris scanners on an AOSP ROM.

The SELinux policies for the iris scanners are wholly uninteresting for end users, but they’re there if you want to take a look at them. The inclusion of the base iris feature in Android will allow apps to detect whether the device has an iris scanner in place. Finally, the inclusion of the iris framework is what will actually allow third-party apps to use the iris scanner for authentication in the future. Here are the relevant strings:

Iris Scanner in Framework


<string name="permlab_manageIris">manage iris hardware</string>
<!-- Description of an application permission, listed so the user can choose whether they want to allow the application to do this. -->
<string name="permdesc_manageIris">Allows the app to invoke methods to add and delete iris templates for use.</string>
<!-- Title of an application permission, listed so the user can choose whether they want to allow the application to do this. -->
<string name="permlab_useIris">use iris hardware</string>
<!-- Description of an application permission, listed so the user can choose whether they want to allow the application to do this. -->
<string name="permdesc_useIris">Allows the app to use iris hardware for authentication</string>

<!-- Message shown during iris acquisision when the iris cannot be recognized -->
<string name="iris_acquired_insufficient">Couldn\'t process iris. Please try again.</string>
<!-- Message shown during iris acquisision when the iris image is too bright -->
<string name="iris_acquired_too_bright">Iris is too bright. Please try in low light.</string>
<!-- Message shown during iris acquisision when the iris image is too dark -->
<string name="iris_acquired_too_dark">Iris is too dark. Please uncover light source.</string>
<!-- Message shown during iris acquisision when the user is too close -->
<string name="iris_acquired_too_close">Move further.</string>
<!-- Message shown during iris acquisision when the user is too far -->
<string name="iris_acquired_too_far">Move closer.</string>
<!-- Message shown during iris acquisision when the user eyes closed-->
<string name="iris_acquired_eyes_closed">Open eyes.</string>
<!-- Message shown during iris acquisision when the user eyes partially obscured-->
<string name="iris_acquired_eyes_partially_obscured">Open eyes wider.</string>
<!-- Array containing custom messages shown during iris acquisision from vendor. Vendor is expected to add and translate these strings -->
<string-array name="iris_acquired_vendor">
</string-array>

<!-- Error message shown when the iris hardware can't be accessed -->
<string name="iris_error_hw_not_available">Iris hardware not available.</string>
<!-- Error message shown when the iris hardware has run out of room for storing iriss -->
<string name="iris_error_no_space">Iris can\'t be stored. Please remove an existing iris.</string>
<!-- Error message shown when the iris hardware timer has expired and the user needs to restart the operation. -->
<string name="iris_error_timeout">Iris time out reached. Try again.</string>
<!-- Generic error message shown when the iris operation (e.g. enrollment or authentication) is canceled. Generally not shown to the user-->
<string name="iris_error_canceled">Iris operation canceled.</string>
<!-- Generic error message shown when the iris operation fails because too many attempts have been made. -->
<string name="iris_error_lockout">Too many attempts. Try again later.</string>
<!-- Generic error message shown when the iris operation fails because strong authentication is required -->
<string name="iris_error_lockout_permanent">Too many attempts. Iris sensor disabled.</string>
<!-- Generic error message shown when the iris hardware can't recognize the iris -->
<string name="iris_error_unable_to_process">Try again.</string>

<!-- Template to be used to name enrolled irises by default. -->
<string name="iris_name_template">Iris <xliff:g id="irisId" example="1">%d</xliff:g></string>

<!-- Array containing custom error messages from vendor. Vendor is expected to add and translate these strings -->
<string-array name="iris_error_vendor">
</string-array>

<!-- Content description which should be used for the iris icon. -->
<string name="iris_icon_content_description">Iris icon</string>

<!-- Title of an application permission, listed so the user can choose whether they want to allow the application to do this. -->

In the framework’s manifest, the suggested permission, “android.permission.USE_IRIS”, has a protection level of “normal,” so third-party apps would indeed be able to request it; as a normal permission, it would be granted automatically at install time rather than through a runtime prompt.
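For context, here is a minimal sketch of how third-party biometric authentication already works with FingerprintManager (available since API level 23). If the iris framework lands as these commits suggest, an iris-backed flow would presumably mirror this pattern, gated by android.permission.USE_IRIS rather than USE_FINGERPRINT. Note that no public iris manager class exists yet, so the parallel is my assumption, not something the commits spell out.

import android.Manifest;
import android.content.Context;
import android.content.pm.PackageManager;
import android.hardware.fingerprint.FingerprintManager;
import android.os.CancellationSignal;

public final class BiometricAuthSketch {
    // Today's fingerprint flow; a future iris flow would likely look the same,
    // swapping in the iris hardware feature and the USE_IRIS permission.
    public static void authenticate(Context context) {
        PackageManager pm = context.getPackageManager();
        if (!pm.hasSystemFeature(PackageManager.FEATURE_FINGERPRINT)) {
            return; // No supported biometric hardware on this device.
        }
        if (context.checkSelfPermission(Manifest.permission.USE_FINGERPRINT)
                != PackageManager.PERMISSION_GRANTED) {
            return; // USE_FINGERPRINT is a normal permission, granted at install.
        }
        FingerprintManager fm =
                (FingerprintManager) context.getSystemService(Context.FINGERPRINT_SERVICE);
        if (fm == null || !fm.hasEnrolledFingerprints()) {
            return; // Nothing enrolled to authenticate against.
        }
        fm.authenticate(null /* crypto */, new CancellationSignal(), 0 /* flags */,
                new FingerprintManager.AuthenticationCallback() {
                    @Override
                    public void onAuthenticationSucceeded(
                            FingerprintManager.AuthenticationResult result) {
                        // Unlock the app's protected content here.
                    }

                    @Override
                    public void onAuthenticationError(int errorCode, CharSequence errString) {
                        // Analogous to the iris_error_* strings listed above.
                    }
                }, null /* handler */);
    }
}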

Iris Scanner Android

Lastly, another commit adds support for iris identification in the keyguard. This is what will actually allow the user to scan their iris to dismiss the lock screen. According to the commit, iris authentication only occurs as soon as the screen turns on in order to reduce power consumption. Further, the Iris scanner can be disabled according to the Device Policy Manager if that authority (such as a workplace) deems the iris scanner an insecure method of authentication.

Iris Authentication DPM

Something interesting going on in all of these commits is how, in many places, references to fingerprints in the Android framework are being genericized to refer to biometrics. This prepares Android for additional methods of biometric authentication in the future, though it’s unclear what those may be.

Android P Iris Scanner

I won’t bore you with the rest of the implementation details, so I’ll move on to discuss the significance of these commits. What this means for Android is that a future version of Android, likely Android P, will include native support for Iris scanning hardware. I say “likely” because the commits haven’t been merged yet—the changes are very lengthy, and could take a few weeks or even months to pass code review.

It’s very likely that it’ll make it in for Android P, however, and there are even hints of the iris scanner framework code having P-specific changes in place (such as doing away with storing user information in /data/system/users and relocating it to a new /data/vendor directory, likely due to undisclosed Project Treble requirements).

Further, this does appear to be full support for Iris scanners, though this doesn’t mean that additional features won’t be added by other vendors (in fact, the comments explicitly mention that). The basic implementation is there, though, so we should expect to see future smartphones shipping with biometric Iris scanners. There is no evidence in these commits that the Google Pixel 3 will have such a feature, though, so don’t assume that any particular device will have an Iris scanner because of these changes.

Note: I did reach out to FPC for comment on these changes, but did not receive a response from them by the time of this article’s publication.



from xda-developers http://ift.tt/2F5Dkri
via IFTTT

Huawei P20 Lite with 5.6″ FHD+ Screen and Dual Rear Cameras Makes an Appearance

Huawei has decided to skip over Mobile World Congress this year to hold an event of their own, and as previously announced, the Chinese giant is preparing to launch their P20 line of devices on March 27, in Paris, France. Leaks for these devices have been making appearances despite Huawei’s absence at MWC, to the point where we already know almost everything about the 3-device lineup—the P20 Pro, with a triple rear camera mount, will be joined by the P20 and the smaller P20 Lite. We’ve seen the regular P20 already, and we’re now getting to know more about its smaller sibling, the Huawei P20 Lite.

Source: Evan Blass // VentureBeat

The P20 Lite is decidedly a mid-range device, at least specifications-wise: It will allegedly be powered by the octa-core HiSilicon Kirin 659 SoC with 4 GB of RAM and 64 GB of storage, the same formula that has already been tried and tested in the Honor 7X. On the outside, though, it looks much more premium than your average mid-range device. Huawei is going all in on the iPhone X/Essential Phone notch trend, for better or worse. The display is also being upgraded to accommodate the aforementioned notch. The 2250 x 1080 FHD+ panel is not quite 18:9, as the resolution actually translates to 18.75:9—way closer to 19:9, and slightly taller than the Galaxy S9/S9+ panel.

Turning the phone over, we find a glass back with a rear-mounted fingerprint scanner and a vertically oriented dual-lens setup resembling the trend started by the iPhone X. The Leica co-developed camera setup with 16 MP sensors should provide more than decent camera performance, especially given the mid-range specifications. Other hardware specifications include a 3,520 mAh battery that, given the switch from a metal build to a glass back, could be aided by wireless charging.

All in all, the P20 Lite is shaping up to be pretty similar to its flagship siblings while cutting the right corners. No information regarding pricing has been provided yet, but we should know more during the announcement event on March 27.


Source: VentureBeat



from xda-developers http://ift.tt/2F0UL0p
via IFTTT

[Update: Confirmed] Google May Remove Access To Undocumented/Hidden APIs In Android P

Update 2/28/18: Google has published a blog post today confirming the changes. More details at the end of the article.

While some Android enthusiasts are speculating about which dessert the next version of Android will be named after, there are some interesting developments going on behind the scenes. We’ve spotted a few noteworthy upcoming features in Android P, but a more recent discovery in the Android Open Source Project (AOSP) has proven far more interesting. According to these recent commits, applications may be restricted from accessing APIs that are undocumented in the Android SDK (such as those marked with the @hide javadoc annotation).


Why this matters

The Android Software Development Kit (SDK) provides developers with the API libraries and tools they need to build and test new Android applications. With each new release of Android comes a whole host of new APIs that are available to developers through the Android SDK. Which APIs are available to an app depends on the compileSdkVersion the developer sets. That’s why Google’s new Play Store requirements are so significant—they will force applications to update and migrate to newer APIs.

Google hosts documentation pages for each class and all of its methods that are available in each API level. These are the set of documented APIs that are available in the official Android SDK. You can browse the list of classes easily using an Android app such as the recently released Android SDK Search app by Android Engineer Jake Wharton.

Android SDK Search (Free, Google Play) →

However, not all APIs that are available in each Android release are documented by Google or available in the official Android SDK. There are often APIs that are undocumented but nonetheless very useful. It isn’t recommended that developers build their apps using undocumented, or hidden, APIs, but many do so because there’s simply no alternative if they want to offer a certain feature. Developers that use hidden or undocumented APIs can also gain a competitive advantage, since they can offer features that their competitors—who stick to the APIs offered by the Android SDK—cannot.
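To make this concrete, here is one widespread technique (a generic illustration, not a claim about any particular app): reaching the @hide-annotated android.os.SystemProperties class via reflection, since it isn’t part of the public SDK.

import java.lang.reflect.Method;

public final class HiddenApiExample {
    // Reads a system property through the @hide android.os.SystemProperties class.
    // Under the restrictions discussed in this article, reflective access to
    // blacklisted members like this could start failing instead of succeeding.
    public static String getSystemProperty(String key, String fallback) {
        try {
            Class<?> systemProperties = Class.forName("android.os.SystemProperties");
            Method get = systemProperties.getMethod("get", String.class, String.class);
            return (String) get.invoke(null /* static method */, key, fallback);
        } catch (ReflectiveOperationException e) {
            return fallback;
        }
    }
}

A call like getSystemProperty("ro.build.version.security_patch", "unknown") works on most devices today, but nothing in the SDK guarantees it will keep working.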

While I cannot provide a list of apps that utilize undocumented APIs (developers probably don’t share which ones they use because it would give their competitors a leg up), the list is probably rather large. Thus, I would conclude that banning access to hidden APIs would be significant. Mark Murphy, founder of Commonsware, agrees:

I agree with the assessment that bulk-banning access to @hide-annotated items will be a big deal, if that comes to pass. Hopefully, few apps access these items as part of key functionality. However, I suspect that lots of name-brand apps use them on occasion, directly or through a library.


What is happening in Android P?

These upcoming changes were first noted by XDA Senior Recognized Developer rovo89, the developer of the Xposed Framework. He pointed out two commits to me, one of which has been merged, introducing a new build tool called ‘hiddenapi.’ This tool modifies the access flags of class members within a DEX file whose signatures appear on an input greylist or blacklist; the marked methods are then treated as internal APIs with restricted access. The other commit describes how the API blacklist works: it prevents access to boot class methods and fields marked by the aforementioned ‘hiddenapi’ tool, which developers might otherwise reach through static linking, reflection, or JNI.

According to rovo89, the end result of these two changes in Android P is the following:

If these commits get merged, it would mean that apps can no longer use/access hidden APIs, that is classes, methods and fields which are annotated with @hide in AOSP and therefore not part of the official SDK. This wouldn’t be a problem for Xposed modules as I could easily revert those commits or allow modules to also access these APIs. But there are many apps which take advantage of hidden APIs, and those would fail in the future.

Indeed, further commits show that this may be what Google is planning. This commit states the following:

Android P

While this particular commit wasn’t merged, as it was abandoned in favor of 3 smaller commits, the commit message describes the purpose of these changes. Another set of commits shows that Google will suggest alternatives to developers who seek to use non-public APIs:

Android P

However, there are often no alternatives to certain hidden APIs. We at XDA can speak from experience here: unfortunately, this change may spell the end of some innovative apps, or force some big-name apps to reduce their functionality. This upcoming change seems similar in spirit to the recent crackdown on Accessibility Services (which was thankfully paused while Google evaluated innovative uses). While most apps that utilize undocumented APIs do so for benign reasons, some may have misused them for nefarious purposes.

Because of this, Google may be locking down access to all hidden APIs in Android P in order to safeguard users from the few that abuse them. It’s hard to say just how much of an impact this may have on users, but if you are a developer considering looking through AOSP to find an innovative use of a hidden API, then you may want to reconsider.


Update: Google Confirms

In a blog post published today, February 28th, Google confirmed these changes. Citing crash risks for users and the emergency fixes developers are subsequently forced to roll out, Google states that the company has been gradually shifting toward discouraging developers from accessing non-SDK interfaces. Starting with Android P, the restrictions will expand to cover non-SDK Java language interfaces.

The company states that “some non-SDK methods and fields will be restricted,” though they did not elaborate on which ones would be restricted. Initially the restriction will focus on interfaces that are rarely used, and for a while the company will allow developers to continue to use non-SDK methods and fields where transitioning to an SDK method is technically challenging. However, eventually the restrictions will broaden, so developers of apps using non-SDK methods should transition as soon as possible in preparation for Android P. As for methods without an SDK alternative, Google is requesting developers to post on their bug tracker with more information.

The next developer preview, ostensibly arriving soon, will allow developers to test existing apps against the blacklist or greylist before the final release.



from xda-developers http://ift.tt/2DGETw6
via IFTTT


Samsung Galaxy S9 and S9+ Hands On: More of the same, but with a bit more polish

If there were any doubts that Samsung had lost its touch, the South Korean company quelled them at Mobile World Congress 2018… mostly.

On Sunday, the smartphone maker formally announced the Galaxy S9 and S9+, the newest phones in its storied Galaxy series. Both have lightning-fast processors in Samsung’s Exynos 9810 or Qualcomm’s Snapdragon 845 (depending on the model), industry-first variable f/1.5 + f/2.4 aperture rear cameras, professionally tuned stereo speakers, and new software features such as AR Emoji, Samsung’s take on Apple’s Animoji.

But while the Galaxy S9 and S9+ check every box imaginable, they lack the element of surprise.

There’s no beating around the bush: Samsung’s new standard-bearers are less revolutionary than evolutionary. They’re nearly identical to their predecessors in terms of design, right down to the curved edge-to-edge 18:9 displays, glossy back plates, and Gorilla Glass-shielded exteriors. And with the exception of new processors and speakers, not much has changed on the inside.

Passing judgment from afar isn’t exactly fair, though, so when Samsung extended XDA an invitation to try out the Galaxy S9 and S9+ for ourselves at its New York City venue, we eagerly accepted. Our impressions after an hour with both phones? Positive. Still, we can’t help but feel that while the Galaxy S9 and S9+ are faster, brighter, and louder than every other Galaxy smartphone that’s come before them, Samsung played it safe.


Design

The Galaxy S9 is distinguished from the Galaxy S9+ by its screen size (it has a 5.8-inch display compared to the Galaxy S9+’s 6.2-inch display), dimensions (it measures 147.7 millimeters in length and 68.7 millimeters in width; the Galaxy S9+ is 158.0 millimeters long and 73.8 millimeters wide), and weight (it’s 26 grams lighter than the Galaxy S9+). It also lacks the Galaxy S9+’s secondary camera, and settles for 4GB of RAM instead of the S9+’s 6GB. Otherwise, the two phones are pretty much identical.

That’s especially true when you’re squinting at the two from a distance. It’s only when you hold them side-by-side that the differences become more apparent, albeit only slightly.

What was more striking to us, though, was just how similar the Galaxy S9 and S9+ look and feel to the Galaxy S8 and S8+. The pair isn’t a perfect analog for its outgoing forerunners, but most folks will have a tough time making out changes such as the ever-so-slightly slimmer top and bottom bezels and the subtler curve to the left and right of the screen.

Galaxy S8/S8+ fingerprint orientation on the left; Galaxy S9/S9+ fingerprint orientation on the right.

One thing they might notice is the fingerprint sensor, which was adjacent to the rear camera on the Galaxy S8 and S8+. On the S9 and S9+ it has been moved beneath the camera module (which is now oriented vertically as opposed to horizontally on the S8 and S8+), and that’s a welcome improvement. Swiping a fingertip across the sensor used to require shimmying your hand up the side of the phone and reaching a finger around the volume rocker (or the power button, if you’re left-handed); now that the sensor sits directly below the camera on the rear panel, it’s a much less arduous task. No finger kinesthetics required.

Fingerprints, while we’re on the subject, are something of a given on the Samsung Galaxy S9 and S9+’s Gorilla Glass 5 front and back. The scanner’s tweaked placement might prevent wayward digits from smudging the phones’ camera lenses, but does little to shield the highly reflective cover from sweaty, oily skin. As with the Galaxy S8 and Galaxy S8+, you’re going to want to throw the S9/S9+ in a protective case or carry around a microfiber cloth to keep it spick and span.


Camera

The Galaxy S9/S9+’s camera interface.

The Galaxy S9 and S9+’s design may not be radically different from the Galaxy S8 and S8+’s design, but the cameras are where the phones really shine. In fact, they’re easily the highlight.

There was initially some confusion about whether the Galaxy S9 and S9+ are capable of 4K HDR video recording. It’s a feature of the Snapdragon 845’s imaging chip, and a Qualcomm press release on Monday, since edited, contained language suggesting Samsung’s flagships would be one of the first on the market to support it. Unfortunately, that’s not the case: A Samsung representative confirmed to XDA that there are no plans to support 4K HDR video recording on either the Galaxy S9 or S9+. That puts the phones at a disadvantage compared to Sony’s newly announced Xperia XZ2, which has the same chipset and does support 4K HDR recording.

The Galaxy S9 has an 8MP f/1.7 autofocusing front-facing camera (1/3.6″ sensor size, 1.22µm pixel size, and 80-degree field of view) and a 12MP rear camera (1/2.55″ sensor size, 1.4µm pixel size, and 77-degree field of view), with the S9+ packing an additional 12MP telephoto lens (1/3.4″ sensor size, 1.0µm pixel size, 45-degree field of view) for “2x zoom”. The sensors have Super Speed Dual Pixel, a faster and more accurate version of Samsung’s Dual Pixel focusing technology, but they otherwise haven’t changed — they retain the Galaxy S8 and S8+’s optical image stabilization, LED flash, and phase detection autofocus.

But the aperture is a smartphone first: it’s mechanical. The Pro mode in the S9 and S9+’s camera app gives you two settings to choose from: f/1.5, a wider aperture (lower f-number) better suited to low-light conditions (think nighttime and dimly lit offices), and f/2.4, the default setting. (Alternatively, the app’s Automatic mode switches to the f/1.5 aperture when ambient lighting dips below 100 lux.) A tiny motor in the Galaxy S9/S9+’s camera module is responsible for the adjustment — it contracts (when set to f/2.4) or expands (when set to f/1.5) a ring around the sensor’s lens.
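Assuming Samsung exposes both apertures through the standard Camera2 API (something only its own camera app demonstrates so far, so treat this as a sketch rather than a confirmed capability), a third-party app could query and select them along these lines:

import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;

public final class VariableApertureSketch {
    // Picks the widest aperture (lowest f-number) for low light, or the
    // narrowest otherwise, assuming the device advertises more than one.
    public static void applyAperture(Context context, String cameraId,
                                     CaptureRequest.Builder request, boolean lowLight)
            throws CameraAccessException {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        float[] apertures = manager.getCameraCharacteristics(cameraId)
                .get(CameraCharacteristics.LENS_INFO_AVAILABLE_APERTURES);
        if (apertures == null || apertures.length < 2) {
            return; // Fixed-aperture camera; nothing to switch.
        }
        float widest = apertures[0];
        float narrowest = apertures[0];
        for (float f : apertures) {        // e.g. {1.5f, 2.4f} on the Galaxy S9/S9+
            widest = Math.min(widest, f);  // smaller f-number = wider aperture
            narrowest = Math.max(narrowest, f);
        }
        request.set(CaptureRequest.LENS_APERTURE, lowLight ? widest : narrowest);
    }
}

Whether the auto-exposure routine honors a manually requested aperture varies by device, so this illustrates the plumbing rather than a drop-in replacement for Samsung’s Pro mode.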

The switch between the two apertures is nearly instantaneous — a major plus. And when we compared results from the two aperture settings at the same ISO and shutter speed, the photos captured in f/1.5 aperture seemed a little bit brighter and crisper than their f/2.4 counterparts.

We tested the Galaxy S9+’s camera in the camera app’s Pro mode with the focus, shutter speed, and white balance set to “auto”, and the exposure set to “0.0”. We took four photos in two different locations around Samsung’s demo venue: one with the aperture set to f/2.4, and a second with the aperture set to f/1.5. Here are the results:

The Galaxy S9 and S9+’s other camera improvements take advantage of the image signal processors (ISPs) in the Exynos 9810 and Snapdragon 845 (the Spectra 280) and dedicated DRAM. Snapping a photo on either phone triggers a burst shot of 12 images, which the ISPs divide into three sets of four, combine on a per-set basis, and merge into a single picture. Samsung calls it multiframe noise reduction; previous-generation Galaxy smartphones combined just three images.

The resulting composite is much more vibrant, crisp, and clear than a one-shot picture. (That won’t come as a surprise to anyone who’s used the Google Camera’s HDR+ mode, which takes a similar approach.) Samsung says the Galaxy S9 and S9+’s improvements translate to 30 percent less noise in low-light conditions — a claim we’ll have to put to the test at a later date. The photos we took with the Galaxy S9 and S9+ seemed sharp and colorful to our untrained eyes.
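Samsung hasn’t published the specifics of its multiframe pipeline, but the underlying principle is simple enough to sketch: averaging N aligned frames suppresses random sensor noise by roughly a factor of the square root of N, which is why stacking twelve frames beats combining three. The toy example below is only an illustration of that principle, not Samsung’s implementation.

public final class FrameStackingSketch {
    // Averages N already-aligned 8-bit grayscale frames (one byte per pixel).
    // Random, zero-mean sensor noise shrinks roughly by sqrt(N) in the result.
    public static byte[] average(byte[][] frames) {
        int pixels = frames[0].length;
        int[] sum = new int[pixels];
        for (byte[] frame : frames) {
            for (int i = 0; i < pixels; i++) {
                sum[i] += frame[i] & 0xFF; // treat bytes as unsigned 0-255 values
            }
        }
        byte[] result = new byte[pixels];
        for (int i = 0; i < pixels; i++) {
            result[i] = (byte) (sum[i] / frames.length);
        }
        return result;
    }
}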

Samsung gave the selfie sensor some love, too. On the Galaxy S9 and S9+, the 8MP front camera can optionally blur the background of images while keeping the foreground in focus in Selfie focus mode, much like the bokeh effect on the Google Pixel 2 and Pixel 2 XL. It’s accomplished entirely in software, and the results aren’t perfect — in several of our test selfies, the outer edges of the subject’s face are a bit smudged where the algorithm blended the image.

On the video side of things, the Galaxy S9 and S9+ have a new trick up their sleeves: 960FPS recording. Taking a page from the Sony Xperia XZ Premium‘s playbook, the handsets can capture clips in what Samsung calls Super Slow Motion. Unlike Sony’s Xperia XZ2 and XZ2 Compact, which can record at 1080p resolution, they’re capped at 720p (the clips are captured in 0.2-second bursts, or about 192 frames at 960FPS, and play back as roughly six-second videos at 30FPS). But we have no complaints about the quality: The few clips we captured were razor sharp and buttery smooth. We especially liked the automatic capture feature, which triggers Super Slow Motion when an object enters an adjustable, predefined boundary in the camera’s viewfinder.

Another nifty tool is a GIF generator that turns Super Slow Motion videos into shareable images, with effects such as an Instagram-esque Loop, Swing, and Reverse. (You can save the resulting image as your wallpaper, if you so choose.) It’s sure to come in handy when your social medium of choice doesn’t support video.


Display

If you’re like most people, you’ll spend a majority of time staring at the Galaxy S9 and S9+’s screen — not their back covers. Both phones have 2960×1440 Quad HD+ Super AMOLED displays with 18.5:9 aspect ratios (570 pixels per inch on the Galaxy S9; 529 ppi on the Galaxy S9+), and Samsung says they’re the “brightest ever” on a Galaxy series smartphone (they both reach 700 nits, or 15% higher than the Galaxy S8 series’ maximum).

That may be so, but the overhead lighting in Samsung’s demo space made it tough to judge the difference with the naked eye. Unfortunately, we didn’t have a brightness tester and were instructed not to take the phones outside, where direct sunlight might have made it easier to judge the improvements (and/or trigger high brightness mode). Suffice it to say that the Galaxy S9 and S9+’s panels are just as colorful and vibrant as they are on the Galaxy S8 and S8+, if not more so.

If the default, slightly oversaturated color palette isn’t to your liking, there are four to choose from:

  • Adaptive Display, the default option
  • AMOLED Cinema, which uses DCI-P3, the standard wide color space common in 4K HDR TVs
  • AMOLED Photo, which uses the Adobe RGB color gamut
  • Basic Screen Mode, which uses the sRGB/Rec. 709 color space.

Each has their advantages and disadvantages, with the AMOLED Cinema and Basic modes producing flatter but ostensibly more accurate colors than the two alternatives. It’s ultimately a matter of personal preference.

It’s worth mentioning that the Galaxy S9 and S9+ are certified by the UHD Alliance for Mobile HDR Premium content (thanks in part to support for DCI-P3). The nuances of HDR are a little complicated, but in essence, HDR videos and video games boast higher contrast and brightness than non-HDR media, contributing to a picture with more accurate colors overall.

It’s not just HDR content that benefits — according to a Samsung representative, the S9 and S9+ have Samsung’s Video Enhancer feature, a carryover from the S7 and S8 that boosts the brightness and color contrast of streaming and local video.

Samsung’s words rang true in our limited time with the Galaxy S9 and S9+. The HDR YouTube videos we watched were richly rendered on the phones’ screens, with the AMOLED screens’ deep blacks highlighting the bright reds, yellows, and greens.


Iris scanner

The Galaxy S8 and S8+ shipped with an iris scanner. It worked, but somewhat inconsistently in certain lighting conditions — especially if you wore color contacts or sunglasses, or held your phone beyond the recommended distance from your eyes. The iris scanner is present and accounted for in the Galaxy S9 and S9+, but with a fallback this time: facial identification.

A new feature called Intelligent Scan uses both the iris scanner and the front-facing camera to secure the phones. In practice, when you tap the power button, both sensors start scanning your face for matches. As soon as there’s a positive ID, it’s open sesame — you’re greeted with the home screen.


Audio

A great screen is nothing without great speakers to match, and the Galaxy S9 and S9+ are Samsung’s strongest showing yet in that regard. The down-firing, AKG Acoustics-tuned stereo speakers easily clear the low bar set by the Galaxy S8 and S8+. They’re noticeably louder (40 percent louder, Samsung says), and they’re capable of delivering a “simulated surround sound experience” thanks to Dolby’s Atmos 3D technology. (Samsung’s venue wasn’t particularly conducive to testing this.)

A dearth of supported content makes Dolby Atmos less of a value-add than it otherwise might be, but a Samsung spokesperson said that Atmos-supported videos and movies will come to Netflix on smartphones later this year. Mum’s the word on the number of titles and the exact date.

Don’t expect Galaxy S9 and S9+’s speakers to blow you away, though. They might sound better than last year’s models, but they’re still too tinny and boomy to stand in for a decent boombox or Bluetooth speaker.


AR Emoji

Apple’s Animoji, which tap the iPhone X’s depth-sensing Face ID camera for goofy animated iMessages, have achieved something of a cult following. It’s enough to have caught Samsung’s (and Asus’s) attention: The Galaxy S9 and S9+ ship with AR Emoji, a face-mapped camera feature that uses the phones’ front-facing sensor to mimic your mouth, eyebrow, and head movements on a humanoid caricature.

They’re easy to get up and running: On-screen instructions have you stare head-on at the camera and select your gender, and the camera app does the rest, analyzing more than 100 points on your face to render a cartoon version of you — replete with hair, eyebrows, customizable clothing, and a disproportionately small body.

A mini-me isn’t the only AR Emoji on offer. Samsung partnered with Disney to bring 3D-rendered versions of Mickey Mouse, Minnie Mouse, and characters from Pixar’s The Incredibles to the phones.

Whichever model you choose, the camera app automatically generates 18 animated AR Emoji stickers in a shareable format (MP4). There’s a host of additional masks, filters, and accessories to choose from. And unlike Apple’s Animoji, which can’t be exported from iMessage, AR Emoji work in any app — be it a messaging service like WhatsApp, a social network like Facebook, or a plain old email.

AR Emoji crashed and burned during Samsung’s press event in Barcelona on Sunday, and they were a little stiff in our experience, too. The single front camera struggles to track head movements and mouth movements beyond a fairly narrow field of view, and if you don’t hold the S9/S9+ close to your face when you’re creating an AR Emoji, the resulting animation can be really janky.

Suffice it to say, AR Emoji aren’t quite as endearing as the 1-to-1-tracked, cute and cuddly characters on the iPhone X.


Bixby improvements

Digital lipstick, courtesy Bixby Vision.

Bixby, Samsung’s homegrown digital assistant, makes a return on the Galaxy S9 and S9+. The latest incarnation can be launched via the Galaxy S9 and S9+’s dedicated Bixby button (below the volume rocker on the left-hand side): a single press pulls up Bixby Home, a collection of cards that contain timely information. You’ll see the weather report, a preview of your commute (based on your location and proximity to your saved work/home address), upcoming alarms, and health information (like your step count) from S Health.

None of that’s new, but Bixby Vision, Bixby’s machine vision feature, is improved in a few key ways. An augmented reality feature overlays shades of lipstick, eyeshadow, and other makeup on your face, letting you “try on” beauty products before you purchase them through Sephora and Cover Girl. Bixby Vision now supports real-time translation a la Google Translate. And if you point Bixby Vision’s viewfinder at food, it’ll serve up the estimated calorie count and other nutritional data.

The “digital makeup” feature worked well in our testing (maybe too well), but we didn’t have an opportunity to try Bixby’s new food recognition or real-time translation features.

It’s worth noting that after the Galaxy S9 and S9+ ship in March, Bixby will gain additional features. In August, Bixby 2.0, which launched in public beta in December, will roll out to phones, Samsung mobile chief DJ Koh told members of the press at MWC 2018. It’ll recognize multiple voices and integrate tightly with TVs, refrigerators, and other connected home appliances.


Performance

The Galaxy S9 and S9+, like other recent flagship Samsung phones before them, ship with one of two systems-on-chip (SoCs). This time around, it’s Samsung’s Exynos 9810 or Qualcomm’s Snapdragon 845.

It’s worth diving into the technical weeds to get a better sense of the chips’ differences.

The Exynos 9810, the second SoC in the Exynos 9 series, is built on a 10nm FinFET process and adopts ARM’s DynamIQ architecture. It has four high-performance custom cores clocked up to 2.7GHz and four ARM Cortex-A55 cores clocked at 1.7GHz, and a wider pipeline with improved cache memory. Performance is substantially improved over the Exynos 8895 in the Galaxy S8 and S8+: Samsung says the Exynos 9810 is two times faster in terms of single-core performance and 40 percent faster in terms of multi-core performance.

The Exynos 9810 ships with the Mali-G72MP18 GPU, which has a slightly decreased core count compared to the Exynos 8895’s Mali-G71MP20, but improved per-core efficiency.

The chip’s Cat. 18 Gigabit modem supports download speeds over LTE up to 1.2Gbps thanks to 6X carrier aggregation (6CA), 4×4 MIMO, 256-QAM, and License-Assisted Access (eLAA), and it has neural network deep learning technologies that power Bixby’s image recognition features and 3D Emoji’s face-tracking filters. Finally, there’s a secure element that safeguards biometric data such as fingerprints, iris scans, and facial information.

The Qualcomm Snapdragon 845, which we recently benchmarked, is also built on a 10nm process and adopts ARM DynamIQ. It has eight custom Kryo cores: four Cortex-A75 “Gold” performance cores clocked up to 2.8GHz and four Cortex-A55 “Silver” efficiency cores clocked at 1.7GHz, which contribute to a 30 percent boost in overall performance and a 25 to 30 percent improvement in power-efficiency compared to the Snapdragon 835 in the Galaxy S8 and S8+.

On the visual processing side of things, the Snapdragon 845 packs the Adreno 630, Qualcomm’s latest GPU. It’s 30 percent faster and 30 percent more power-efficient than the Snapdragon 835’s Adreno 540, and it has 2.5 times the display throughput.

Diagram of the Snapdragon 845’s Hexagon DSP.

The Snapdragon 845’s other notable peripherals include the X20 modem, which supports Cat. 18 LTE download speeds up to 1.2Gbps, carrier aggregation, 4×4 MIMO, 256-QAM, and eLAA; the Hexagon DSP, a chip custom-designed for neural network workloads; and Qualcomm’s Secure Processing Unit, a secure element for biometric data.

Our Galaxy S9 and S9+ demo units had the Exynos 9810, and felt as swift and speedy as you’d expect. Switching between apps and juggling multiple tabs in Chrome was equally breezy, likely thanks to the 6GB of RAM in the Galaxy S9+ and 4GB of RAM in the S9.

That said, we’re reluctant to jump to any conclusions about performance without time to run the phones through their paces (i.e., perform benchmarking tests and our in-house suite of scripts). Already, preliminary results have shown that the Exynos 9810 performs unpredictably in the Galaxy S9+, and in the interest of fairness, we’re reserving judgment until we’ve had a chance to thoroughly investigate Samsung’s claims.

We’ve also yet to test the Galaxy S9 and S9+’s battery life. They have the same capacities as the Galaxy S8 and S8+, respectively: 3,000mAh and 3,500mAh. (Both support wireless charging and Samsung’s Adaptive Fast Charging.) Samsung says the Galaxy S9 gets up to 14 hours of internet use on Wi-Fi, 11 hours on 3G, and 12 hours on 4G; 16 hours of video playback; and 22 hours of talk time. It says the Galaxy S9+ gets up to 15 hours on Wi-Fi, 13 hours on 3G, and 15 hours on 4G; 18 hours of video playback; and 25 hours of talk time.

And we haven’t tested the storage’s read and write speeds. The Galaxy S9 and S9+ ship with 64GB of internal memory (up to 256GB) and a microSD slot that supports cards up to 400GB.


Software

The Galaxy S9 and S9+ ship with Samsung Experience 9.0 atop Android Oreo. Both are Project Treble compatible, which is great news for the modding community — in the future, we expect to see the Galaxy S9 and S9+ boot generic Android Open Source Project images (but only the Exynos models, which have unlockable bootloaders).

As far as Samsung Experience 9.0 is concerned, there isn’t much in the way of surprises. It began to roll out late last year as part of the Android Oreo beta to Galaxy S8 and S8+ participants in Samsung’s Beta Program, after which it launched more broadly in stable form. From what we can tell, Samsung Experience 9.0 on the S9 and S9+ is no different from the publicly available version, save for features like AR Emoji.

Source: SamMobile

The new and improved Samsung Keyboard adds a Google-style toolbar to the top row with shortcuts, a theme switcher, and a GIF creator. And Edge Lighting, a staple of Samsung’s curved-screen devices that shows alerts, text scrolls, and other peripheral information on the phone’s sides, has been enhanced with more lighting effects.

Source: SamMobile

The Samsung Experience 9.0 launcher adds support for Android Oreo’s Notification Dots and Adaptive Icons, along with a new color picker that lets you tweak the appearance of folders. Additionally, the lock screen has a new clock widget and an adaptive coloring option that changes the lock screen color to match your phone’s background.

If you nab the new $150 DeX Dock with your Galaxy S9/S9+, you’ll benefit from the new, higher-resolution (2,560 x 1,440) display output (double the previous DeX Dock’s resolution). Samsung says that more than 40 partners are optimizing their Android apps for the DeX Dock interface, but alternatively, you can take advantage of Samsung’s Linux on Galaxy feature and install a full-blown Linux distribution.


Conclusion

If it wasn’t obvious from the get-go, Samsung isn’t out to break new ground or shake up the smartphone industry with the Galaxy S9 and S9+. That much became clear in the hour we spent getting a handle on AR Emoji, putting the variable aperture to the test, and blasting sound through the stereo speakers. The S9 and S9+ are iterative in every sense of the word: The new processors are on a par with other flagship devices announced this year; the S9+’s upgraded RAM and secondary rear camera bring it up to speed with the competition; and the down-firing speakers merely improve on the S8 and S8+’s disappointing sound.

But iteration isn’t necessarily a bad thing. In fact, a Samsung rep told me that the company is well aware that most soon-to-be Galaxy S9 and S9+ owners will be upgrading from a Galaxy S7 or S7 edge. For them, the phones are a giant technological leap forward.

For current S8 and S8+ owners, though, or folks with a relatively new flagship such as the OnePlus 5T or LG V30, the incremental differences make the price tags hard to justify. At $720 and $840 for the S9 and S9+, respectively, they’re easily two of the most expensive phones on the market. Trade-in deals and monthly installment pricing help ease the burden a little, but no matter how you slice it, that’s a lot of moolah for variable aperture.

 

 



from xda-developers http://ift.tt/2HQSd2E
via IFTTT

Tuesday, February 27, 2018

[Update: Video] Samsung Brings Full Linux Distribution Support to DeX with “Linux for Galaxy”

Update 2/27/18: Samsung has posted a demonstration video that shows off Linux for Galaxy.

At the 2017 Samsung Developer Conference, Samsung made a lot of important announcements. The company announced Bixby 2.0 with smart home and third-party developer support, and announced a partnership with Google to bring ARCore to some Samsung Galaxy smartphones. However, the announcements didn’t end there. Samsung also unveiled an update to Samsung DeX with the addition of “Linux on Galaxy”, a feature which could add more value to the fledgling DeX platform and pull more developers into buying Samsung hardware.

DeX is a hardware accessory first sold by Samsung alongside the Samsung Galaxy S8 and the Galaxy S8+. It consists of a docking station that lets the user connect the Galaxy S8, Galaxy S8+, or Galaxy Note 8 to a monitor, keyboard, and mouse to get access to a full desktop UI. The dock has an HDMI port, an Ethernet port, and two USB ports. DeX is essentially an extension of Android Nougat’s multi-window support, used to push optimized applications onto the connected display. There are obvious limitations to this, as a lot of software needs to be updated to support Samsung’s DeX, but that’s changing.

Now, Samsung is launching the Linux on Galaxy feature, an app that enables running multiple GNU/Linux operating systems on a Samsung smartphone when connected to a DeX dock. There is already a way to run a GNU/Linux environment on any Android device, but it isn’t as sophisticated as Samsung’s implementation.

With DeX, you can have Ubuntu 16.04 or another distribution running through the DeX dock and its connected peripherals. As these Linux distributions are made for a desktop-oriented UI, Linux on Galaxy is a perfect fit for DeX, because DeX connects your smartphone to a much larger display.

We can expect this feature to be popular among developers who will be able to now set up a fully functional development environment with all the advantages of GNU/Linux. Samsung is hoping to pull more users away from their laptops/desktops into committing to their ecosystem, though Linux on Galaxy is still experimental. If you are interested in signing up, you can do so here.


Update: Video

As pointed out by OMG! Ubuntu! (via Android Police), Samsung recently uploaded a video demonstrating Linux programs such as Firefox, Thunderbird, Eclipse, and GIMP running through the feature. You can check it out below.


Source: Samsung



from xda-developers http://ift.tt/2ik7sc1
via IFTTT

Google is testing an Android P System Image with Android 8.1 Oreo Vendor Image on the Pixel 2

Upgrading an existing Android device to a new version of Android can be a long and arduous process, according to Sony. Part of the issue revolves around waiting for vendors (like Qualcomm) to provide device makers (like Sony) with updated HAL source code or binaries in order to work with the new version of Android. Thanks to Project Treble, device makers can start work on the next Android version much more quickly, at least that’s the idea behind it.


Credits: Google

We’ve talked ad nauseam about the potential benefits of Treble for custom ROM development, with many Treble-compatible devices now capable of enjoying ROMs such as LineageOS 15.1, CarbonROM, and more. But there’s one question that has always lingered in the back of our minds: what happens when Android P rolls around? Will we be able to flash an Android P Generic System Image (GSI) on top of a device with an Android 8.1 Oreo vendor image? This is a question that nobody has been able to truly answer, since Android P source code is not available (and thus, an Android P GSI cannot be built), so some developers were skeptical of this ever happening.

However, a new commit suggests that Google is testing exactly that on the Google Pixel 2.

Android P GSI on Google Pixel 2 with 8.1 Oreo Vendor Image

What is being shown here is that Google is updating the Vendor Test Suite (VTS) to allow for testing an Android P GSI with an Android 8.1 Oreo vendor image. The device this is being tested on is the Google Pixel 2 (the “wahoo” device). Google tests that this configuration does in fact boot, which is a requirement for passing the VTS.

What does this mean for us? Unfortunately, it’s hard to extrapolate. We can’t say this proves that any upcoming device launching with Android 8.1 Oreo (such as the Huawei P20 or Xiaomi Mi Max 3) will be able to boot up an Android P GSI out of the box, since we don’t have more information nor do we have an Android P system image to test with. At the very least, this shows that work is progressing nicely on Treble, and once Android P source code eventually drops, we can finally put these claims to test.



from xda-developers http://ift.tt/2BWzEtB
via IFTTT

Google Photos v3.15 prepares to let users Export Motion Photos as GIFs and Like Photos in Shared Albums

Google Photos v3.15 has begun rolling out to users on the Play Store, and as usual, Google has not published an official changelog. While live changes are few, we did an APK teardown of the app and found that Google is planning to allow users to like photos in shared albums. The company may also be planning to allow users to export Motion Photos as GIFs. Let’s take a look at the changes one by one:

An APK teardown can often predict features that may arrive in a future update of an application, but it is possible that any of the features we mention here may not make it into a future release. This is because these features are currently unimplemented in the live build and may be pulled at any time by Google in a future build.


Like photos in shared albums

<string name="photos_hearts_viewbinder_user_liked_a_photo">%s liked a photo</string>
<string name="photos_hearts_viewbinder_user_liked_a_video">%s liked a video</string>

The strings indicate that users will soon be able to “like” photos in shared albums. Shared albums are albums that are shared between different users. The “like” feature will be part of an upcoming “Favorites” feature that has shown up in strings for a while now. Just like the “Favorites” feature, the “like” feature hasn’t gone live yet.

Google Photos Like Photo Shared Album

Google Photos Shared Album

Export as a GIF

<string name="photos_microvideo_actionbar_beta_export_as_dialog_export_button">Export</string>
<string name="photos_microvideo_actionbar_beta_export_as_dialog_gif">GIF</string>
<string name="photos_microvideo_actionbar_beta_export_as_dialog_photo">Still photo</string>
<string name="photos_microvideo_actionbar_beta_export_as_dialog_stabilization_checkbox">Keep stabilization (content is trimmed)</string>
<string name="photos_microvideo_actionbar_beta_export_as_dialog_title">Export as</string>
<string name="photos_microvideo_actionbar_beta_export_as_dialog_video">Video</string>
<string name="photos_microvideo_actionbar_beta_export_as_gif_success_toast_text">GIF exported</string>
<string name="photos_microvideo_actionbar_beta_export_as_gif_success_toast_view_result">View</string>
<string name="photos_microvideo_actionbar_beta_export_as_menu_item">Export</string>
<string name="photos_microvideo_actionbar_beta_export_error_toast_text">Failed to export</string>

These strings are most likely related to Motion Photos. As of now, Motion Photos can be exported as either still photos or videos. Soon, users will be able to export them as GIFs as well. This feature is hardly groundbreaking, but when it goes live, it will let users save Motion Photos in a standard, widely supported format, making them much easier to share.

Google Photos Motion Photos

Current export options for Motion Photos


Let us know in the comments if you spot anything new, and follow our APK Teardown tag for more articles like this!



from xda-developers http://ift.tt/2FEYH48
via IFTTT

Asus Takes the Wraps off the Asus 5Q, Asus 5, Asus 5Z, and Asus Max M1

Asus announced the ZenFone 5 series at Mobile World Congress 2018, a lineup of new high-end, mid-range, and low-end devices: the ZenFone 5Z, which features Qualcomm’s Snapdragon 845 chip; the ZenFone 5, a cheaper, slightly less powerful variant of the ZenFone 5Z; and the ZenFone 5Q, which packs four cameras. Here’s everything you need to know.


Asus ZenFone 5Z

The Asus ZenFone 5Z, the undisputed flagship of Asus’s 2018 ZenFone lineup, features a 90% screen-to-body ratio, “premium materials”, and a small, compact glass-covered body comparable in size to a traditional 5.5-inch phone. Its top-of-the-line features include Qualcomm’s Snapdragon 845, 6GB of RAM, 64GB of UFS 2.1 storage, a dual camera module, and a 2.5D 6.2-inch Full HD+ (2160 x 1080) 19:9 screen.

The above-mentioned screen, which has an Essential Phone-like notch at the top to accommodate the front-facing camera, supports the DCI-P3 color space and wide color gamuts, and taps Asus’s intelligent display technology to adjust the color temperature automatically in response to ambient light changes. Screen On, another handy feature, prevents the phone’s display from turning off while you’re glancing at it.

The Asus 5Z’s vibrant screen is complemented by three noise-canceling microphones and two five-magnet speakers in stereo configuration, driven by dual amplifiers. The handset supports Hi-Res Audio and files encoded up to 24-bit/192KHz, and DTS’s Headphone-X technology for 7.1-channel virtual surround sound on supported headphones.

The Asus 5Z doesn’t just pack a powerful screen and stereo speakers. It also has a dual rear camera with Sony’s IMX363 sensor and a six-element lens, plus an 8MP front-facing camera with an f/2.0 aperture. The 12MP shooters, which have an f/1.8 aperture and 1.4µm pixel size, tap dual-pixel phase detection autofocus (PDAF) that takes just 0.03 seconds to focus, and a four-axis optical image stabilization system that reduces blur.

One of the sensors has a 120-degree wide-angle lens, and both take advantage of a “night HDR” mode that delivers up to 5x brighter and clearer photos. On the video side of things, the Asus 5Z can capture smooth, jitter-free 4K UHD clips at 60FPS (or 1080p at 30/60FPS), stabilized with the help of three-axis electronic image stabilization (EIS).

Qualcomm’s Snapdragon 845 is at the heart of the Asus 5Z, and it doesn’t disappoint. The chip comprises Qualcomm’s latest Kryo 385 core design, and packs the chipmaker’s new Adreno 630 GPU, optimized-for-AI Hexagon DSP, and Spectra 280 ISP. We’ve covered it at length, and from what we know so far, it’s no less capable in the Asus 5Z than in any of the smartphone’s competition.

The Asus 5Z leans on a variety of new AI-powered features in Asus’s new ZenUI 5.0. An energy-saving service called AI Charging automatically adjusts the phone’s charging rate in response to your usage habits (Asus claims it “slows down the battery aging process”), and AI Ringtone tweaks the phone’s call volume in response to ambient noise.

When it comes to photos, AI Scene Detection uses algorithms to analyze subjects in real time and match them to one of 16 scene types, optimizing for different lighting conditions. AI Photo Learning identifies your go-to camera settings over time and adjusts the defaults accordingly. Real-time Portrait Mode produces a bokeh effect, blurring the background of photos while keeping the foreground intact. And Real-Time Beautification brightens your skin tone, removes stress lines, and applies other digital enhancements based on 365 facial points. It works in real time, even in live-streaming video apps.

Other ZenFone 5Z highlights include super-fast biometric security features (the phone’s facial authentication can unlock it in 0.1 seconds, and its rear fingerprint sensor can unlock it in 0.3 seconds), and ZenMoji, a feature akin to Samsung’s AR Emoji: Cute characters respond to head and mouth movements captured via the phone’s front-facing camera. ZenMoji can be used in video and text chats and live-streaming, or you can add voice recordings to them via the microphone.

In terms of other internals, the ZenFone 5Z has 802.11a/b/g/n/ac Wi-Fi, Bluetooth 5.0, NFC, an FM radio, and a USB Type-C connector. It starts at $499 and ships with Android Oreo.


Asus ZenFone 5

The Asus 5, the Asus 5Z’s mid-range counterpart, is nearly identical to the 5Z. It’s got the same “premium materials”, compact body, 2.5D 6.2-inch Full HD+ (2160 x 1080) LCD, stereo speakers, and Android Oreo with Asus’s ZenUI 5.0. But it’s not a carbon copy.

One of the key differences between the Asus 5 and the Asus 5Z is the processor: while the Asus 5Z has Qualcomm’s Snapdragon 845, the Asus 5 has a Snapdragon 636 (paired with an Adreno 509 GPU). Asus says that Snapdragon 636, a 64-bit system-on-chip that was announced a few months back, delivers 40 percent faster CPU performance and 10 percent better graphics performance than the Snapdragon 630, the SoC that powers the ZenFone 5Q.

The ZenFone 5 also has slightly less capable shooters than the ZenFone 5Z — its dual rear cameras lack phase detection autofocus and four-axis OIS. But like the Asus 5Z, they’re capable of shooting in up to 4K at 60FPS, and both the front and rear cameras are three-axis stabilized (with EIS).

Asus didn’t announce pricing for the Asus 5, but said it’ll come in two configurations — one with 4GB of RAM and one with 6GB of RAM (both with 64GB of storage and microSD slots) — when it ships later this year. It’ll also be available in two colors: Midnight Blue and Meteor Silver.


Asus ZenFone 5Q

The Asus ZenFone 5Q (or ZenFone 5 Lite, depending on the region) is a slight step down from the ZenFone 5 and 5Z, but it’s far from a bare-bones budget phone. It boasts a four-camera module with a 20MP rear sensor and a 16MP front sensor, dual internal microphones with noise-canceling tech, an FM radio, a 120-degree wide-angle lens, and a 6-inch Full HD+ (2160 x 1080) 19:9 screen.

The cameras are the headliners. The rear and front sensors are 20MP with f/2.0 aperture and 16MP with f/2.2 aperture, respectively, and one sensor in each pair has a 120-degree wide-angle lens. Uniquely, Asus says all four can be controlled independently of one another; using the pre-loaded camera effects doesn’t require using the secondary sensor in conjunction with the main camera. (It’s a bit unclear how that’ll work in practice.)

The rear cameras can record in 4K resolution with three-axis EIS.

The processor — a Qualcomm Snapdragon 630 — isn’t quite as powerful as the Snapdragon 636, but it’s capable in its own right. Asus says that its power efficiency (thanks in part to a FinFET Low Power Plus manufacturing process), combined with the Asus 5Q’s 3,300mAh battery, delivers up to 24 days of 4G standby time and 4 days of music playback.

It’ll be available in a 4GB RAM/64GB storage model (expandable with a microSD card) later this year in Midnight Black, Moonlight White, and Rouge Red. Like the ZenFone 5 and 5Z, it’ll ship with Android Oreo and Asus ZenUI 5.0.


Asus Max M1

Lastly, Asus announced the Max M1, the latest model in the ZenFone Max series. The budget handset has a 5.5-inch “full-view” display, a 4,000mAh battery, dual rear cameras (one of which has a wide-angle lens), and a fingerprint sensor. Additional details were hard to come by as of publication time, but we’re expecting to learn more about the Asus Max M1 in the coming days.



from xda-developers http://ift.tt/2t2oGQg
via IFTTT