The LG G2 is the spiritual successor to the Optimus G, a device we looked at last year that eventually went on to become the Nexus 4. LG dropped the "Optimus" branding this time, but the G2 is without a doubt still LG's flagship smartphone, and it includes a number of unique LG features: a stacked 3000 mAh (11.4 Whr) battery with SiO+ anode, a 5.2-inch 1080p LCD from LG Display, and a 13 MP rear-facing camera with OIS (Optical Image Stabilization). It's an impressive combination of features that makes the G2 a standout device. At the same time, the G2 is our first chance to look at the 2.3 GHz bin of Snapdragon 800 inside a shipping device and evaluate its performance and battery life.
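For reference, the Whr figure is just capacity times nominal cell voltage; a quick sanity check against LG's quoted 3.8 V chemistry:

```python
# Back-of-envelope check: energy (Wh) = capacity (Ah) x nominal voltage (V).
capacity_mah = 3000
nominal_v = 3.8  # nominal voltage LG quotes for the SiO+ anode cell

energy_wh = (capacity_mah / 1000) * nominal_v
print(f"{energy_wh:.1f} Wh")  # 11.4 Wh, matching the spec sheet
```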

We took a quick look at the G2 at the announcement event; now we have our hands on one, and after putting it through its paces, benchmarking it, and running battery life tests for a little under a week, we wanted to share some thoughts.

Hardware Impressions

The G2 marries a curved backside with front glass that curves slightly at the edges and a very narrow side bezel. The G2 also opts for on-screen buttons rather than the discrete capacitive kind or the physical buttons that went out of favor a while ago. The reality is that Google has a fair amount of input into at least this part of the Android ecosystem, and its guidance seems to be that on-screen buttons, which use display real estate to draw the navigation keys, are the recommended way to go. The G2 does, however, let you add Quick Memo or notification shade pull-down/pull-up buttons to the bar, though oddly enough there's no multitasking button option available.

The G2 manages to include a large display without being much wider than other devices I've been using lately, like the HTC One. Part of keeping the edge bezel small was eliminating the volume and power buttons from the sides; they're moved instead to the back of the G2, perhaps its most striking and initially even alarming design change.

Holding down the volume down button launches you into the camera, pressing the center button powers on the phone, and holding down the top button launches QuickMemo. Up and down are volume up and down otherwise. There's a hard raised lip on both sides of the buttons too, so when the G2 is laid backside down on a surface it makes contact there instead of on the buttons – it won't inadvertently turn on when pressed against a table. I found the backside buttons easy to adapt to after my first few interactions with the G2, and they actually become second nature after a day or so. The raised bump for the power button makes it easy to locate with the index finger, and I haven't smeared or accidentally put my finger on the sapphire camera cover yet. If the rear power button is still difficult to get used to, the G2 has a double tap to turn on feature it calls "knock knock" – double tap on the display, and the G2 will turn on; repeat the double tap on the status bar or in an empty part of the display when it's on, and it turns off. I find myself using the double tap gesture quite a bit to turn the G2 on and off. I believe this functionality uses the onboard sensors and the DSP inside 8974 to detect when the taps occur.
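LG hasn't documented how the tap detection works, but the basic gesture logic – two taps landing close together in both time and space – can be sketched roughly. The thresholds below are hypothetical, not LG's values:

```python
from dataclasses import dataclass

@dataclass
class Tap:
    t: float  # timestamp in seconds
    x: float  # touch position in pixels
    y: float

def is_double_tap(first: Tap, second: Tap,
                  max_interval: float = 0.4,    # hypothetical: 400 ms window
                  max_distance: float = 80.0    # hypothetical: 80 px radius
                  ) -> bool:
    """Return True when two taps are close enough in time and space."""
    dt = second.t - first.t
    dist = ((second.x - first.x) ** 2 + (second.y - first.y) ** 2) ** 0.5
    return 0 < dt <= max_interval and dist <= max_distance

print(is_double_tap(Tap(0.0, 100, 200), Tap(0.25, 110, 205)))  # True
print(is_double_tap(Tap(0.0, 100, 200), Tap(0.9, 110, 205)))   # False: too slow
```

The appeal of running this on the DSP rather than the application processor is that the CPU can stay asleep while the screen is off.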

Gallery: LG G2

The G2 I was sampled with is a dark blue color with a slight pinstripe on the back, as shown in the photos above. The material, however, is the same kind of glossy plastic I'm used to seeing from the Korean handset makers of note, and it unfortunately picks up fingerprints and hand oil very quickly. I like the shape of the device and LG's innovations; it's just puzzling to me that material choice hasn't improved yet – I'd even take the glass of the Optimus G over this plastic. I'll spare you the huge discussion on device size as well: I'm fine with larger smartphones that aren't quite phablets, and the G2 for me is totally usable; I appreciate the increased display size. It definitely isn't phablet-sized, but it is on the larger side of high-end smartphones.

  LG G2
SoC Qualcomm Snapdragon 800 (MSM8974)
4x Krait 400 @ 2.3 GHz, Adreno 330 GPU
Display 5.2-inch IPS LCD 1920x1080 Full HD
RAM 2 GB LPDDR3-1600 (800 MHz)
WiFi 802.11a/b/g/n/ac, BT 4.0
Storage 32 GB internal
I/O microUSB 2.0, 3.5mm headphone, NFC, Miracast, IR
OS Android 4.2.2
Battery 3000 mAh (11.4 Whr) 3.8 V stacked battery
Size 138.5 x 70.9 x 9.14 mm
Camera 13 MP with OIS and Flash (Rear Facing)
2.1 MP Full HD (Front Facing)

 

Display

  • Krysto - Sunday, September 8, 2013 - link

    Cortex A9 was great efficiency-wise, and better perf/Watt than what Qualcomm had available at the time (S3's Scorpion), but Nvidia still blew it with Tegra 3. So no, that's not the only reason. Nvidia could do certain things to increase efficiency and performance/Watt – move to a smaller node, keep GPU clocks low but add more GPU cores, and so on. But they aren't doing any of that.
  • UpSpin - Sunday, September 8, 2013 - link

    You mean they could and should have released more iterations of Tegra 3, adding more and more GPU cores to improve at least the graphics performance, rather than waiting for A15 and Tegra 4.

    I never designed an SoC myself :-D so I don't know how hard it is, but I did lots of PCBs, which is practically the same except on a much larger scale :-D If you add some parts you have to increase the die size, thus move other parts on the die around, reroute everything, etc. So it's still a lot of work. The main bottleneck of Tegra 3 is memory bandwidth, so adding more GPU cores without addressing the memory bandwidth most probably would not have made any sense.
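The bandwidth point is easy to illustrate with rough numbers. Even plain framebuffer traffic at a Tegra 3-class tablet resolution takes a visible slice of a single-channel memory interface, and real rendering multiplies it; all figures below are illustrative assumptions, not measurements:

```python
# Rough framebuffer traffic for a Tegra 3-class tablet (all figures assumed).
width, height = 1280, 800      # typical Tegra 3 tablet panel
bytes_per_px = 4               # RGBA8888
fps = 60

frame_bytes = width * height * bytes_per_px
scanout_gbps = frame_bytes * fps / 1e9   # one full-frame pass per refresh
print(f"one pass: {scanout_gbps:.2f} GB/s")

peak_gbps = 6.4  # assumed single-channel peak; the real figure varies by SKU
# Rendering adds writes, overdraw, and texture fetches -- easily several
# more full-frame passes per refresh.
passes = 5  # illustrative
print(f"{passes} passes: {passes * scanout_gbps:.2f} GB/s of {peak_gbps} GB/s peak")
```

With blending, overdraw, and texturing on top, more GPU cores would simply stall waiting on the same memory interface.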

    They probably expected to ship Tegra 4 SoCs sooner, so they saw no need to release a thoroughly improved Tegra 3 and focused on Tegra 4 instead.

    And if you compare Tegra 4 to Tegra 3, then they did exactly what you wanted, moving to a smaller node, increasing the number of GPU cores, moving to A15 while maintaining the power efficient companion core, increasing bandwidth, ...
  • ESC2000 - Sunday, September 8, 2013 - link

    I wonder whether it is more expensive to license ARM's A9, A15, etc. (I thought they were doing an A12 as well?) or to develop your own core like Qualcomm does. Obviously QCOM isn't starting from scratch every time, but R&D adds up fast.

    This isn't a perfect analogy at all but it makes me think of the difference between being a pharmaceutical company that develops your own products and one that makes generic versions of products someone else has already developed once the patent expires. Of course now in the US many companies that technically make their own products from scratch really just take a compound already invented and tweak it a little bit (isolate the one useful isomer, make the chiral version, etc), knowing that it is likely their modified version will be safe and effective just as the existing drug hopefully is. They still get their patent, which they can extend through various manipulations like testing in new populations right before the patent expires, but the R&D costs are much lower. Consumers therefore get many similar versions of drugs that rely on one mechanism of action (see all the SSRIs) and few other choices if that mechanism does not work for them. Not sure how I got off into that but it is something I care about and now maybe some Anandtech readers will know haha.
  • krumme - Sunday, September 8, 2013 - link

    Great story mate :), I like it.
  • balraj - Saturday, September 7, 2013 - link

    My first comment on Anandtech
    The review was cool... I'm impressed by the G2's battery life and camera...
    Wish Anandtech could have a UI section
    Also can you ppl confirm if LG will support the G2 with at least 2 yrs of software updates
    That's gonna be the deciding factor in choosing between the G2 or Nexus 5 for most of us!
  • Impulses - Saturday, September 7, 2013 - link

    Absolutely nobody can guarantee that; even if an LG exec came out and said so, there's no guarantee they wouldn't change their mind or a carrier wouldn't delay/block an update... If updates are that important to you, then get a Nexus, end of story.
  • adityasingh - Saturday, September 7, 2013 - link

    @Brian could you verify whether the LG G2 uses the Snapdragon 800 MSM8974 or the MSM8974AB?

    The "AB" version clocks the CPU at 2.3 GHz, while the standard version tops out at 2.2 GHz. However, you noted in your review that the GPU is clocked at 450 MHz. If I recall correctly, the "AB" version runs the GPU at 550 MHz, while the standard is 450 MHz.

    So in this case the CPU points to one bin, but the GPU points to another. Can you please confirm?
    Nice "Mini Review" otherwise. I'm looking forward to the full review soon. Please include a throttling analysis like the one from the Moto X. It would be nice to see how long the clocks stay at 2.3 GHz :)
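The bin question can be framed as a simple decision on the shipping clocks. This hypothetical helper uses only the frequency pairings quoted in the comment above (2.2 GHz/450 MHz for standard 8974, 2.3 GHz/550 MHz for 8974AB), which are the commenter's recollection and not confirmed figures:

```python
def guess_8974_bin(cpu_max_ghz: float, gpu_max_mhz: int) -> str:
    """Guess the Snapdragon 800 bin from shipping max clocks.

    Frequency pairings are taken from the comment above and are
    unconfirmed: standard 8974 = 2.2 GHz CPU / 450 MHz GPU,
    8974AB = 2.3 GHz CPU / 550 MHz GPU.
    """
    cpu_ab = cpu_max_ghz >= 2.3
    gpu_ab = gpu_max_mhz >= 550
    if cpu_ab and gpu_ab:
        return "MSM8974AB"
    if not cpu_ab and not gpu_ab:
        return "MSM8974"
    return "mixed signals: CPU says %s, GPU says %s" % (
        "AB" if cpu_ab else "standard",
        "AB" if gpu_ab else "standard")

# The G2 as reviewed: a 2.3 GHz CPU but a 450 MHz GPU -- exactly the
# ambiguity the comment raises.
print(guess_8974_bin(2.3, 450))
```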
  • Krysto - Sunday, September 8, 2013 - link

    He did mention it's the former, not the latter.
  • neoraiden - Saturday, September 7, 2013 - link

    Brian, could you comment on how the Lumia 1020 compares to a cheap ($150-200) camera? I was impressed by the difference in colour in the video comparison, even if the OIS wasn't the best.

    I currently have a Note 2, but the camera quality in low light is just too bad, and the inability to move apps to my memory card has been annoying. I have an upgrade coming up in January, I think, but I might try to change phones before then. I was wondering whether you could comment on whether the Lumia 1020 is worth the jump from Android for the picture quality, or will an HTC One or Nexus 5 (if similar to the G2) suffice? I was considering the Note 3 as I like everything else about it, but it still doesn't have OIS – or would the Note 3 plus a cheap compact be better, even given the inconvenience of having to carry a camera?

    The main day-to-day use of my phone is news apps, Internet, and email, some of it threaded (which I hear is a problem for Windows Phone).
  • abrahavt - Sunday, September 8, 2013 - link

    I would wait to see what camera the Nexus 5 has. An alternative is to get the Sony QX100; you'd get great pictures irrespective of the phone.
