29 Comments


  • djgandy - Tuesday, February 15, 2011 - link

    "Most PowerVR SGX 540 designs run the GPU core at up to 200MHz. Exynos' implementation is another 50% faster. LG's software build also uses a newer version of Imagination Technologies' driver (1.6 instead of 1.5) which fixes some rendering issues and improves performance considerably (likely between 10 - 30% in GLBenchmark2)."

    Isn't Exynos the Orion chip with the Mali GPU, though? It's confusing what you mean here. Are you saying the OMAP you tested ran at 300MHz, or are you making a comparison to the Exynos?
  • metafor - Tuesday, February 15, 2011 - link

    Without commenting too much on what OMAP4 has its GPU clocked at, the dual memory controller can account for a lot of the performance difference we see.
  • djgandy - Tuesday, February 15, 2011 - link

    Of course, I was just trying to make the article make sense :-)

    How sure are you that the 540 is running at 300MHz? The article says ~300MHz. With all that extra DDR2 memory bandwidth, it wouldn't be impossible for a 200MHz part to gain the 20% or so over Hummingbird. It has a beefier CPU too.
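As a quick sanity check on the figures quoted in the article, the clock and driver gains can be combined (a rough sketch, assuming the two gains multiply, which only holds for a fully GPU-bound workload):

```python
# Figures quoted in the article: SGX 540 baseline ~200MHz, Exynos ~300MHz,
# plus a 10-30% driver improvement (IMG driver 1.6 vs 1.5).
clock_gain = 300 / 200                    # the "another 50% faster" claim
driver_low, driver_high = 1.10, 1.30
combined_low = clock_gain * driver_low    # ~1.65x
combined_high = clock_gain * driver_high  # ~1.95x
print(f"combined speedup: {combined_low:.2f}x - {combined_high:.2f}x")
```

If memory bandwidth rather than the GPU is the bottleneck, the real gap would land somewhere below these multiplied figures.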
  • Alexvrb - Tuesday, February 15, 2011 - link

    I'd be inclined to agree. But then again, testing performance of mobile devices is even messier than on PCs. Maybe there's something else at work here.
  • Brian Klug - Tuesday, February 15, 2011 - link

    Oops, OMAP4 is what's meant.

    Fixed!

    -Brian
  • ssj4Gogeta - Tuesday, February 15, 2011 - link

    I remember seeing a Samsung slide (probably here on AnandTech) which said the Galaxy S II had a quad-core SGX 544 GPU?
  • ssj4Gogeta - Tuesday, February 15, 2011 - link

    Here it is:
    http://www.anandtech.com/Gallery/Album/946#24
    The 24th slide. I also remember seeing the name SGX 544; that must be in one of those slides too.
  • djgandy - Wednesday, February 16, 2011 - link

    It's an ARM Mali, not an SGX.
  • puffpio - Tuesday, February 15, 2011 - link

    How does the 3D effect compare to a 3DS? Is it similar technology?
  • Brian Klug - Tuesday, February 15, 2011 - link

    Indeed, both are parallax barrier autostereoscopic 3D displays and operate very similarly.

    -Brian
  • MobiusStrip - Tuesday, February 15, 2011 - link

    Unfortunately, those lenses are way too close together to create proper (meaning human) 3-D.

    After 150 years or so of stereoscopic photography, it's incredible to see this blunder being made. Standard human interocular spacing is 64mm (about 2.5 inches). Look at this camera: I doubt the spacing is even half that.

    And before someone jumps in with "No, because of the field of view of these lenses...": that doesn't matter. It's not the field of view that matters; it's the POINT of view. If you're 13 inches away from seeing around a corner and you move one foot toward it, you still can't see around the corner. If you move a foot and a half, you can. Putting wide-angle glasses on wouldn't have let you see around it without moving.

    Same thing with 3-D cameras. Lenses one inch apart will not see the same objects or parts of objects that lenses 2.5 inches apart will, regardless of their field of view.

    This camera is BS.
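The baseline argument above can be put in rough numbers. A minimal pinhole-stereo sketch (the ~25mm camera baseline used here is an eyeballed guess from photos of the phone, not a published spec):

```python
import math

def angular_disparity_deg(baseline_m, distance_m):
    """Angle subtended at the subject by the two viewpoints, in degrees."""
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))

# Human interocular spacing (~64mm) vs a guessed ~25mm camera baseline,
# for a subject 2m away:
human = angular_disparity_deg(0.064, 2.0)
camera = angular_disparity_deg(0.025, 2.0)
print(f"human: {human:.2f} deg, camera: {camera:.2f} deg")
# Disparity scales with the baseline and is independent of the lenses'
# field of view, which is the crux of the point-of-view argument above.
```

With these numbers the camera sees less than half the angular disparity a human would at the same distance, whatever lenses it carries.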
  • bplewis24 - Tuesday, February 15, 2011 - link

    So, Tegra 2 had the overall performance crown for, what... a few days? Things are heating up!

    Brandon
  • Conficio - Tuesday, February 15, 2011 - link

    I wish LG would offer the fast processor with the tempting NOVA screen of the Optimus Black. I don't care for 3D gimmicks. But I'm in the market for a dual-core smartphone that has a screen with good resolution and visibility in daylight (preferably matte).

    I don't know why the Black doesn't seem to have a dual-core processor and fast memory channels.
  • j.harper12 - Tuesday, February 15, 2011 - link

    This is the real reason I wanted the Optimus 3D; I hoped the benchmarks would prove to be near the top of the heap. 3D will be a fun feature, but really just an ancillary benefit. Sadly, I doubt I will ever get this phone; it doesn't look like a CDMA variant is going to be released. Please, please, please correct me if I'm wrong, folks. I would love to pick this phone up on Sprint.
  • khimera2000 - Tuesday, February 15, 2011 - link

    Ouch, so Tegra 2 is no longer the fastest. That's not good, but at the end of the day I doubt that anyone will really notice a difference with the latest offerings. I'm personally looking forward to seeing more about the new Snapdragon chips.

    Is there a way for AnandTech to maybe come up with a chart that shows the evolution of these chips? Maybe a timeline showing offerings, and what new capabilities were added in each generation. It would be interesting to see how much performance I lose or gain based on generation and vendor.
  • Ushio01 - Tuesday, February 15, 2011 - link

    So mobile phones now have a choice of five GPU makers: Broadcom, ARM, Qualcomm, Nvidia and Imagination Technologies. I only have one thing to say: hurry up and scale these chips up so we can use them in laptops, as the current AMD and Nvidia chips aren't cutting it in the power-to-performance department.
  • silverblue - Tuesday, February 15, 2011 - link

    Pardon my ignorance, but I didn't know the first two of those made any GPUs. Regardless, I'm not quite sure scaling them up to laptop/netbook usage would bring about the performance you're expecting. They still have Atom, Nano and especially Brazos in the way first... and taking Brazos from a single-channel memory interface to dual channel would make for a tangible performance boost.
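To get a sense of what the dual-channel point is worth, peak theoretical bandwidth scales directly with bus width. A back-of-envelope sketch (assuming Brazos-era DDR3-1066; real-world gains are smaller than the theoretical doubling):

```python
def peak_bw_gb_s(bus_width_bits, transfers_per_sec):
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * transfers_per_sec / 1e9

single = peak_bw_gb_s(64, 1066e6)   # one 64-bit DDR3-1066 channel
dual = peak_bw_gb_s(128, 1066e6)    # two channels
print(f"single: {single:.1f} GB/s, dual: {dual:.1f} GB/s")
```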
  • dagamer34 - Tuesday, February 15, 2011 - link

    ARM's GPU is called Mali, and it's in the newly announced Exynos SoC from Samsung.
  • bhtooefr - Sunday, April 10, 2011 - link

    I'm replying to an ancient post, I know, but...

    Get an Atom Z-series netbook, with the GMA 500. That gets you an SGX 535 in a laptop.
  • Stuka87 - Tuesday, February 15, 2011 - link

    With the two cameras being so close to each other, how good is the 3D photography? I would think that there would not be much depth because of how close they are, whereas something like the 3DS has them quite a bit farther apart.
  • MobiusStrip - Tuesday, February 15, 2011 - link

    This camera will not produce proper 3-D (meaning 3-D as humans see it).

    It can't, because it doesn't have the points of view that normally spaced human eyes do. Their field of view doesn't make any difference; the disparity between the images isn't enough to create a realistic 3-D effect.
  • name99 - Tuesday, February 15, 2011 - link

    "I don't expect there to be a measurable performance advantage today due to MPE as I'm not aware of any substantial (or any?) NEON code in the apps we test with."

    I know this comment was made about Android, but I think it is worth pointing out that Apple does make use of NEON code in iOS. At the most obvious level, there are Apple libraries (part of the iOS API) for various numerical code, like FFTs and linear algebra, that utilize NEON.

    While much of the numerical processing you'd expect in video/audio playback and core animation is presumably done in ASICs or on the GPU, I suspect that the voice recognition stuff, for example, (perhaps also some HDR stuff?) is done on NEON.
    (Or perhaps not yet --- damn that HDR stuff is slow, and maybe with iOS 5 we'll get a nice speed boost as it moves onto either NEON or the GPU?)
  • metafor - Tuesday, February 15, 2011 - link

    A lot of Android's UI library (prior to Honeycomb) is, I believe, done in NEON. Other than that, I'm not really aware of much. Then again, other than rendering websites, I don't know of many CPU-intensive tasks on phones at all.

    As phone software becomes more demanding, there may be more uses for NEON in code that's not easily offloaded to the GPU/DSP.
  • Vlad T. - Tuesday, February 15, 2011 - link

    Any chance to run Flash benchmark and see how it performs?
    http://images.anandtech.com/graphs/graph4177/35414...

    Thank you,
    Vlad.
  • Brian Klug - Wednesday, February 16, 2011 - link

    Sadly no, the Optimus 3D I spent a lot of time with didn't have Flash installed. We could've installed it from the Market, but it's possible that Adobe and TI will work on something, so it's hard to tell whether the marketplace-based Flash would've been representative.

    -Brian
  • jeffrey - Tuesday, February 15, 2011 - link

    Does the LG Optimus 3D actually use 3 separate CMOS sensors? I see what appears to be two on the back of the phone; is there a front-facing camera also? Does anyone know if this is physically two separate sensors or some type of new trick with 2 lenses and one sensor?

    I wonder what the specs of each camera are: two 5MP sensors in back and one VGA in front?
  • kenyee - Tuesday, February 15, 2011 - link

    Surprised they didn't use Nvidia's Tegra 3D chip instead, but this definitely raises the bar for the Tegra 3D :-)
  • Amon - Tuesday, February 15, 2011 - link

    Hi,

    I found these tests very interesting, but I'm curious as to whether the testing software is able to fully utilize and test both CPU cores and, in the case of the Samsung Galaxy S II, all four GPU cores.

    Should we expect to see the dual-core CPUs and the quad-core GPU do better with newer versions of the testing software?

    Also, do you have any comment on the difference between the Optimus 2X and the Atrix in most tests? I had expected the Atrix to beat the Optimus 2X.

    Thanks.
  • TareX - Wednesday, February 16, 2011 - link

    You need to consider the fact that the Atrix has a higher-resolution screen and is running an older version of Android :)
