Low and Medium Gaming on the ID49C

The benchmarks we run in our gaming suite should provide some fairly interesting results, particularly in placing the NVIDIA GeForce GT 330M's performance relative to the 325M and 335M. The 325M sports the same 48 "CUDA cores" and 128-bit memory bus as the 330M, but at lower clock speeds, while the 335M moves up the food chain and uses a harvested GeForce GT 240 die, running at slightly lower clocks than the 330M but bumping up to 72 shader cores.

There's a caveat to these results, however: for StarCraft II we had to manually tweak the display drivers to get Optimus to work with it (before we did, the game ran at a constant 4fps, which is slow even for Intel's HD Graphics). There also seems to be a bug with Optimus in Left 4 Dead 2 using the latest 260.63 drivers, where setting the game to 4xAA results in a blank screen. Unfortunately, we couldn't get the ID49C to run L4D2 with 4xAA on the 258.96 drivers either, something that wasn't an issue on the ASUS N82Jv. Those test results (for our "High" preset) are mostly academic, as the 330M really isn't powerful enough to be pushing anti-aliasing anyhow, but the issue bears mentioning nonetheless.

Of course, all of this is going to be rendered fairly meaningless soon enough: GeForce 400M series parts should start trickling in shortly, and hopefully they'll bring the kind of performance improvements NVIDIA has desperately needed on the mobile front. Optimus is a great technology that AMD just doesn't have a counter for right now, but AMD's mobile parts are generally faster in their respective classes. Hopefully with the 400M series NVIDIA will be able to give us everything: better performance, DX11 support, and better battery life thanks to Optimus.

At these minimal settings, the GeForce GT 330M in the ID49C is able to post playable framerates in every game in our suite with room to spare. In most cases the test units appear to be CPU limited, though the GeForce G 310M and Mobility Radeon HD 5470 are so anemic to begin with that even at minimal settings they threaten to bog down gaming performance. The exception is StarCraft II, where for whatever reason the Radeons seem to take it to the GeForces. The only Radeon not posting framerates over 80 is the HD 5650 in the Toshiba A660D, hamstrung by a slow Phenom II in a game that is traditionally CPU limited even at the highest settings. Also somewhat interesting is that the ID49C manages to best the GT 335M in the N82Jv in several games, despite sharing the same CPU; as we'll see, once we up the quality settings the GT 335M is able to take the lead.


Once we start to ratchet up graphics settings, our testing suite starts to separate the men from the boys, and it's here that the GeForce GT 330M unfortunately exposes the weak link in NVIDIA's current (and thankfully soon-to-be-retired) mobile lineup. The GT 330M is extremely common, but it consistently loses to the Mobility Radeon HD 4650. The 4650 is an old chip; when it launched it was an absolute powerhouse for its market segment, but it's been around for nearly two years, and NVIDIA is only just now getting around to answering it with the 400M series. Even the GT 335M has problems with it.

Taken in a vacuum, the 330M is able to produce playable framerates in every game we tested it in, but it doesn't have much headroom. Its bigger brother, the 335M, and AMD's newer Radeon HD 5650 don't come out of this looking particularly good either. The 5650 proves to be an incremental upgrade over the 4650 at best, although it's important to keep in mind that the A660D is CPU-limiting it and, worse, the 5650 in that unit is clocked 100MHz below spec. The 335M, meanwhile, runs a 100MHz slower clock but has 72 shaders to the 330M's 48, yet it still barely holds a lead over the 330M, likely because both parts have the same memory bandwidth.
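To put that last point in perspective, here's a minimal sketch of how peak memory bandwidth falls out of bus width and effective memory clock. The 128-bit bus comes from the spec discussion above; the 1580MHz effective GDDR3 clock is an assumed, illustrative figure rather than a measured value from either notebook.

    # Minimal sketch: peak memory bandwidth from bus width and effective memory clock.
    # The 1580MHz effective clock is an illustrative assumption, not a measured spec.
    def memory_bandwidth_gbps(bus_width_bits, effective_clock_mhz):
        bytes_per_transfer = bus_width_bits / 8          # 128-bit bus -> 16 bytes per transfer
        transfers_per_sec = effective_clock_mhz * 1e6    # effective (post-DDR) transfer rate
        return bytes_per_transfer * transfers_per_sec / 1e9

    # Both the GT 330M and GT 335M pair a 128-bit bus with similarly clocked memory,
    # so their peak bandwidth works out the same despite the 48 vs. 72 shader gap.
    print(round(memory_bandwidth_gbps(128, 1580), 1), "GB/s")  # ~25.3 GB/s

Under that assumption either card tops out around 25GB/s, which would explain why the 335M's extra shaders don't buy it a bigger lead at these settings.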

Comments

  • Dustin Sklavos - Thursday, September 23, 2010 - link

    I'm not sure I have such a low opinion of the average consumer that I would assume they'd have to buy this thing just because it has a glowing touchpad. While I do like some of the styling (it's nice to see aluminum on a notebook at this price point), I take issue with the fact that rather than choosing to invest in putting together a more well-rounded machine, Gateway whiffed and just gave us a crappy screen and a touchpad that lights up.

    There was potential here. Dedicated volume controls are common from most manufacturers; they didn't need to replace useful document navigation keys with them. Instead, they somehow managed to make a bad keyboard worse (and a regular consumer checking out units on the shelf may very well test the keyboard), and again, burned their budget making the touchpad light up instead of improving something...ANYTHING else.
  • AnnonymousCoward - Saturday, September 25, 2010 - link

    $849 for this POS? Negativity was not overzealous. Your 1st paragraph isn't supported by the rest. In the rapist rapper's voice, Welllll, obviously, most people won't notice the shitty screeeen, and the crappy keyboard. He's climbin in your windows...

    I would never pay $800 for 1366x768 and a crappy keyboard, even if there's a quantum CPU with data crystals inside.
  • Minion4Hire - Monday, September 27, 2010 - link

    I wouldn't expect you to buy this. As I said, this model of laptop is not designed to target AnandTech readers in the least. But its target audience is known to lower their resolution (while remaining entirely ignorant of aspect ratio) in order to get larger text, so 1366x768 isn't a problem in the least. As for the "crappy" keyboard, while it does flex under pressure, I think the key layout is acceptable, and unless you pound your keyboard while typing you'll never notice said flexing; it takes a decent amount of force (more than any typist would use) to cause the keyboard to bow.

    Even Dustin admitted that its "pricetag is justifiable". It's not a great laptop, but the flaws that we see often do not exist in the eyes of the consumer, either because they don't care (or don't know better) or because they view said flaws as positives (i.e. the 1366x768 resolution), so it's all very relative.
  • AnnonymousCoward - Monday, September 27, 2010 - link

    There's no question that this laptop, like any crappy product, is acceptable to the average consumer. When it comes to average/bad products, I'm sure you'd agree that AnandTech should lean zealously negative. When poor design choices are made that affect things that AnandTech readers care about, it should be a big deal.
  • andrepang - Thursday, September 23, 2010 - link

    Not too sure if you guys have noticed, but this particular Gateway notebook has a very similar physical design to Acer's TimelineX 4820TG....

    Looking at the side ports, DVD tray, and even the back cover, plus the battery's shape, they look the same... and of course not forgetting the keyboard...

    I wonder if it's a design copy or if they're sourcing the design from the same OEM...

    Just my thoughts...
  • infodan - Thursday, September 23, 2010 - link

    Acer owns Gateway, so that's not a surprise, but in the US the Gateway brand is more popular, unlike in Europe (and especially the UK) where the Gateway brand is all but dead.
  • Roland00 - Thursday, September 23, 2010 - link

    The big differences between the two (besides looks):

    The Acer TimelineX either uses Intel i3/i5 integrated graphics or has an ATI HD 5650, while the Gateway ID series uses Intel i3/i5 integrated graphics on its cheaper models and NVIDIA Optimus with the GT 330M on its more expensive models (this is what Dustin reviewed).

    Also, the TimelineX comes with either a six-cell, 6000mAh battery (up to 8 hours in MobileMark with integrated graphics) or a nine-cell, 9000mAh battery (up to 11.5 hours in MobileMark with integrated graphics). The Gateway ID series comes with a six-cell 4400mAh battery (up to 6 hours in MobileMark with integrated graphics).

    So the TimelineX gives you a bigger battery with ATI (and the faster video card) while the Gateway gives you a smaller battery with NVIDIA Optimus.
  • Roland00 - Thursday, September 23, 2010 - link

    I have seen and operated one and it is a good laptop for the money.

    I just hate the keyboard, hate, hate, hate...

    One thing that wasn't mentioned by Dustin is that when you click the touchpad (which is one large button), the button actually depresses. As a person who loathes touchpads and always carries a mouse, I found this to be intuitive and better than most touchpads I have operated.
  • zoxo - Thursday, September 23, 2010 - link

    Seriously, how much extra would it cost to have a decent screen?
  • Pirks - Saturday, September 25, 2010 - link

    judging by MacBook Pro prices - about a grand extra

    forget about it, PC user
