Power Consumption

[Charts: Idle Power and Load Power]

56 Comments

  • mczak - Thursday, October 23, 2008 - link

    While the text says 1 SIMD disabled (which is clearly wrong), it seems AMD sent out review samples with 3 SIMDs disabled instead of 2, hence the review samples being slower than they should be (http://www.techpowerup.com/articles/other/155). So did you also test such a card?
  • 7Enigma - Thursday, October 23, 2008 - link

    Where is the Assassin's Creed data? Where is the broken line graph for The Witcher showing performance at different resolutions? I've commented on the data analysis in your last several articles and frequently dislike the resolution chosen for the horizontal bar graph, but at least you had the broken line graph to compare with.

    With the vast majority of people using 17-19" LCDs at 1280x1024 (especially in the price range of the card being reviewed), it seems strange to me that the higher resolutions for 20-22" LCDs are the ones selected for the large bar graphs. I know the playability difference between 60fps and 70fps is moot, but the trend it shows is very important for those who do not plan on upgrading to a larger monitor and want to know which is the better card.

    For instance, the only data we see for The Witcher is at 20-22" resolutions. This single data point shows the AMD card 9% faster than the Nvidia at a just-playable (IMO) framerate. As that is likely the average framerate, a 9% difference could be huge when you get into a minimum-framerate situation. If I have a 17-19" monitor this data is worthless. Does the trend of AMD being 9% faster hold true at the lower resolution, or does the Nvidia card pull closer?

    And while I'm repeating myself from previous articles, I beg of you, PLEASE PLEASE PLEASE try to keep the colors of the bar graphs the SAME as in the broken line graphs. It is very frustrating to follow the wrong card from bar graph to line graph because the colors do not match up between them.

    Overall good review, there are just these nagging issues that would make the article great.
  • strikeback03 - Thursday, October 23, 2008 - link

    I only know two people using 1280x1024 LCDs, and both would consider $130 way too much to spend on any computer component. I'd guess there are more people these days using 17-19" widescreens at 1280x800 or 1440x900 than 1280x1024 screens, as these widescreens are what has been common in packages at B&M stores for a while now.
  • 7Enigma - Thursday, October 23, 2008 - link

    And your point is? Those resolutions you listed are right around the 1280x1024 resolution I was referring to. It's not the height/width of the monitor that matters with these cards, it is the overall pixel count, which can differ between them. A 1280x1024 screen uses 1.3 million pixels per frame. A 1440x900 uses almost exactly the same total, so you could directly compare the results unless the video card had some weird resizing issues. A 1280x800 uses 1.0 million, or about a quarter fewer, so the difference between the cards could be even larger than in my original example (see the pixel math sketched after the thread).
  • GlItCh017 - Thursday, October 23, 2008 - link

    For $130.00 I would not be disappointed with that card. Then consider that it can be somewhat cheaper on Newegg, maybe as low as $100.00, and you've got yourself a bargain. Overclock it a little, or even buy an OC version; not too shabby.
  • Butterbean - Thursday, October 23, 2008 - link

    I jumped to "Final Words" and boom - no info on 4830 but right into steering people to an Nvidia card. That's so Anandtech.
  • mikepers - Thursday, October 23, 2008 - link

    Actually, what Derek says is to shop around and get the best price between the two. If there is in fact a $20 to $30 difference, then get the 9800 GT. If not, then get the 4830.

    Performance is about the same, and right now you can get a 9800 GT for $100 after rebate (for $110, get it delivered with a copy of COD4 included).

    This doesn't even consider the power advantage of the 9800 GT. Assuming you leave your PC on all the time, that little 14-watt idle difference adds up. At the 20 cents per kWh I pay here in Long Island, NY, the 9800 GT would save me about $24.50 a year (the math is sketched after the thread). So personally, I would definitely get the 9800 GT unless I could find the 4830 for a decent amount less than the 9800 GT.
  • Spoelie - Thursday, October 23, 2008 - link

    Didn't really notice it on the first read-through, and most of the time I don't concur with bias allegations.

    However, it seems in this case the paragraphs could have been rearranged to 1-4-5-2-3-6 (no rewording necessary) and the conclusion would have said the same thing, only focusing a bit more obviously on the product at hand than on what a great deal the Nvidia card is.
  • DerekWilson - Friday, October 24, 2008 - link

    that's actually a good suggestion. done.
  • Hanners - Thursday, October 23, 2008 - link

    "Based on the information we know about the GPU, the 4830 is clearly just an RV770 with one SIMD disabled."

    Don't you mean two SIMD cores?
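
A quick sketch of the pixel math from 7Enigma's comment above, in Python; the resolutions are the ones cited in the thread, and the helper function is just illustrative:

    # Total pixels rendered per frame at each resolution mentioned in the thread.
    def pixels(width, height):
        return width * height

    for w, h in [(1280, 1024), (1440, 900), (1280, 800)]:
        print(f"{w}x{h}: {pixels(w, h) / 1e6:.2f} million pixels")

    # 1280x1024 -> 1.31M and 1440x900 -> 1.30M are nearly identical workloads,
    # while 1280x800 -> 1.02M is roughly a quarter lighter.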
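Similarly, mikepers' yearly-cost estimate checks out; a minimal sketch of the same calculation, using only the numbers from his comment:

    # Yearly cost of a constant 14 W extra idle draw at $0.20/kWh.
    idle_delta_watts = 14          # idle power difference cited in the comment
    price_per_kwh = 0.20           # USD, the Long Island rate cited in the comment
    hours_per_year = 24 * 365

    extra_kwh = idle_delta_watts / 1000 * hours_per_year   # ~122.6 kWh
    print(f"~${extra_kwh * price_per_kwh:.2f} per year")   # prints ~$24.53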
