
  • JonnyDough - Sunday, January 17, 2010 - link

    It's always good to see new cards rolling off the line that don't require me to open another coal-powered power plant in my town.
  • PR3ACH3R - Monday, January 25, 2010 - link

    The current situation with Anandtech ATI reports & coverage is absolutely absurd, & so disappointing,
    I do not even know where to start.

    It seems like nothing is done by pros anymore at Anandtech.

    From the endless 57XX driver bugs, to the flaky, incomplete, and undocumented DXVA features,
    to the high DPC usage in anything not 3D/DXVA,
    all the way to the poorest 2D performance ever seen on the PC (this is not an exaggerated comment), NOTHING is discovered by Anandtech.

    You have become a commercial, biased, & unprofessional, overrated site.

    So there, I have done the work for you,
    go check these issues & let's see when you will get the staff professional enough to analyze or even notice all the above.
  • 529th - Friday, January 15, 2010 - link

    According to Benchmark Reviews the 4670 idles at 9W. The way they arrive at this figure is to boot the PC without the video card and run the PSU cord through a Kill-A-Watt EZ P4460 wall-socket meter that reads the wattage draw. They take that number as the baseline system idle power, then measure again at idle with the card installed for comparison, and finally run some 3D titles for the total draw.
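    For what it's worth, the subtraction method described above is easy to sketch in code; the wattage numbers below are hypothetical placeholders for illustration, not Benchmark Reviews' actual readings:

    ```python
    # Wall-power delta method for estimating a video card's power draw.
    # All readings here are hypothetical examples, not real measurements.

    def card_power(system_with_card_w, system_without_card_w):
        """Card draw = wall reading with the card minus the baseline without it."""
        return system_with_card_w - system_without_card_w

    idle_delta = card_power(98.0, 89.0)   # e.g. 98W idle with card, 89W without
    load_delta = card_power(148.0, 89.0)  # e.g. 148W while running a 3D title

    print(f"Idle card draw: {idle_delta:.0f}W")  # 9W, matching the quoted figure
    print(f"Load card draw: {load_delta:.0f}W")
    ```

    One caveat: the Kill-A-Watt reads AC at the wall, so the delta also includes PSU conversion losses and slightly overstates the card's true DC draw.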
  • 7Enigma - Friday, January 15, 2010 - link

    There is no marketing or business decision to continue producing the 4850. It's more expensive to make than the reviewed card, and it's making the new cards look like crap. I personally believe the Far Cry 2 data is not correct (it makes no sense), but in everything else the 4850 is significantly faster.

    I'm actually surprised AMD would be stupid enough to continue producing (not just selling out of existing stock) the 4850...they are shooting themselves in the foot and making their "new" product lineup underwhelming.

    Hint, hint, if you are in the market for a card in this price range and don't care about power requirements or small size (ie HTPC), get the 4850 NOW. I can't imagine it will be around next month unless AMD is completely clueless (which I believe they are not).
  • peakchua - Friday, May 07, 2010 - link

    Hey, I'm a noob on GPUs :) Is the 4850 better than the 5670? I own an iMac :) with a Mobility 4850. If Apple upgraded to a Mobility 5750, would it be considerably faster? I tried asking Apple but as usual they never reply :)
  • JimmiG - Friday, January 15, 2010 - link

    It seems not a single card in the 5-series brings you more performance at a particular price point. There's always a card from the 4-series that beats the 5-series card and costs less. Why this trade-off between performance and features? It's either a slower card with more features or a faster card with fewer features...

    This is completely unlike the 4-series, which revolutionized performance at every price point.

    Guess things will change in, oh, about a year, when Fermi-derived cards are out at all price points...
  • marc1000 - Friday, January 15, 2010 - link

    If we can expect the same "downsizing" that Nvidia did for the GT200, then there will be low-end Fermis only in 2013....
  • rjc - Friday, January 15, 2010 - link

    On the first page of the article it says launch volume will be around 50k units and that this is expected to be sufficient.

    Is that figure for the US only? If it's for the whole world, it works out to about one card each for all the retail stores that sell graphics cards. Even with the price set as high as it is, you would think a much greater supply is needed.

    From here:
    The market for graphics cards is about 20m units per quarter... this card is supposed to be in the mainstream segment, so you would think it would sell in the millions.
  • ChoadNamath - Thursday, January 14, 2010 - link

    How is the load power 63W higher than idle when the TDP is supposed to be only 61W? It sounds like something is funky with your review sample, or AMD's numbers are just wrong.
  • Iketh - Wednesday, February 03, 2010 - link

    Hey choad, let's bash AnandTech/AMD on the basis of your ignorance! YEA~!~! It's better to just ask why it's 63W than to assume.
  • Spoelie - Friday, January 15, 2010 - link

    Because the GPU can never truly be isolated; the CPU, memory, and buses need to perform some work too to keep the GPU fed with data and instructions to process.
  • Slaimus - Thursday, January 14, 2010 - link

    It was not too long ago that the GeForce 6200 debuted at $150. Low-end gaming cards are slowly picking up in price again.
  • dagamer34 - Thursday, January 14, 2010 - link

    When do the low-profile 5650/5670 cards come out? I've been waiting for one for my HTPC to bitstream the Blu-ray HD codecs.
  • SmCaudata - Thursday, January 14, 2010 - link

    Unless you already have an HTPC, why would anyone get this card? If building a new HTPC you could get a Clarkdale to bitstream the audio codecs.

    Also... why do we care if it is bitstreamed? I have a receiver that can decode this, but it doesn't matter if the digital information is converted to PCM before or after the HDMI cable. The only advantage is to see those lights on the front of my receiver...
  • papapapapapapapababy - Thursday, January 14, 2010 - link

    future what? dx11 at 5fps? no thanks ati, remember the 4770? that was a good sub $100 card, (thanks) this crap is overpriced, $45 or bust.

  • TheManY2K3 - Thursday, January 14, 2010 - link


    None of the application results at 12x10 include data for the 8800GT; however, you are comparing the 8800GT to the HD5670 in most applications.

    Could you include the 8800GT in the 12x10 data, so that we can accurately gauge the performance of the HD5670?
  • Ryan Smith - Thursday, January 14, 2010 - link

    The 8800 GT data was originally collected for past articles, where we started at 16x10. The 8800 GT isn't part of my collection (it's Anand's) so I wasn't able to get 12x10 data in time for this article.
  • silverblue - Thursday, January 14, 2010 - link

    It's probably fair to point out that, in most tests, the 5670 is very close to the 8800, and as such listing it may not mean anything. However, the 1280x1024 tests are also without AA - it might be nice to see the effect of turning AA on with this oldie but goodie as compared to the more modern competition, so including it may make sense. You may think that the higher core clock of the 5670 would give it an advantage without AA but if it goes anything like Batman, this would probably be an incorrect assumption as well.
  • pjladyfox - Thursday, January 14, 2010 - link

    Last I looked, ANY Radeon card with an x5xx, x6xx, or x7xx model number was denoted as a mainstream card, which is clearly noted here:

    By that definition, these cards were designed to run in systems with 350 to 400W power supplies, support HD-quality video, and handle games at resolutions no higher than 1440x900 at medium quality settings with 2x AA and 8x anisotropic filtering. Testing them at settings most people will never run these cards at makes the results for the most part worthless.

    I mean who cares how these cards run at 1920x1200 at high detail settings since we already know they're going to fail anyway? I'm more interested in how these run with all the details on at say 1440x900 or possibly 1680x1050 which are the more common widescreen monitors most people have.

    For that matter, where are the details about how these cards compare running HD-quality video, whether the fan speed can be controlled via SpeedFan, or whether they have fixed some of the video quality issues like black crush when outputting via HDMI?
  • Ryan Smith - Thursday, January 14, 2010 - link

    We traditionally run every game at 3 resolutions, particularly since some games are more GPU-intensive than others. Based on the 12x10 performance, I decided that 10x7 would be silly and instead went up a level to 19x12 - a few games were actually playable even at those high resolutions.

    16x10 is accounted for, and 12x10 is close enough to 14x9 that the results are practically the same.

    HD Video: All the 5000 series cards, except perhaps Cedar, are going to be exactly the same.

    Fan speed: Can be modified (I use it to cool down the cards before extraction after running FurMark)

    Black Crush: I honestly have no idea
  • ereavis - Thursday, January 14, 2010 - link

    Hybrid Crossfire review please! This generation is supposed to support it (according to the IGP review).
  • ZipSpeed - Thursday, January 14, 2010 - link

    Looks like a worthy replacement for the 4670 in my HTPC. It will still have some issues with certain games at 1080p, but since I play mostly Source games on my HTPC, it should smooth out some frame rate problems I get with my 4670.
  • DigitalFreak - Thursday, January 14, 2010 - link

    Not showing up on Newegg yet...
  • BelardA - Thursday, January 14, 2010 - link

    Uh... Newegg has 7 5670s to choose from for $100 (512mb) to $120 (1GB). You posted your response 4 hours ago.

    Stupid to spend $120 for such a card. $125 (after rebate) ~ $140 gets the easily faster 5750.
  • DigitalFreak - Thursday, January 14, 2010 - link

    ... and at the time there weren't any 5670s on Newegg.
  • BelardA - Friday, January 15, 2010 - link

    Oh, I know. That's why I said it was "4 hours" after your posting. It wasn't a slam... just showing how much things can change in a few hours. :0
  • Sureshot324 - Thursday, January 14, 2010 - link

    Wow, modern high end cards are around 3 times as fast as my 8800gt, yet I can still play pretty much any game at max settings at 1680x1050.
  • DigitalFreak - Thursday, January 14, 2010 - link

    You can thank consoles for that. They're the least common denominator now, and a big reason why game graphics aren't moving forward as fast as they used to.
  • hadphild - Thursday, January 14, 2010 - link

    Can anyone confirm being able to run a 6-screen configuration with 2 x 5XXX cards yet?

    If it ran with 2 x 5670 cards then I would be happy. (I am not running Crysis with this config.) I just want to run a custom OpenGL app.

  • Spoelie - Thursday, January 14, 2010 - link

    Eyefinity while CrossFired is not supported as of yet.
  • krumme - Thursday, January 14, 2010 - link

    KO 1 round
  • Shadowmaster625 - Thursday, January 14, 2010 - link

    Where are the 4770 and 9800GT??? A lot of the data makes me skeptical because it doesn't mesh with what I'm seeing on other sites. I'm more inclined to trust the other reviews because they used common sense and compared this card to more cards in its own price range. Common sense is your friend. Notice how close the 5670 and 5750 are in terms of load power. That doesn't make sense at all.
  • Ryan Smith - Thursday, January 14, 2010 - link

    The 9800 GT is the same as an 8800 GT. As for the 4770 it's here too, although I don't have any 19x12 data for it since that resolution was a last-minute decision (I only had 30 hours or so with the 5670).
  • ereavis - Thursday, January 14, 2010 - link

    Common sense dictates that if a 4770 performs like a 4850 clone in most cases, you leave it out to avoid chart saturation. Same for the 9800GT: the GTS 250 is a rebranded 9800GT with near-identical performance.
    Price point? The 9800GT and 4770 are next to unavailable, so what price should be used for something that can't be bought (in the near future, when the 5670 is all that's left on the ATI side and the GTS 250 for Nvidia, this review will still be sitting on this site)?

    How can a power meter's measured numbers "not make sense"? A much better-performing 5750 at near-5670 power usage just means inefficiency at the 5670's performance level, which is common for lower-performance parts on a similar fab process.

    Explanation of the above would be a bonus, but hardly required, great review.
  • daytrader7 - Thursday, January 14, 2010 - link

    4850's for 99 bucks a pair?

    Methinks not.
  • silverblue - Thursday, January 14, 2010 - link

    No, I think he meant that he'd seen a couple of instances where a 4850 was priced as low as $99, not that it costs $99 for a pair.

    If it only cost that, hell I'd import two myself.
  • daytrader7 - Thursday, January 14, 2010 - link


    2 different 4850's available @ 99.00 each.

    Misinterpretation on my part.

  • Zool - Thursday, January 14, 2010 - link

    Also want to note that without real Nvidia competition, it seems enough for AMD to just beat the weak Nvidia lineup, and that's all.
    Eyefinity is just plain stupid on this level of card for games. Real DX11 games at these performance levels are questionable at best too.
    Both the 5700 and 5600 are VERY weak for the price they sell at. I wouldn't recommend them to anyone who owns a 4800 or 4600 series card, except in case they need a new card for a new machine.
    The feature set of a new generation of cards, including audio bitstreaming, should be a MUST and not a priced upgrade built into the card's cost. I am quite disappointed with AMD's 5600 and 5700 series cards. The only real new cards are the Radeon 5800s.
  • MadMan007 - Thursday, January 14, 2010 - link

    I agree with Zool. The 5800s are the only true advancement among this 5000 series. They actually improved performance and price/performance (well, until pricing got out of hand because of supply) over their equivalent lineup predecessors and against their competition from NV. The 5700s and apparently 5600s are just spinning wheels performance-wise: they are feature upgrades not performance upgrades. That makes them mildly disappointing and not an easy purchase decision.
  • marc1000 - Friday, January 15, 2010 - link

    I had a 3850. Buying a 5770 was a fairly easy decision for me. I wanted the 5850 in fact, but to buy that one I would need to change my PSU too, and that would be too expensive. So the 5770 it was! And I'm pretty glad with the performance. Remember that not everyone has the latest board from the previous generation. ATI/AMD is doing a great job.
  • Calin - Thursday, January 14, 2010 - link

    These lower-end series are not intended to run high-resolution monitors in "heavy" games at performance settings. For that, there is the 5800 series.
    The 5600 series seems OK for every game at 19" resolutions and lowered quality, which makes it perfect for many people. It's a huge step up from integrated graphics :)
  • BelardA - Thursday, January 14, 2010 - link

    How many people actually buy 19" displays anymore? Widescreen isn't like the older 4:3 screens, so a 19" LCD is kind of small.

    At $125~150, there isn't much reason to NOT get a 20~21" class monitor.

    While the 5600s are a bit on the slow side, there is a NEED for low-end graphics cards that meet some standards, and having an entire product line support DX11 is still a good thing.

    Once the price of the 5670 gets down to $75 it will be a good value card. But not at $100~120, which is the current price on Newegg. And remember, many people don't have the PSUs (or budget) to support a 5700 series card. I think once 40nm manufacturing matures at TSMC, the pricing will come down more.

    As an owner of a 4670, the 5670 is easily a faster card... but I believe AMD screwed up. The $100 4770 was almost on par with the 4850 and easily faster than the 4830. There is NO reason for the smaller-die 5670 to be ANY slower than the 4770. That is ALL the 5670 needed to be. But then again, the $135 (today) 5750 is starting to be consistently faster than the 4850 (good).

    SO the real problem is pricing. If the $100 5670 were almost as fast as the $135 5750, there would be no need for the 5750. Also, other than PSU requirements, it would be stupid to spend $120 for a 1GB 5670 when the 5750 is $15 more and almost twice the performance.
  • Zool - Thursday, January 14, 2010 - link

    "They are a huge step up from integrated graphics :)"
    Price-wise the 5670 is a huge step up from integrated graphics too. I was mainly comparing the 5xxx and 4xxx series, and that's almost a zero jump.
  • Zool - Thursday, January 14, 2010 - link

    But of course, why should AMD compete with itself when it still beats everything that Nvidia has on price?
    I think I will skip this generation too and wait for the 6K cards.
  • Zool - Thursday, January 14, 2010 - link

    The 5700 cards are on the same level as the 4800 cards, and the 5600 cards are very close to the 4600 cards. Now if you enable DX11 in games you will see performance way below both the 4800 and 4600 for both DX11 card lines against their counterparts. That's downgrading, not upgrading.
    And the X700 vs X800 series naming trick and price-range change is quite disturbing too.
  • Zool - Thursday, January 14, 2010 - link

    How can game developers make better-looking games when performance/price sits at the same level with each generation? DX11 is very taxing if you want to do it properly. Those fancy new effects and post-processing with DirectCompute just eat much more shader power and bandwidth. Performance-wise, 4800 owners can only upgrade to 5800 cards (DX11 speed on the 5700 is very weak), and that price level is another category.

    But that can happen when your only competition is rebranding a 2006 card architecture because the GT200 was overdesigned. The disturbing part is that Nvidia can't learn from its mistakes and has made another giant chip a second time with the GT300, which this time is even late :).
  • Zool - Thursday, January 14, 2010 - link

    It's quite strange that they downgraded the 5670's TMUs from 32 to 20. With 60+ GB/s of bandwidth, 32 TMUs could be much more useful than they were with the 4670's bandwidth. All games have used multitexturing to some degree for quite some time.
  • Spoelie - Thursday, January 14, 2010 - link

    Far Cry 2: the text states that the 5670 and the 4850 have the same amount of memory and that the 5670 beats the 4850.

    However, looking at the test setup, the 5670 is the 1GB version and the 4850 is the 512MB version, and the test results support this. The gap between the 4850 and the 4870 is *way* too big for the 4850 not to be memory-size constrained.

    As such, the only reason the 5670 "beats" the 4850 in this test is the memory size, and the supporting text is wrong.
  • Ryan Smith - Thursday, January 14, 2010 - link

    The 5670 is 512MB.

    The facts have been corrected to fit with reality.
  • Spoelie - Thursday, January 14, 2010 - link

    Hmmm, OK, then the Far Cry 2 results are a bit peculiar. The 4850 has the same amount of memory but more of everything else, and is 25% slower. The performance of the 5670 seems to fall in line with its compute resources, as if it doesn't have a memory bottleneck. That made me think you had a 1GB card. My apologies.
  • Spoelie - Thursday, January 14, 2010 - link

    Also, the conclusion that the radeon really pulls away from the other value cards at higher resolutions (hawx) might be an artifact of the differing memory sizes.

    (need edit button!)
  • kmmatney - Thursday, January 14, 2010 - link

    I bought a $99 HD4830 more than a year ago, and it is much faster than this, especially when overclocked (it had a lot of OC headroom and performs a little faster than an HD4850). Sad that the same amount of money a year later gets you a slower card.
  • Griswold - Thursday, January 14, 2010 - link

    Your 4830 is a partially defective 4850; that's what made it a nice value until the 4770 arrived (despite low availability then). You will have to wait for the already-rumored 5830 to get the same feeling again...
  • BelardA - Thursday, January 14, 2010 - link

    What made the 4670 an exciting card well over a year ago was that it was under $100 when it launched ($80 avg) and it was almost as fast as the 3870, sometimes faster (as drivers matured). So when looking at some of these benchmarks that DON'T have the 4670, just look at the 3870 and count it the same. So at $80, it had replaced the $200~150 3870 and ran cooler, etc.

    Anyway, the 5670 SHOULD have at least equaled the 4770 in performance! That would make the 5670 a very good value gaming card for the $90~100 price range. You can get 4770s for about $95~110 (until gone).

    Hopefully in the coming months, the prices will start to come down naturally. But AMD should have a $100 card that *IS* equal to the 4850. Perhaps that would be a 5730 card, but its power draw should still be under 75 watts under load.

    Until Nvidia comes out with something competitive, AMD has little reason to load the prices... ha, notice how things have changed? :)

    Ideal pricing by March/April.
    5870 = $350 (Today = $400~440)
    5850 = $225 (Today = $300~340)
    5830 = $175 * hey, there was a 4380, why not?
    5770 = $125 (Today = $155~200 for 1GB)
    5750 = $110 (Today = $135~150 for 1GB)
    5730 = $ 95 * hey, there was a 4380, why not?
    5670 = $ 75 * Its cheaper to make than a 4670.
    5650 = $ 60
    5550 = $ 55 * Cause 555 looks cool.
    5450 = $ 40
    5350 = $ 30 * Office PCs that need DVI... 4350s are $25.

    With such a lineup, the entire 4000 series can go. There are still 3600s and 2400/2600s on the market, usually low-profile or AGP.

    In the meantime, Nvidia will still be selling 9600 / 9800 / GT1xx / G/gt 2xx for another 1-3 years... ugh.
  • BelardA - Thursday, January 14, 2010 - link

    OOPS! I typoed

    I meant to say "4830", not "4380".... doh!
  • Drazick - Thursday, January 14, 2010 - link

    What about some OpenCL / DirectCompute tests?

    No games, just pure calculations?

  • haplo602 - Thursday, January 14, 2010 - link

    Hmm, seems this card is a bit short of my needs... a performance level around the HD4770 would be great.
  • Obsy - Thursday, January 14, 2010 - link

    Idle and Load Power charts say "NVIDIA GeForce 4870 X2" ;)
  • MadMan007 - Thursday, January 14, 2010 - link

    Nice, thorough review. I'd be interested in some more results with lesser or no AA as well, though. While we all love AA, it's kind of silly to expect to run it well at 1920x1200, or sometimes even 1680x1050, on <$100 cards. Plus it would give those who keep cards for a long time and just turn down features such as AA a better comparison.
  • ET - Thursday, January 14, 2010 - link

    While I already replaced my 3870 with a 5750, it's nice to see a sub-$100 card that beats it in all cases. I'm glad ATI went with 128bit GDDR5 for this card.
  • Slaimus - Thursday, January 14, 2010 - link

    Another nice thing about it is that it makes an inexpensive triple-head card that does not need external power, even if one of the heads needs to be DisplayPort. Even single-link HDMI/DVI can still support 1920x1200.
  • SlyNine - Thursday, January 14, 2010 - link

    Is the 4670 you have in your comparison charts a 4670 with DDR3 or DDR2?

    Also, any news on the mobile version of this card, or the 5830 I've seen in a lot of notebooks?
  • Ryan Smith - Thursday, January 14, 2010 - link


    And I don't have any news on the mobile 5000 series. Nor is there a 5830 that I have been informed of.
  • Blahman - Thursday, January 14, 2010 - link

    5830 is in the HP Envy 15.
  • Ryan Smith - Thursday, January 14, 2010 - link

    If it's in a laptop, then it's going to be a neutered Juniper. The Mobility series is always a part down, so a Mobility 5800 series part would be Juniper based.
  • SlyNine - Thursday, January 14, 2010 - link

    I'm sure I've seen it in 2, including the Envy 15. Perhaps I shouldn't have said "a lot".

    But it would be good to have a review comparing the mobile solutions out there. Not to mention the throttling problems in some notebooks.

    I'd love to see AnandTech do a review of the problems the Dell XPS 16 w/ Core i7 has. On A/C, and only on A/C, it cuts the multiplier to 7 and then uses clock modulation. Clock modulation tells the CPU to only do work on certain cycles, so you can have as many as 75% of your CPU cycles going to waste.

    The end result is a Dell 1645 Core i7 running at the equivalent of 300MHz.
    More info at this forum:

    Full story here, and just for the record, I'd be willing to let AnandTech borrow my 1645 to test if Dell doesn't fix it with this next BIOS update, which I don't see how they can: a 90-watt AC adapter is simply not enough.
  • WT - Thursday, January 14, 2010 - link

    I read through that thread yesterday. We support 50+ Dell e6500 laptops that have been problematic in other ways besides throttling, but it was nevertheless interesting to read and pass along to my fellow IT co-workers.
  • JarredWalton - Thursday, January 14, 2010 - link

    I've looked at the thread and sent Dell an email asking for comment. It's important to remember power supply (power brick in this case) efficiency, so if the brick can output 90W and it's only 75% efficient (which is probably higher than what it really achieves), power draw at the wall of up to 120W might be achievable without the need to throttle. So, it's possible that a BIOS update will indeed address the problem, but let's not jump to any conclusions just yet....
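
    That envelope math is easy to check; note the 75% figure above is a stated assumption, not a measured efficiency:

    ```python
    # AC wall draw implied by a power brick's DC output at a given efficiency.
    # The 75% efficiency is an assumed figure for illustration, per the comment.

    def wall_draw(dc_output_w, efficiency):
        """AC power pulled from the wall to deliver dc_output_w downstream."""
        return dc_output_w / efficiency

    print(f"{wall_draw(90, 0.75):.0f}W at the wall")  # 120W for a 90W brick at 75%
    ```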

    I'd also say that if you're using FurMark to achieve the throttling, find something else instead. FurMark really pushes the envelope and many consider it a power virus. I understand others are saying it occurs with regular games, which is obviously a much bigger issue than with a test program that doesn't represent a real-world workload.

    Anyway, if you really want to send us the laptop for testing, why not do the testing yourself and use that as the basis for an audition to AnandTech? If you go that route, I would make sure you really investigate when throttling does and doesn't occur, look at the various power profiles and try tweaking those, etc.

    As a side note: with Win7 I noticed on at least one laptop that using the "passive" cooling profile caused video playback to stutter, and setting it to "active" fixed the problem. There are so many variables that you can never know 100% what might be causing a particular problem.
  • SlyNine - Sunday, January 17, 2010 - link

    Jarred, thanks, I'm going to take you up on that, and currently I'm doing a write-up on the XPS 1645 w/ RGB. I would love any suggestions, or if you would like me to include anything please send it to with the subject: XPS 1645. If anyone knows any tools other than ThrottleStop to monitor the CPU modulation, that would also be helpful.
  • SlyNine - Thursday, January 14, 2010 - link

    Yeah, I don't use FurMark at all; in fact I made a post recommending they not use it.

    With just UT3 and nothing else going on, the multiplier hits as low as 7, and with the brightness up halfway the modulation kicks in, bringing the CPU down to 25%; that's only 25 cycles out of every 100 willing to do anything. Even just doing a Prime95 run the multiplier is below 10. Correct me if I'm wrong, but isn't it supposed to be around 13?

    But thanks a TON, Jarred, for acknowledging this. If a high-profile site like AnandTech did a story on it, I'd imagine Dell would have to respond. Really, this is an amazing laptop otherwise (other than this line I have through my screen, but obviously that is covered by warranty).
  • JarredWalton - Sunday, January 17, 2010 - link

    As far as the CPU multiplier: if you have the i7-720QM, the normal multiplier is 12x (133MHz bus * 12 = 1.6GHz). For the i7-820QM the stock multiplier is 13x (1.73GHz). Maximum Turbo on the 720QM is 2.80GHz, so you could potentially see a 21x multiplier, while on the 820QM the maximum Turbo is 3.066GHz, so you'd see up to a 23x multiplier. I don't know if ThrottleStop tells you max and min multipliers or not, but you could even run CPU-Z and just watch to see if the multiplier is changing a lot.
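
    Those clock figures all follow from the ~133MHz base clock; a quick sketch using the multipliers stated above:

    ```python
    # Mobile Core i7 (Nehalem) core clock = base clock * multiplier.
    BCLK_MHZ = 133.33  # nominal base clock

    def core_clock_ghz(multiplier):
        return BCLK_MHZ * multiplier / 1000.0

    # i7-720QM: 12x stock, up to 21x single-core Turbo
    print(f"720QM stock: {core_clock_ghz(12):.2f} GHz")  # ~1.60 GHz
    print(f"720QM turbo: {core_clock_ghz(21):.2f} GHz")  # ~2.80 GHz
    # i7-820QM: 13x stock, up to 23x Turbo
    print(f"820QM stock: {core_clock_ghz(13):.2f} GHz")  # ~1.73 GHz
    print(f"820QM turbo: {core_clock_ghz(23):.2f} GHz")  # ~3.07 GHz
    ```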
  • SlyNine - Tuesday, January 19, 2010 - link

    Yeah, I have been watching a few programs including ThrottleStop, RealTemp, RealTemp GT, and i7 Turbo. They all show the max multiplier at 7-9 when gaming under load; even with an external monitor hooked up and this screen off it doesn't go past 10. It's worth noting that with the screen brightness turned down and a CPU-only load they stay at 12, but turn the brightness up and your multiplier falls to 8.

    The biggest problem is the clock modulation, which I'm trying to test. But it definitely correlates with real-world performance: while Task Manager may show the CPU at 100%, ThrottleStop reports a 75% reduction in CPU usage. This also correlates with the delta between the CPU usage Task Manager indicates and the C0-state percentage that programs like i7 Turbo and RealTemp show. Task Manager will show 100% while the C0% is at 25%, indicating a 75% reduction while under load.

    Perhaps ThrottleStop just measures the difference between the C0% and what the OS reports.

    I've custom-set all the settings in the advanced power options to be the same on and off battery. When you unplug, the system runs a great deal faster, albeit at the risk of harming the battery. I've disabled SpeedStep as well, with no difference.

    Excel isn't my strong suit (basically I'm going to have to relearn how to use it), but I'm trying to correlate frame rate with the indicated clock modulation. I'm unsure how to record a timeline of FPS, though. It does appear that the FPS reporting is accurate when the clock modulation kicks in.
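
    As a rough illustration of how a dropped multiplier and clock modulation compound, here is the kind of back-of-the-envelope estimate being described (the 25% duty figure is the C0 residency reported above; everything else is approximate):

    ```python
    # Effective CPU throughput when the multiplier drops AND clock modulation
    # gates the duty cycle. Inputs are approximations from the reports above.
    BCLK_MHZ = 133.33

    def effective_mhz(multiplier, c0_duty):
        """Approximate useful clock: raw clock scaled by the C0 duty cycle."""
        return BCLK_MHZ * multiplier * c0_duty

    # Throttled case reported: 7x multiplier with 25% duty
    throttled = effective_mhz(7, 0.25)
    # Nominal 720QM: 12x multiplier at full duty
    nominal = effective_mhz(12, 1.0)
    print(f"Throttled: ~{throttled:.0f} MHz effective")  # ~233 MHz
    print(f"Nominal:   ~{nominal:.0f} MHz")
    ```

    ~233MHz effective is in the same ballpark as the "equivalent of 300MHz" figure quoted earlier in the thread.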
  • satish2685 - Monday, April 01, 2013 - link

    Hi, I would like to purchase an entry-level 1GB DDR3 Asus GeForce HD5450 graphics card, but considering the power requirements, I only have a 250W PSU. Is it OK to buy a graphics card that requires a minimum of 400W and connect it to my existing MB, or do I need to upgrade my PSU? Advice required. If so, are there any consequences I could face in the future?
