
  • marsspirit123 - Sunday, May 31, 2009 - link

    For $160 less with the AMD 720 you can get 4890s in CF and beat the i7 for the same money in those games.
  • Royal13 - Saturday, April 11, 2009 - link

    Which system will perform better in games? I can't OC my E6600 past 3GHz with the boxed cooler, but I was hoping for at least 3.5GHz with the X3 720. Can that be done without any alternative cooling?
    The GFX card doesn't really matter. I have a 9800GX2, 3870X2, and 4870 at the moment.
    Or should I just go for the PII 940 instead? It costs around $100 more, but I don't have a lot to spend, so the CPU should hold up for 2 years, the same as my E6600 did.
    What do you think?
    I want to OC, less power usage, more like a standard internet PC but with good graphics, because I have a SyncMaster T260HD, so I'm forced to run games at 1920x1200.
  • 7Enigma - Tuesday, March 31, 2009 - link

    Guys, I love CoH. I'm currently playing OF and will probably be getting the latest expansion in a couple weeks but for the love of God please use DX10!

    You've used this same introduction for ages now:

    "In the meantime, we crank all the options up to their highest settings, enable AA at 2x, and run the game under DX9. The DX10 patch offers some improved visuals but with a premium penalty in frame rates."

    That premium penalty is what, exactly? 20%? 50%? In this test you have a MINIMUM frame rate of 45 with a single card at stock CPU frequencies, while the average is >110fps. That's like running at 800x600 resolution so you can have 400fps. I get that at some crazy-high resolution, or with very lopsided hardware (fast CPU with slow GPU or vice versa), you may have frame rate issues, but please, this is an RTS and not an FPS. If you want it to age better and actually stress the system properly (regardless of whether it looks tremendously better), use DX10.

    All you have to do is change the introduction to say, "In the meantime, we crank all the options up to their highest settings, enable AA at 2x, and run the game under DX10. DX9 offers virtually the same graphical experience minus some improved visuals but at a significantly increased framerate".

    Great article btw!
  • Plyro109 - Tuesday, March 31, 2009 - link

    Well, my memory isn't the best sometimes, but if I remember right, on my computer going from DX9 to DX10 in CoH resulted in about a 70% performance hit. That works out to going from about 120FPS to about 30-35 for the average, and on the minimum end, from 35 to about 10.
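    The arithmetic behind those estimates is simple enough to sketch. The ~70% penalty and frame rates below are the commenter's recollection, not measured benchmarks:

```python
def fps_after_penalty(fps: float, penalty: float) -> float:
    """Frame rate left after a fractional performance penalty."""
    return fps * (1.0 - penalty)

# Using the commenter's numbers: a 70% hit on DX9 frame rates
print(fps_after_penalty(120, 0.70))  # average: 120 fps -> ~36 fps
print(fps_after_penalty(35, 0.70))   # minimum: 35 fps -> ~10.5 fps
```

    A 36 fps average and ~10 fps minimum line up with the "30-35" and "about 10" figures quoted above.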

    I'd rather have the MUCH higher framerate than the SLIGHTLY improved visuals.
  • 7Enigma - Wednesday, April 01, 2009 - link

    So I did a bit of Googling and you are correct; it seems to be about a 70% performance penalty, which is significant. For gaming I agree with you, but for benchmarking I still think DX10 should be used, if only to stress the GPU/system more in line with the other titles used.

    I didn't realize it was so bad, since I'm gaming on a 19" LCD with everything cranked up to max with 4X AA... but I did just build my gaming rig in January with a 4870, so the combination of good hardware and low res is probably what hid the huge drop in framerate.
  • Niteowler - Monday, March 30, 2009 - link

    I had to make the decision between a Phenom 2 quad-core 940 or a Phenom 3 720 triple-core CPU about 3 weeks ago. I went with the 940 because it's a stronger performer across the board, and for one obvious reason that wasn't mentioned in this article... the cost of each system. Phenom 3 motherboards cost more in general, and DDR3 is certainly more expensive. There was only about a $10 to $15 difference between the two systems. It basically came down to whether I wanted DDR3 or an extra core more. Phenoms don't really take full advantage of DDR3 yet in any of the reviews I have read. The 720 is a decent performer in its own right, and I wouldn't have felt too bad buying one until the Black Edition quad-core Phenom 3's come out.
  • Visual - Monday, March 30, 2009 - link

    What you call "Phenom 3" is more properly "Phenom X3", or triple-core. You make it sound confusingly like a third generation or something.

    Also, it runs perfectly fine on AM2+ motherboards with DDR2. You are right that AM3 and DDR3 are more expensive, but that is not related to the CPU choice.

    Not that there's anything wrong with going for the quad 940 - I think it is worth the extra cash... just pointing out that it is indeed some extra cash, quite a bit more than the $10-$15 you state, compared to an X3 AM2+ setup.
  • XiZeL - Sunday, March 29, 2009 - link

    Great article... really makes you think twice before spending extra cash on an Intel rig.
  • tshen83 - Sunday, March 29, 2009 - link

    There used to be a time when reviewers would properly review a platform. It is funny to read articles that say "Phenom is a great gaming platform," because the so-called "equivalent gaming performance" compared to last-gen Intel Core 2 based CPUs is simply GPU bound, even with Crossfire.

    X3 processors are junk. They are broken POS chips that AMD couldn't sell unless one core was disabled. The question is why you would want to buy a triple-core 95W processor when you can buy a quad-core 75W ACP (95W TDP) Opteron 1352 for about $110 now on Newegg. Pay more and get fewer cores :)

    I can hardly recommend Phenoms when the Q8200 is a much better performer from a previous generation. Then again, the i7-920 simply trumps anything AMD has right now.

    If you are a casual gamer, even an Atom 330 on an Nvidia 9400M can be a great platform at only 20W total platform power consumption. AMD had better pick up their pathetic engineering effort and start doing some thinking, because the time bomb is ticking for them. I actually like Dirk Meyer, compared to that POS Hector Ruiz.
  • waffle911 - Sunday, March 29, 2009 - link

    I fail to see a single valid and substantiated argument in your poorly written post, other than the fact that the PII 720BE costs more than the Opty 1352, and has a higher TDP.

    The X3 is an X4 with one core disabled, either because it has a defect or because AMD needed to boost their quota of X3's to meet demand, which is actually becoming the more frequent of the two situations. Your arguments for "energy efficiency," if that is what you're arguing, are comparing apples to oranges. You have performance, or you have efficiency. But overall efficiency is also affected by how much data can be processed for each joule of energy expended, and in that vein the 720 still trumps the 1352, because it can process a given amount of information enough faster than the 1352 that it expends less energy overall despite a higher peak draw. And that 95W rating isn't very indicative of how much energy it will actually use for any given task, either (nor, for that matter, is the 75W rating of the Opteron).
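    That energy argument boils down to energy = power × time. A minimal sketch, using hypothetical wattage and runtime figures purely for illustration (not measured values for either chip):

```python
def joules_per_task(avg_watts: float, seconds_to_finish: float) -> float:
    # Total energy for one job = average power draw * time to finish it
    return avg_watts * seconds_to_finish

# Made-up numbers: a 95W chip that finishes in 100s
# vs. a 75W chip that needs 140s for the same job
faster_chip = joules_per_task(95, 100)   # 9500 J
slower_chip = joules_per_task(75, 140)   # 10500 J
print(faster_chip < slower_chip)  # True
```

    So a higher-TDP part can still expend less total energy per task, provided it finishes proportionally sooner.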

    PII 720:
    4000MHz HT
    L1 3x128kB
    L2 3x512kB
    L3 6MB

    Opt 1352:
    2000MHz HT
    L2 4x512kB
    L3 2MB

    You are definitely getting less processor for less money. Not only is the Opteron (a server/workstation processor!) using technology that's approaching 2 generations old, but price/performance wise the 720 is a better performer for most desktop applications. Not many (if any) games use 4 cores, so they benefit more from the added speed than the number of cores. Plus, you get an unlocked multiplier and actual room to overclock. For my $20 extra, I would gladly take a 720 clocked at 3.6 or a conservative 3.4 over an Opty 1352 at an optimistic 2.6, if you can get it to overclock at all. It's all about quality over quantity. I would rather have one BMW 335i over 2 Honda Civic Si's if I'm going to be the only one driving them. And it would only be me, because games only use 2 cores at best, not 4, and I can only drive one car at a time. You have 2 cores left over doing nothing, just like I've got one Civic Si sitting parked in my driveway while I drive the other.

    And no, the Q8200 is not necessarily a better performer. In most applications the PII 920 will outperform it (it'll even compete with the Q9300, which is more recent), and now it is priced similarly ($164), making it a much better value. But the PII 720 has longevity on its side, because it's compatible with AM3. The Q8200 has nowhere to go, and LGA775 is a dying breed. I like knowing I can upgrade my system in bits and pieces further on down the road as I can afford them, rather than having to fully replace my motherboard, CPU, and RAM all at once. I can do the CPU now, the motherboard and RAM later when prices come down, and then when a better high-end CPU comes along I can upgrade to that as well.

    Plus, while the i7-920 may beat anything AMD has right now, it's a terrible value for the money. Between an $800 PII 720 gaming rig and an $800 i7-920 gaming rig, the 720 allows room in the budget for better graphics, more RAM, more HD space... the i7 just doesn't allow for a very balanced system on a budget.

    And I have yet to see a single commercially available example of an Atom paired with a 9400M. All of the ones out there are engineering/testing samples. But when that does come along, I will gladly get it and put it in my car PC.
  • strikeback03 - Monday, March 30, 2009 - link

    What is the need for a 9400M in a car PC? How much GPU does it take to run a front end?
  • tshen83 - Sunday, March 29, 2009 - link

    If you haven't figured it out by now, both Intel and AMD push higher-TDP CPU parts onto consumers, and save the really good CPUs (performance per watt) for the data centers in the 2P space.

    My original post was meant to tell Gary Key, the author, that his "paid" assertion that "Phenom is competitive as a gaming platform" is flawed, because all the gaming benchmarks are GPU bound. That means the CPU can be a lot weaker before it shows up on the FPS charts.
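    The GPU-bound claim can be modeled very simply: the observed frame rate is capped by the slower of the two stages. A toy sketch, with made-up frame rate caps purely for illustration:

```python
def observed_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    # In a GPU-bound game, the slower stage sets the frame rate
    return min(cpu_fps_cap, gpu_fps_cap)

# If the GPU tops out at 60 fps, a CPU good for 90 fps
# and one good for 150 fps produce identical charts:
print(observed_fps(90, 60))   # 60
print(observed_fps(150, 60))  # 60
```

    This is why a weaker CPU can look "equivalent" in FPS charts while being far behind in CPU-limited workloads.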

    There is no reason to save 30 dollars on the X3 at all, because the X4 is just a dinner bill away. Have some mac and cheese for dinner and you'd have enough money for the Phenom X4 920. (Not that it is a smart choice at 125W TDP, but it surely beats the X3 by far.)

    There are actually far better AMD CPUs to get than the Phenoms. AMD Shanghai 2376s are at the same price now, allow scalability to two sockets, and take only 75W TDP as well. The Tyan S2912G2NR board is 60 dollars at Newegg and supports two Shanghais. The Phenom X4 is the CPU that didn't make the "Shanghai" grade and got flushed down to unsuspecting consumers. The X3 is a castrated X4. You get the point.

    BTW, the i7 920 will get far better longevity as a platform than any AMD processor right now. Nehalem-EP will likely bankrupt AMD for good this time (in fact AMD would already be broke a few times over if it weren't selling blood to the Dubai oil money). Your argument that Phenom IIs will outlast Socket 775 is correct, except in the case of AMD going belly up; then you are stuck with a 125W TDP heater that's half the speed of the i7-920.

    Good luck, I am taking off for the day. Get ready for tomorrow, when Nehalem-EP will be revealed, and it will put AMD out of its pathetic misery.
  • moriz - Sunday, March 29, 2009 - link

    some bold assertions. care to prove any of those?

    out of the three tech giants (Intel, nVIDIA, AMD), AMD currently has the best platform: only AMD can deliver the complete package using only their products. this will be a pretty big advantage down the road, and i think Intel and nVIDIA both know this.

    therefore, AMD is not going away.
  • 7Enigma - Tuesday, March 31, 2009 - link

    Don't feed the TROLLS.
  • tshen83 - Monday, March 30, 2009 - link

    Only AMD can deliver the complete "CRAP" package to idiots pretty much. Let's see, the CPUs aren't as good as the Nehalems. The GPUs get their butts kicked in GPGPU modes. It takes AMD's 800 Stream processors to fight Nvidia's 240. Talk about freaking copy and paste engineering.

    In the enterprise market, IT managers will soon realize that the glorious AMD days (HP DL585) are gone, and the 8-socket Nehalem-EX will be a 128-thread monster with 1-2TB of RAM (128 FB-DIMMs), killing off Itanium (HP Superdome) along the way.

    "AMD is not going away". Really? I have AMD's tombstone marked June 2010, with Hector Ruiz's name on it too. The Global Foundry spinoff dollars won't even last them 3 months.

  • Hacp - Sunday, March 29, 2009 - link

    Does the Opteron have unlocked multipliers? I'm thinking no!
  • Repr - Sunday, March 29, 2009 - link

    So far the E8x00 series has been the most popular choice for gaming machines (in the Netherlands at least). However, after checking a few price comparison sites, I found that the X3 720 is cheaper than the E8400. I wonder how the triple core would stack up against the Intel dual core.
  • hansmuff - Saturday, March 28, 2009 - link

    Thanks for the article.
    I do have a request: for those games where you use timedemos or other recorded input files, would you be willing to link to them in the article?
    It'd be nice to compare one's own machine to those benchmarked.

    Thank you!
  • 7Enigma - Tuesday, March 31, 2009 - link

    My guess is they want to keep them closed so "optimizations" don't take place that would give an unfair advantage to one side or the other.
  • Roland00 - Saturday, March 28, 2009 - link

    This article was near perfect in what I am looking for in a price-for-performance comparison: showing the difference between CPUs at different frequencies, with/without Crossfire, and showing the minimum frame rate in an easy-to-read manner. To top it all off, you had real analysis in the text.

    Well, I am glad to know the X3 Phenom IIs are capable gaming chips, with a nice OC and sometimes Crossfire.

    Then again, at the price the X4 Phenom IIs are going for, you may just want to get an X4 Phenom II.

    Currently Newegg has the X4 920 at $164. It has a promo code for 30 dollars off (AMD32530), making the CPU $134 with free shipping and, in most places, no tax.
  • Hrel - Saturday, March 28, 2009 - link

    Great article, good stuff to know. However, in the future when you're doing a CPU article, I'd appreciate it if you'd include results for the E8400 Core 2 Duo, as that was the processor to get for quite a while. I'm sure I'm not the only one who put that CPU in just about every computer I built. It'd be nice to see how it stacks up against the newer CPUs.

    On another note, it'd be nice if you could include performance results for the 8800GT/9800GT in future GPU articles, as that WAS the GPU to get for so long. I'm sure there are tons of people out there who still have that GPU running in their systems. No matter what the point of the article is, or what level of intensity you're putting the cards through, it's always nice to have a baseline for comparing what you have now against what's just coming out.
  • 7Enigma - Tuesday, March 31, 2009 - link

    My gaming computer, built back in January, uses the E8500, as at the time it was the best bang for the buck for me. I have it OC'd to 3.85GHz on stock voltage, with tremendous headroom if I ever decide to up the voltage (Xigmatek 120mm rifle CPU cooler). If I were building today I would probably go with the AMD X3, but for my gaming (currently on a 19" LCD, with plans to go to 24" probably in the next 2 years), the E8500 has more than enough grunt.
  • atlmann10 - Saturday, March 28, 2009 - link

    The percentage of people who actually use a computer to its full abilities today is so minimal it's almost self-defeating. The percentage on these boards, and most hardware discussion boards, would be a good bit different, but averaged across computer users in the US it is unremarkable at any level. Don't get mad at me; I am not talking, at least specifically, about most of the people on here. Software development, much less released product, is so far behind the hardware market that it is also unremarkable. Yes, maybe somewhere between 3-10% of newly released high-end games put a computer to a decent percentage of usage, but that percentage is unremarkable at best.

    Next month AMD releases the rest of the Phenom 2's, and in September Intel releases its newest processors, making this even more so. I think when I upgrade (probably the middle of this summer, July-ish), I will most likely go for the 920. But some of the arguments in this exact forum are ridiculous and childish, and I imagine that super harsh comment was posted by a 13-year-old. At least I would hope so; if not, that commenter specifically needs to truly evaluate their mental sense of operation. For childish comments like that, go to the Nickelodeon website, if they have a discussion of this type of equipment.

    Anyway, the main point of my post is this: software developers need to get off their A77es and make some products that use the hardware to at least a 50% level more than 5% of the time.
  • Beno - Saturday, March 28, 2009 - link

    This just shows that those games aren't written for quad-core processors.
  • Summer - Saturday, March 28, 2009 - link

    The article did a good job concluding that one can build a good gaming machine without spending too much. Emphasizing real game performance is a plus, especially for the average consumer who just wants a decent system to play today's popular games. Hopefully the average AnandTech reader won't think too much about the article and turn it into another AMD versus Intel e-penis thread.

    SIDENOTE: I'll definitely be looking forward to the Northbridge article.

  • nubie - Saturday, March 28, 2009 - link

    What about performance of the X2 7750, which you can buy on eBay for $49 (free shipping) and Newegg for ~$65 (+tax and shipping)?

    I haven't even seen a review of this processor that I can remember, is it Phenom I or II? Is it built from a quad-core die like the x3?

    They seem to overclock well, would they do for a real budget gaming system, say ~$300-400 for the entire system including a HD4xxx or 9800 series card?

    I appreciate your "mainstream" bias, but some people have no money and just want to run the games or keep their system usable without laying out more than $100 for an upgrade.
  • Roland00 - Saturday, March 28, 2009 - link

    The X2 7750 is a Phenom 1 chip with two cores disabled. I haven't seen reviews here, but there have been several at other sites. At stock it is comparable to an e5200 from Intel (the X2 7750 is barely faster).

    Once you start OCing, the e5200 is the better chip, since it has far more OC headroom. In addition, the e5200 is comparable in price to the X2 7750.

    Eventually we are going to get Phenom II dual cores but that is going to be several months from now.
  • nubie - Saturday, March 28, 2009 - link

    Yeah, I found an e5200 for $59 on eBay. As soon as I can get my P6N RMA'd by MSI to support 45nm processors, I am going to try for 4GHz (my Scythe Infinity should be up to the task ;) )

    If AMD and DFI hadn't dropped support for my Infinity-AM2 I might have stayed with AMD.

    Thanks for the info on the Phenom 'x2'. I wonder when we will get a Phenom II tech processor for under $100 (preferably closer to $50).
  • iamezza - Tuesday, March 31, 2009 - link

    The X2 7750 actually does really well in gaming benchmarks compared to the e5200, but gives up a bit in application performance.
    It uses a lot more power than the Intel chips, though, and has less overclocking headroom.
    It does have a potentially better upgrade path, with the AM2+ socket being able to accept future AM3 CPUs from AMD, whereas Intel's Socket 775 won't be getting any new CPUs.
  • buzznut - Saturday, March 28, 2009 - link

    So I didn't see too many comments that were actually about the article; go figure.
    I think it's awesome that someone is writing articles for "the rest of us": people who cannot afford the latest and greatest and have to make compromises to build themselves a new system, and who don't receive free products to test that would let them build the most ridiculous, benchmark-busting, $4000, top-of-the-top-end behemoth.

    If I had seen this article a month ago, I might not have bought the PII X4 940, and yet I am still glad I did. I do a lot more than just gaming on my PC.

    I would find it even more interesting to take the Intel processors and clock them to 3.8GHz (or the same speed as the AMD processor) and run the same battery of tests; yes, I know that is not the point of this article.

    And I see why you would want to see the max performance available from each chip.

    About a year ago, or even as recently as 6 months ago, everyone was saying what a stupid move it was for AMD to acquire ATI and basically counting AMD out as far as ever competing again. Right now, AMD doesn't look too stupid to me. It seems to me they are doing quite well with developing the "platform" as their strategy for getting their share of the market.

    Look at the way Nvidia and Intel are fighting right now. I think AMD has the right idea and is moving in a sound direction. I think they have compelling products, certainly from a budget PC user's standpoint. I know others will not agree, judging from AMD's bottom line in the recent past and even currently, but they appear to be moving in a positive direction.

    I think AMD's graphics division is firing on all cylinders now. And as "bad" as the original Phenom was, they have become competitive again with Phenom 2. I am pretty impressed with the turnaround.

  • yyrkoon - Saturday, March 28, 2009 - link

    You know, I have been thinking it would be really cool if you guys did a story on *why* a specific game title performs better on various hardware. Does id Software optimize for Intel? AMD? Nvidia? AMD/ATI? What about other game developers? Could it be Microsoft's "fault"?

    You know, all that sort of "jazz" :)
  • MadMan007 - Saturday, March 28, 2009 - link

    I would have liked to see idle and load power consumption numbers. I know that my PC does not run at load at least half the time, if not more, so idle power consumption is important to me and matters for TCO.

    That's the only thing missing from this article, otherwise nice succinct writeup.
  • gnesterenko - Saturday, March 28, 2009 - link

    Well, if I were buying a system today, I'd have to go for the i7 920 by these numbers, BUT there are a few very interesting options coming soon. First is the new C2D from Intel, the E8700 clocked at 3.5GHz. Although only a dual core, that's a really fast clock per core, and I'm sure it would OC to 4.5GHz on air like a champ, considering how well the other C2Ds OC. The other is the Phenom II 955 clocked at 3.2GHz. This is the first quad AM3 CPU from AMD to break the 3GHz barrier and should be an interesting option as well. In any case, I'd like to see another one of these articles including these two once they're out.

    Either way, though, I won't be picking a platform until I see performance numbers for the RD890 and SB800 platform from AMD. This is going to be a merry X-mas!
  • TMike7 - Saturday, March 28, 2009 - link

    The quality of your articles is really outstanding; I love reading them.
    Some time ago I read an article about memory, and the conclusion was that more memory improves the overall performance of a given system more than more expensive memory does.
    Could you please include, in your DDR2 versus DDR3 testing, one or several tests with 8GB of DDR2 memory (2 kits of 2x2GB)? It would really be nice to see how the Phenom X3 720BE copes with all four memory slots populated, and how far it can still overclock.

  • martenlarsson - Saturday, March 28, 2009 - link

    He paid $400 for the entire setup excluding the GPU; that's just a tad more than you pay for the cheapest i7, CPU only...

    Really nice article, and it shows you don't need a monster CPU to game. The X3 720 is looking more and more like the chip to buy.
  • erik006 - Saturday, March 28, 2009 - link

    In the article index, "Opposing Forces" is displayed. That should be "Opposing Fronts."
  • JarredWalton - Saturday, March 28, 2009 - link

    Gary's been playing the new cross-genre game that combines HL2 with RTS gameplay, I suppose. We could tell you more about it, but then we'd have to kill you... ;-)
  • jaggerwild - Saturday, March 28, 2009 - link

    You spent four hundred on an mATX build when for a few hundred more you could have a bleeding-edge i7 that will clock higher? You must be a FAN BOY with yer very mature remarks!
    Oh yeah, my momma says hello :)
  • abzillah - Sunday, March 29, 2009 - link

    This is why I bought my Phenom 720. On January 18th I got laid off from a biotech company, and I haven't had any luck finding a job. Two weeks ago I sold my 2-year-old PC for $350 to a friend whose kid needed a new PC but didn't want to spend much. So then I had $350, and I got $100 for painting some stuff around his house. So please tell me how I could get myself a Core i7 for $450, unless you'll give me the rest of the money for free.
    Yesterday I got hired part time at a hardware store, and after I pay down some of my credit cards, I will buy myself a 4890. You can call me a fanboy all you want, but I see it as smart economics.
    I use mATX boards because I don't add anything to the board besides a video card, so the extra PCI slots aren't needed. I use my PC to surf the net, watch movies, play video games, and use Microsoft Office.
  • iamezza - Tuesday, March 31, 2009 - link

    he was being sarcastic ;)
  • Griswold - Saturday, March 28, 2009 - link

    Hi Bertha!
  • StormyParis - Saturday, March 28, 2009 - link

    Do we have any idea of the percentage of gamers who actually have an Xfire (or Nvidia's equivalent) config? I personally know of none, and always wonder when I see such a review what use it is...
  • MrSpadge - Saturday, March 28, 2009 - link

    All the people I know who do multi-GPU do it for BOINC or Folding.
  • TA152H - Saturday, March 28, 2009 - link

    Does the 9550 have the same issues on the X48? You mentioned that the differences were minor, so I'm guessing it did, but even a few fps seems relevant there, since that's all the margin the AMD processors had.

  • just4U - Saturday, March 28, 2009 - link

    I agree with your conclusions about the 720BE. It's a nice compromise for those deciding between dual cores and quads. It's also at a really reasonable price that competes against the 8400, and maybe to a lesser extent the 7500, which I think is priced too high. For those who want a quad but might not be able to afford it, this is an excellent substitute that's actually cheaper than the other two... (go figure!)

    If someone offered me an 8400 or the 720BE for free and said choose, I'd take the 720 without even a thought. Hell, it's even got a selling point over the 920 in being a Black Edition with unlocked multipliers.

  • abzillah - Saturday, March 28, 2009 - link

    I have my Phenom II 720 overclocked to 3570MHz using a Xigmatek cooler. Without the video card and hard drive, I spent $398 plus tax. That covers a Gigabyte micro-ATX board, Corsair PSU, Mushkin 1066 DDR2 RAM, case, Xigmatek CPU cooler, and two Xigmatek case fans. Now I am waiting for the 4890 to come out and prospecting some SSDs for a RAID setup.
    I love how much money I saved that I can now use towards a video card and two SSDs. So far I love the performance on my new PC.
  • garbageacc3 - Saturday, March 28, 2009 - link

    you spent so little because you went with mATX.

    mATX mobos SUCK more than full ATX
  • albundy2 - Saturday, March 28, 2009 - link

    troll much?

    i know your dad lost his job and your mother to an american, but would it not be more constructive to seek therapy. we didn't do it.
    i know you hurt, would you like a hug? some gummy bears perhaps?
    a OO buck load to the forehead?

    your off to a great start kid. suicide, please consider it...
  • garbageacc3 - Sunday, March 29, 2009 - link

    fag fuck much?

    i know you're a teabag loving faggot.

    americunts are dumb, fat, and big'o smelly fishy PUSSIES and no one would want to fuck any of you fatties.

    i don't want a hug cause your man tits would crush me.

    you're already at a crappy end fat 40 yr old virgin fuck. go off yourself. remember, down the street, not across it.

    castration, do it.
  • ap90033 - Wednesday, April 01, 2009 - link

    Wow guess Mommy let you use her pc tonight eh?

    Get a life. No wait please post some other useless comment, I am sure that is your calling in life.
  • moriz - Sunday, March 29, 2009 - link

    you first.
  • TA152H - Saturday, March 28, 2009 - link

    You're worse than he is.

    What an ugly, witless post. Not that his was much better.
  • JonnyDough - Saturday, March 28, 2009 - link

    Was yours any better? Maybe we should just continue the trend of garbage not worth reading with my post.

    asdfjkl; MmmK? /end all the trash talk
  • ap90033 - Wednesday, April 01, 2009 - link

    Wow, words cannot explain the total lack of intelligence your posts show, you troglodyte.
  • TA152H - Saturday, March 28, 2009 - link

    Holy hypocrisy Batman!

    Actually, I didn't attack him directly, or tell him to commit suicide, or anything like that, I just attacked his post.

    You, on the other hand, did exactly what you criticized. Couldn't you see that?
  • Griswold - Saturday, March 28, 2009 - link

    Ok, I'll say it: you all score equally high on the fool scoreboard.
  • Clauzii - Saturday, March 28, 2009 - link

    Amen to that!
  • poohbear - Sunday, March 29, 2009 - link

    To the OP: all the power to you for saving money. Nothing wrong w/ mATX at all if you don't need all the features. I'd go w/ an mATX any day if it meant I could get a 4890!!
  • Clauzii - Sunday, March 29, 2009 - link

    Yes, I forgot to write 'except the OP', but what went on in between was hilarious :)
