Posted May 11, 2006 G'day folks, I'm looking at upgrading my p***y 32 MB video card and found this one: nVidia GeForce 6200V+ 256MB/DDR2. Is it worth getting? It's definitely in my price range, so it's what I'm leaning towards, but I'm after some expert advice on its usefulness. Cheers :P
May 12, 2006 If I know one thing about nVidia cards, it's this: the FX series and up are the best because they offer so many useful features, like PureVideo. I have a crappy GeForce 4 which probably isn't worth a dime. BTW, for more info on all its features: http://en.wikipedia.org/wiki/GeForce_6000#GeForce_6200_2
May 12, 2006 The nVidia GeForce 6200 GPU is "Windows Vista Ready", so if you are thinking of upgrading to Windows Vista, you won't need to upgrade your graphics adapter again... however... the GeForce 6200 is not SLI ready... in summary, NVIDIA SLI lets you improve graphics performance by installing 2 graphics adapters (with SLI-ready GPUs) in a single system (cool huh)... as for the 256MB/128-bit memory buffer, I believe it is more than enough for gaming and daily usage...
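One way to put that "256MB/128Bit" figure in perspective is to look at theoretical memory bandwidth rather than memory size alone, since bus width is what usually separates the decent boards from the cut-down ones. Here is a minimal Python sketch, assuming a DDR2 effective rate of roughly 533 MT/s purely for illustration (the listing doesn't state the actual memory clock, so treat the numbers as placeholders):

```python
# Back-of-the-envelope peak memory bandwidth for a graphics card.
# The 533 MT/s effective DDR2 rate below is an assumed figure for
# illustration only -- check the card's spec sheet for the real number.

def peak_bandwidth_gb_s(bus_width_bits: int, effective_rate_mt_s: float) -> float:
    """Theoretical peak bandwidth = bytes per transfer * transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_rate_mt_s * 1e6 / 1e9

print(f"128-bit bus: {peak_bandwidth_gb_s(128, 533):.1f} GB/s")  # ~8.5 GB/s
print(f" 64-bit bus: {peak_bandwidth_gb_s(64, 533):.1f} GB/s")   # ~4.3 GB/s
```

The point of the comparison: some budget 6200 boards ship with a 64-bit bus, which halves the bandwidth regardless of how much memory is soldered on, so a 128-bit board is the one worth paying for.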
May 12, 2006 ...the GeForce 6200 is not SLI ready... in summary, NVIDIA SLI lets you improve graphics performance by installing 2 graphics adapters (with SLI-ready GPUs) in a single system (cool huh)...

I've never liked the idea of SLI. If it were multiple GPUs on one board (like the new dual-core processors), that would be one thing. But with SLI, you need a motherboard with two x16 slots, and the heat produced and power sucked up by two separate cards is tremendous! Plus, I have this fear that in two years or so we're gonna have a new three-video-card spec, then four, until one day we basically have a tower full of graphics adapters! Kinda like speakers: we had 4.0 surround sound, then 5.1, then 6.1, then 7.1, then 8.2... [sarcasm] add one more and the sound is that much better! [/sarcasm]

Anyway, I'm done with my rant. In answer to your question, Senuty, the specs on that card are pretty respectable, but I noticed there isn't a fan on the heatsink... you may want to read some reviews to see whether it runs hot. Otherwise, it doesn't look like a bad buy. I buy most of my computer parts from New Egg. Their prices (especially video card prices) are fantastic, and they have never sent me parts that were late or damaged (and I've ordered quite a few parts from them!). You may want to check them out before you buy.
May 12, 2006 Good analysis there... it sounds logical... but a tower full of graphics adapters is overdoing it... :P
May 13, 2006 Author Thanks for the help and info, folks. Muchly appreciated. Now all I have to do is convince SWMBO that it's a good buy, why I need it, and how it's going to make my PC function better... Cheers :P
May 13, 2006 Good analysis there... it sounds logical... but a tower full of graphics adapters is overdoing it...

It doesn't make quite as much sense to have something like this: [attached picture]. Not to mention the quad-cores coming out. However, the heat that comes out of actually making FASTER components is probably less than that produced by two smaller ones. At least that's what AMD, Intel, and Nvidia are thinking.
May 14, 2006 ...the heat that comes out of actually making FASTER components is probably less than that produced by two smaller ones. At least that's what AMD, Intel, and Nvidia are thinking.

I think you've got that backwards, zaph (unless I misunderstood). Two SLOWER cores produce LESS heat than one FASTER core. That is why Intel and AMD are putting multiple CPU cores on one chip instead of cranking up the clock another hundred or so megahertz. Intel has already reached the "fast single-core" thermal limit, and AMD is bound to eventually if they keep making faster single-core CPUs. That is why multiple cores are such a big deal: you get nearly double the throughput with only a modest increase in heat.

Therefore, when you have multiple video cards, as opposed to a multiple-core GPU, MORE heat is produced. That is why I don't like SLI, CrossFire, or any other multiple-video-card solution. They just seem silly in the long run. I know that I won't be getting a mobo with multiple PCI-E x16 slots, but I will be getting a board with support for dual-core CPUs. I prefer lower power consumption and a longer lifespan over raw speed. A couple of extra frames per second isn't worth a toasted case.
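The "slower cores, less heat" intuition can be made concrete with the usual dynamic-power approximation P ≈ C·V²·f: pushing the clock higher generally also means raising the core voltage, and voltage enters that formula squared. Here is a rough Python sketch using made-up illustrative numbers (the voltages and clock speeds below are assumptions, not real CPU specs):

```python
# Toy comparison using the dynamic power approximation P ~ C * V^2 * f.
# All numbers below are illustrative assumptions, not measured CPU figures.

def dynamic_power(c: float, volts: float, freq_ghz: float) -> float:
    """Relative dynamic power for one core: P = C * V^2 * f."""
    return c * volts**2 * freq_ghz

C = 1.0  # arbitrary switched-capacitance constant (same silicon either way)

# One core pushed to 3.8 GHz typically needs a higher voltage...
single_fast = dynamic_power(C, volts=1.40, freq_ghz=3.8)

# ...while two cores at 2.0 GHz each can run at a lower voltage.
dual_slow = 2 * dynamic_power(C, volts=1.20, freq_ghz=2.0)

print(f"single fast core : {single_fast:.2f} (relative units)")
print(f"two slower cores : {dual_slow:.2f} (relative units)")
# With these numbers the dual setup burns roughly 23% less power while
# offering ~5% more aggregate clock -- the gist of the dual-core argument.
```

Of course real workloads rarely scale perfectly across two cores, so this sketch only speaks to heat and power, not to delivered game performance.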
May 14, 2006 I don't think either NVIDIA or ATI will release graphics adapters in a dual-core manner like that... since things are currently moving towards Quad SLI... so for now, it is either buying a single powerful graphics adapter or using SLI... or both... the thing that concerns me the most is power consumption... Quad NVIDIA® SLI only uses 2 PCIe slots, though... not like in the picture... it scared me at first... :P
May 14, 2006 I think you've got that backwards, zaph (unless I misunderstood). Two SLOWER cores produce LESS heat than one FASTER core.

Kinda sorta. The heat that comes out of a faster processor is greater than that of two slower processors.

That is why Intel and AMD are putting multiple CPU cores on one chip instead of cranking up the clock another hundred or so megahertz. Intel has already reached the "fast single-core" thermal limit, and AMD is bound to eventually if they keep making faster single-core CPUs. That is why multiple cores are such a big deal: you get nearly double the throughput with only a modest increase in heat.

I've heard a lot about how dual cores are incompatible with games and various applications, though it looks like that's improving a lot. The next upgrade I'm looking at is after AM2 comes out, when I expect 939-pin mobos to go down in price. I'm thinking about the Abit KN8-32X bundled with the AMD 3800+ X2, or maybe even the 4200+ X2 if my parents are feeling generous.

Quad NVIDIA® SLI only uses 2 PCIe slots, though... not like in the picture... it scared me at first...

I wonder if they could cram Quad SLI 7900GTXs into that motherboard. Imagine. Quad-quad 7900GTX. You'd need a kilowatt power supply, but still... playing Oblivion at 1600x1200 maxed out at 40 FPS OUTSIDE??? ^_^
May 15, 2006 I've heard a lot about how dual cores are incompatible with games and various applications, though it looks like that's improving a lot. The next upgrade I'm looking at is after AM2 comes out, when I expect 939-pin mobos to go down in price. I'm thinking about the Abit KN8-32X bundled with the AMD 3800+ X2, or maybe even the 4200+ X2 if my parents are feeling generous.

Not compatible? Like they won't play properly, or do you mean that they just don't take advantage of both cores? I hadn't heard of dual cores being incompatible with any games, but then I really don't game all that much...

I wonder if they could cram Quad SLI 7900GTXs into that motherboard. Imagine. Quad-quad 7900GTX. You'd need a kilowatt power supply, but still... playing Oblivion at 1600x1200 maxed out at 40 FPS OUTSIDE??? ^_^

Ooooo! That got me all hot and bothered! Restroom break... Oblivion @ 1600 x 1200 --> <-- Me
May 16, 2006 Author The first part wasn't.

Agreed Lokoike, it wasn't... and who made MrT the Topic Policeman??... Cheers :P
May 16, 2006 The first part wasn't.

Agreed Lokoike, it wasn't... and who made MrT the Topic Policeman??... Cheers

Do you mean only a mod can use this one? I'm not trying to be one, anyway...
May 16, 2006 Author Do you mean only a mod can use this one? I'm not trying to be one, anyway...

Nope, anyone's free to use any icons on this forum, else Tarun wouldn't have made them available for use. I was just commenting on your rather verbose use of a single icon without bothering to back it up with any form of constructive criticism about why the post was 'off topic', which, as Loko pointed out, it technically wasn't. And even if it was 'off topic', that's why we have mods here; they'll let us know if something is off topic quick enough.

Not trying to put the agro on you here, just trying to point out that the use of a single icon on its own (in this case a 'strong' one) can make it seem like you're (generic usage) above the other person, and it can demean their post. Cheers :D
May 16, 2006 If you noticed, Lokoike indicated... "The first part wasn't."

Of coz I believe that everyone can use any smileys provided... if not, they would not be available... like what you said... my smiley doesn't really mean 'off topic' or that I was trying to be offensive, in this case to Lokoike... I was joking around with Lokoike regarding his post (the second part)... if my post really was offensive, I apologise then...
May 16, 2006 I was joking around with Lokoike regarding his post (the second part)... if my post really was offensive, I apologise then...

Sorry about the confusion. The other guys can tell you I like to mess around, but I don't mean anything by it. And no need to worry, MrT, you didn't offend me, just surprised me. But since I was joking, and since you were joking... sa'll good in da hood.

Oh, and my personal little rule of thumb is: if I'm joking around, I usually follow up my post or comment with a lil' tongue dude. That way I can say things like "ZAPHIRER SMELLS LIKE BAD CHEESE!!!", and as long as I end it with a lil' tongue dude, he won't care. So if you put a lil' tongue dude in there instead of just the icon on its own, people will see that you are speaking in jest. See? Isn't it great?!
May 16, 2006 Author I agree with both of you on this, as it's extremely hard to convey 'tone' on a written forum. The only real way to inject any form of tone is to follow Loko's examples; that way people can see by the ending emoticon what type of tone is intended.

@MrT - I guess I fell into the same trap by not using the appropriate smiley when I made the comment about you being the 'Off Topic Policeman', but I thought I had it covered with my closing "Cheers", as I was trying to convey a messing-around tone. You can tell the tone of my posts by the ending: if you see "Cheers", I'm usually messing with you. If you don't see it, you've made it onto my 's***' list. Cheers :D
May 17, 2006 I get what you mean, Lokoike...

You can tell the tone of my posts by the ending: if you see "Cheers", I'm usually messing with you. If you don't see it, you've made it onto my 's***' list. Cheers

...and you too... Cheers :D
May 18, 2006 Senuty, this box I'm on has an 8 MB SiS card; I would love to have your "p***y" 32 MB card. I'm sure it wouldn't fit, though. :( :wish:
May 18, 2006 Author Senuty, this box I'm on has an 8 MB SiS card; I would love to have your "p***y" 32 MB card. I'm sure it wouldn't fit, though. :(

What sort of graphics are you running, ASCII?... Heh, guess I don't feel too bad now... but still, an upgrade is in the wind for me; now I just have to convince the finance manager that I REALLY do need it. ::sigh:: Another uphill battle coming up. Cheers :D
May 26, 2006 That way I can say things like "ZAPHIRER SMELLS LIKE BAD CHEESE!!!", and as long as I end it with a lil' tongue dude, he won't care. So if you put a lil' tongue dude in there instead of just the icon on its own, people will see that you are speaking in jest. See? Isn't it great?!

Ha. You're funny. I won't care even if you manage to forget the little tongue dude =D Cheese is good.

On topic... by the way, I have a GeForce FX5200... it's just barely enough to get you through most modern games (everything on low, maybe a few tweaks). I can play Oblivion, though it's absolutely horrible outside (check out the screenshot I took in XFire). The best I can get is 17 fps outside, with the sky turned off (hey, it gives me 2 fps!). Inside, though, it's completely smooth, 40 fps FTW!
May 26, 2006 By the way, I have a GeForce FX5200... it's just barely enough to get you through most modern games (everything on low, maybe a few tweaks). I can play Oblivion, though it's absolutely horrible outside. The best I can get is 17 fps outside, with the sky turned off. Inside, though, it's completely smooth, 40 fps FTW!

Many of my friends who own the FX5200 are complaining about it... I can't believe it is really that bad~ :P