
Video Card



The nVidia GeForce 6200 GPU is "Windows Vista Ready", so if you are thinking of upgrading to Windows Vista, you don't need to upgrade your graphics adapter... however...

The nVidia GeForce 6200 GPU is not SLI ready... in summary, NVIDIA SLI lets you improve graphics performance by installing two SLI-ready graphics adapters in a single system (cool, huh? :P )

With a 256MB memory buffer on a 128-bit bus, I believe it is more than enough for gaming and daily usage...
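Just to put that 128-bit figure in perspective, here is a rough peak-bandwidth calculation; the memory clock used below is an invented example rather than the card's actual spec:

    # Rough peak memory bandwidth = bus width in bytes x effective memory clock.
    # The clock value is made up for illustration; check the card's real spec sheet.
    bus_width_bits = 128
    effective_clock_hz = 550e6  # assumed DDR effective rate, purely illustrative

    bandwidth_bytes_per_s = (bus_width_bits / 8) * effective_clock_hz
    print(f"~{bandwidth_bytes_per_s / 1e9:.1f} GB/s peak")  # ~8.8 GB/s with these numbers

The same arithmetic shows why a 256-bit card of the same generation roughly doubles that ceiling.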


...The nVidia GeForce 6200 GPU is not SLI ready... in summary, NVIDIA SLI lets you improve graphics performance by installing two SLI-ready graphics adapters in a single system (cool, huh? :P )...


I've never liked the idea of SLI. If it were multiple GPUs on one board (like the new dual-core processors), that would be one thing. But with SLI you need a motherboard with two x16 slots, and the heat produced and power drawn by two separate cards is tremendous!

Plus, I have this fear that in two years or so we're gonna have a new three-video-card spec, then four, until one day we basically have a tower full of graphics adapters! Kinda like speakers: we had 4.0 surround sound, then 5.1, then 6.1, then 7.1, then 8.2... [sarcasm] add one more and the sound gets that much better! [/sarcasm]

Anyway, I'm done with my rant. In answer to your question, Senuty: the specs on that card are pretty respectable, but I noticed there is no fan on the heatsink... you may want to read some reviews to see whether it runs hot. Otherwise, it doesn't look like a bad buy.

I buy most of my computer parts from New Egg. Their prices (especially video card prices) are fantastic, and they have never sent me parts that were late or damaged (and I've ordered quite a few parts from them!). You may want to check them out before you buy.


Good analysis there... it sounds logical... :P but a tower full of graphics adapters is overdoing it... :P


It doesn't make quite as much sense to have something like this:

Quad_SLI_3.jpg

Not to mention the quad-cores coming out.

However, the heat that comes from actually making FASTER components is probably less than that produced by two smaller ones. At least, that's what AMD, Intel, and Nvidia are thinking.


...the heat that comes from actually making FASTER components is probably less than that produced by two smaller ones. At least, that's what AMD, Intel, and Nvidia are thinking.


I think you've got that backwards, zaph (unless I misunderstood). Two SLOWER cores produce LESS heat than one FASTER core. That is why Intel and AMD are putting multiple CPU cores on one chip instead of cranking up the clock another hundred or so megahertz. Intel has already reached the "fast single-core" thermal limit, and AMD is bound to eventually if they keep making faster single-core CPUs. That is why multiple cores are such a big deal: you get double the speed but only a slight increase in heat.

Therefore, when you have multiple video cards, as opposed to multiple-core GPUs, MORE heat is produced. That is why I don't like SLI, CrossFire, or any other multiple video card solutions. They just seem silly in the long run.
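As a rough back-of-the-envelope sketch of that point (invented numbers, assuming the usual CMOS dynamic-power relation P ~ C * V^2 * f, where pushing the clock usually also means raising the voltage):

    # P ~ C * V^2 * f. Every figure below is made up purely for illustration.
    def dyn_power(cap, volts, freq_ghz):
        return cap * volts ** 2 * freq_ghz

    one_core_base = dyn_power(1.0, 1.20, 2.0)    # one core at 2.0 GHz, 1.20 V
    two_cores_base = 2 * one_core_base           # dual core at the same clock and voltage
    one_core_fast = dyn_power(1.0, 1.55, 4.0)    # one core pushed to 4.0 GHz, needing more voltage

    print(round(two_cores_base / one_core_base, 1))  # 2.0x the power for roughly 2x the throughput
    print(round(one_core_fast / one_core_base, 1))   # ~3.3x the power for 2x the clock

With those made-up figures, doubling the clock of a single core costs noticeably more power (and heat) than adding a second core at the original clock, which is exactly the trade-off described above.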

I know that I won't be getting a mobo with multiple PCI-E x16 slots, but I will be getting a board with support for dual-core CPUs. I prefer lower power consumption and a longer lifespan over raw speed. A couple extra frames per second isn't worth a toasted case.


I don't think either NVIDIA or ATI will release graphics adapters in a dual-core manner... things are currently moving towards Quad SLI... so for now, it is either buying a single powerful graphics adapter or using SLI... or both... the thing that concerns me the most is power consumption...

Quad NVIDIA® SLI only uses two PCIe slots... not like in the picture... it scared me at first... :P


I think you've got that backwards, zaph (unless I misunderstood). Two SLOWER cores produce LESS heat than one FASTER core.


Kinda sorta. The heat that comes out of one faster processor is greater than that of two slower processors.

That is why Intel and AMD are putting multiple CPU cores on one chip instead of cranking up the clock another hundred or so megahertz. Intel has already reached the "fast single-core" thermal limit, and AMD is bound to eventually if they keep making faster single-core CPUs. That is why multiple cores are such a big deal: you get double the speed but only a slight increase in heat.


I've heard a lot about dual cores being incompatible with games and various applications, though it looks like that's improving a lot. The next upgrade I'm looking at is after AM2 comes out, when I expect Socket 939 mobo prices to come down. I'm thinking about the Abit KN8-32X bundled with the AMD 3800+ X2, or maybe even the 4200+ X2 if my parents are feeling generous :P

Quad NVIDIA® SLI only uses two PCIe slots... not like in the picture... it scared me at first...

I wonder if they could get that motherboard to cram the Quad SLI 7900GTX in there. Imagine: quad-quad 7900GTX :P You'd need a kilowatt power supply, but still... playing Oblivion at 1600x1200, maxed out, at 40 FPS OUTSIDE??? ^_^
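For what it's worth, a very crude power budget (every wattage below is a guess for illustration, not a measured figure) does land in kilowatt territory:

    # Crude power-budget addition; all wattages are invented for illustration.
    gpu_count = 8          # pretend "quad-quad" means eight GPUs in one box
    watts_per_gpu = 100    # guessed per-GPU draw
    watts_cpu = 90
    watts_rest = 100       # board, RAM, drives, fans...

    total_watts = gpu_count * watts_per_gpu + watts_cpu + watts_rest
    print(total_watts)     # 990 -> you'd indeed be shopping for a ~1 kW PSU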


I've heard a lot about dual cores being incompatible with games and various applications, though it looks like that's improving a lot. The next upgrade I'm looking at is after AM2 comes out, when I expect Socket 939 mobo prices to come down. I'm thinking about the Abit KN8-32X bundled with the AMD 3800+ X2, or maybe even the 4200+ X2 if my parents are feeling generous :P

Not compatible? Like they won't play properly, or do you mean that they just don't take advantage of both cores? I hadn't heard of dual-cores being incompatible with any games, but then I really don't game all that much...

I wonder if they could get that motherboard to cram the Quad SLI 7900GTX in there. Imagine: quad-quad 7900GTX :P You'd need a kilowatt power supply, but still... playing Oblivion at 1600x1200, maxed out, at 40 FPS OUTSIDE??? ^_^


Ooooo! That got me all hot and bothered! Restroom break...

Oblivion @ 1600 x 1200 --> uberhump.gif <-- Me


offtopic.png


The first part wasn't. :D


Agreed Lokoike, it wasn't....... and who made MrT the Topic Policeman??.... erm.png

Cheers :wish:


Do you mean only mods can use this offtopic.png ? :P I'm not trying to, anyway...



Nope, anyone's free to use any of the icons on this forum; otherwise Tarun wouldn't have made them available.

I was just commenting on your rather verbose use of a single icon without bothering to back it up with any form of constructive criticism about why it was 'off topic', which, as Loko pointed out, it technically wasn't.

And even if it was 'off topic', that's why we have mods here; they'll let us know quickly enough if it is.

Not trying to put the aggro on you here, just pointing out that using a single icon on its own (in this case a 'strong' one) can make it seem like you're (generic usage) above the other person, and it can demean their post.

Cheers :D


If you noticed, Lokoike indicated...

offtopic.png


The first part wasn't. :wish:


Of course I believe that everyone can use any of the smileys provided... if not, they would not be available... like you said...

My offtopic.png doesn't really mean off topic, and I wasn't trying to be offensive, in this case to Lokoike... I was joking around with Lokoike regarding his post (the second part)...

If my post really is offensive, then I apologize...


I was joking around with Lokoike regarding his post (the second part)...

If my post really is offensive, then I apologize...


Sorry about the confusion. The other guys can tell you I like to mess around, but I don't mean anything by it. And no need to worry, MrT, you didn't offend me, just surprised me. But since I was joking, and since you were joking... s'all good in da hood. :D

Oh, and my personal little rule of thumb is: if I'm joking around, I usually follow up my post or comment with a lil' tongue dude. That way I can say things like "ZAPHIRER SMELLS LIKE BAD CHEESE!!!", and as long as I end it with a lil' tongue dude, he won't care. So if you put something like offtopic.png;P instead of just offtopic.png , people will see that you are speaking in jest.

:D zaphirer :P

:wish: <-- zaphirer :D

:D zaphirer :P

See? Isn't it great?!


I agree with both of you on this, as it's extremely hard to convey 'tone' on a written forum.

The only real way to inject any form of tone is to follow Loko's examples; that way people can see from the ending emoticon what type of tone is intended.

@MrT - I guess I fell into the same trap by not using the appropriate smiley when I made the comment about you being the 'Off Topic Policeman', but thought I had it covered with my closing "Cheers :P " as I was trying to convey a messing around tone.

You can tell the tone of my posts by the ending: if you see "Cheers :wish: ", I'm usually messing with you. If you don't see it, you've made it onto my 's***' list :D

Cheers :D


I get what you mean... Lokoike...

;P:P

You can tell the tone of my posts by the ending: if you see "Cheers :D ", I'm usually messing with you. If you don't see it, you've made it onto my 's***' list :D

Cheers :wish:


and you too...

:P:D

Cheers :D


Senuty, this box I'm on has an 8 MB SiS card; I would love to have your "p***y" 32 MB card. I'm sure it wouldn't fit, though. :(

:D


:wish:

What sort of graphics are you running, ASCII?.... ;P

Heh, guess I don't feel too bad now... but still, an upgrade is in the wind for me; now I just have to convince the finance manager that I REALLY do need it. ::sigh:: Another uphill battle coming up :P

Cheers :D


That way I can say things like "ZAPHIRER SMELLS LIKE BAD CHEESE!!!", and as long as I end it with a lil' tongue dude, he won't care. So if you put something like offtopic.png;P instead of just offtopic.png , people will see that you are speaking in jest.

:D zaphirer :P

:P <-- zaphirer  :P

:P zaphirer  :P

See?  Isn't it great?!


Ha. You're funny. :D

I won't care even if you manage to forget the little tongue dude =D

Cheese is good.

On topic...

By the way, I have a GeForce FX5200... it's just barely enough to get you through most modern games (everything on low, maybe a few modifications). I can play Oblivion, though it's absolutely horrible outside (check out the screenshot I took in XFire). The best I can get is 17 fps outside, with the sky turned off (hey, that gains me 2 fps!). Inside, though, it's completely smooth, 40 fps FTW!


By the way, I have a GeForce FX5200... it's just barely enough to get you through most modern games (everything on low, maybe a few modifications). I can play Oblivion, though it's absolutely horrible outside (check out the screenshot I took in XFire). The best I can get is 17 fps outside, with the sky turned off (hey, that gains me 2 fps!). Inside, though, it's completely smooth, 40 fps FTW!


Many of my friends who own the FX5200 are complaining about it... I can't believe it is really that bad~ :P

