posted
I use 1280*1024. My monitor is 17". I'm rather fond of Corona Extra.
And after some fiddling I can indeed do 1024*768 with 85Hz. And I can do 1152*864 with 75Hz. But I've gotten rather attached to 1280*1024, so I guess I'm stuck with 60Hz.
Actually, after running at 1152*864 at 75Hz for just a couple of minutes I can't stand 60Hz at 1280*1024...
-------------------- I haul cardboard and cardboard accessories
Registered: Mar 1999
| IP: Logged
posted
Them's the breaks. I don't like 1152; it makes for a disproportionately wide screen. Corona? **** **** ****! ***** **** ******! Although with a twist of lime in it (which I've seen somewhere is a custom with Corona, for some reason) and a big fat tortilla, I guess it works, summer guy.
Registered: Aug 1999
| IP: Logged
posted
Generally, the higher the refresh rate, the less likely you are to suffer from sore eyes and headaches.
Even if you don't think there's a difference, the flicker does tire out the eyes. Do what, er, thingie suggested (I can't be arsed going back a page to check): Run at a higher refresh rate for a couple of days, and then drop back to 60. You'll notice the difference then.
It's like the resolution thing. People say that 800 x 600 is fine, and that going up a size makes everything too small. But force them to run the higher resolution for a couple of days, then get them to go back, and they'll complain that everything is now too big.
People suck.
-------------------- Yes, you're despicable, and... and picable... and... and you're definitely, definitely despicable. How a person can get so despicable in one lifetime is beyond me. It isn't as though I haven't met a lot of people. Goodness knows it isn't that. It isn't just that... it isn't... it's... it's despicable.
Registered: Mar 1999
| IP: Logged
quote:Originally posted by Nim: TSN, you have to go backwards to see change. If you're used to 60 Hz you don't think about the flicker.
Or it could be like mine, where I would set my refresh rate to 100Hz, but for some reason the monitor would stay stuck at 85Hz. I had to fiddle around in the Radeon tab of the display settings to force it up. It annoyed me.
-------------------- Yes, you're despicable, and... and picable... and... and you're definitely, definitely despicable. How a person can get so despicable in one lifetime is beyond me. It isn't as though I haven't met a lot of people. Goodness knows it isn't that. It isn't just that... it isn't... it's... it's despicable.
Registered: Mar 1999
| IP: Logged
quote:Originally posted by Charles Capps: People that don't want to gouge their eyes out with spoons when they see a low refresh rate baffle me.
quote:Originally posted by Cartman: That's the only thing that still has me clinging to my 19" CRT for dear life. Not their meagre color vibrancy or long crystal response delays or dead pixels, but their bloody fixed resolutions. I don't want to sacrifice too much screen real estate, so I won't go for anything smaller than a 17", but since 1280x1024 (the native resolution of a 17" LCD) already makes me squint, I don't even want to THINK what the 1600x1200 of a 19" will do to me' eyeballs.
See, CC has exactly the same attitude re: refresh rates as I do re: CRTs. Admittedly, it took a four-month work term sitting in front of high-quality LCDs to be converted from the whole gamer mentality of "LCDs suck because they blur a lot. I know because I saw my parents' old passive-matrix LCD". But when you're staring at an LCD for a seven-hour work day, you really start to notice how much your eyes hurt going back from essentially no flicker at all to even a 100Hz CRT.
The cost issue admittedly has more merit, but consider this: a monitor is usually the one component of a computer that is most resistant to obsolescence, more so even than the case (BTX is coming) or the power supply (try using a PII-era power supply with an Athlon). It puzzles me that people will shell out $600 for the newest video card and a similar amount for the CPU when both lose more than half their value in a year. 'Course, they usually then go out and buy the cheapest monitor they can find and $20 speakers *shrug*
Cartman: You're doing some weird math. For lack of a better term, pixel density gets less dense as size increases at a given resolution, not vice versa. E.g. a 17inch LCD at 1280x1024 is "denser" than a 19inch at the same 1280x1024, and a 20inch at 1600x1200 works out to nearly the same density as that 17inch. Therefore, you should squint more on the smaller one, not the bigger one. Also, 19inch screens are usually optimal at 1280x1024 (same as 17inch).
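A quick sanity check of that math (a Python back-of-the-envelope; it treats the quoted sizes as exact diagonals, which real panels only approximate):
code:
import math

def ppi(diagonal_inches, width_px, height_px):
    # pixels along the diagonal divided by the diagonal's length
    return math.hypot(width_px, height_px) / diagonal_inches

for name, diag, w, h in [
    ("17in @ 1280x1024", 17.0, 1280, 1024),
    ("19in @ 1280x1024", 19.0, 1280, 1024),
    ("20in @ 1600x1200", 20.0, 1600, 1200),
]:
    density = ppi(diag, w, h)
    # 25.4 mm per inch -> pixel pitch in mm
    print(f"{name}: {density:5.1f} ppi, pitch {25.4 / density:.3f} mm")

That gives roughly 96 ppi for the 17inch and 86 ppi for the 19inch at 1280x1024, and 100 ppi for the 20inch at 1600x1200 (pitches of about .26, .29 and .25 mm).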
If you think about it, it makes sense. That's the only reason companies would use 2x4 arrays of 20" LCD monitors rather than a smaller number of, say, 30"s. It's because they need the resolution, not the cost. (Counter-intuitively, it probably costs more to buy multiple smaller screens and custom-mount them in 2D arrays than to just use a smaller number of off-the-shelf larger screens.)
Resolution: How often do you change resolution when doing actual work anyway? As far as games go, I never see the need to run games at a different resolution than normal; if you have the extra horsepower, it's usually better to up fun things like FSAA than to just up the resolution.
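Rough cost math on the FSAA point (a sketch assuming brute-force supersampling; the multisampling AA on newer cards is cheaper than this):
code:
base_w, base_h = 1024, 768
samples = 4  # 4x FSAA

print(f"1024x768 with 4x supersampling: {base_w * base_h * samples:,} samples")
print(f"2048x1536 with no AA:           {2048 * 1536:,} pixels")

Both lines print 3,145,728: the same raw fill cost, except the supersampled picture keeps your text and UI at a readable size.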
Registered: Mar 1999
| IP: Logged
posted
But it's normally the other way around for most people... they run their games at a lower resolution than their desktop, because they don't have the horsepower to do otherwise. I have a 2800+ XP and a Radeon 9600, and I tend to run games at 1024 x 768, whereas my desktop resolution is one step up from that.
-------------------- Yes, you're despicable, and... and picable... and... and you're definitely, definitely despicable. How a person can get so despicable in one lifetime is beyond me. It isn't as though I haven't met a lot of people. Goodness knows it isn't that. It isn't just that... it isn't... it's... it's despicable.
Registered: Mar 1999
| IP: Logged
Cartman
just made by the Presbyterian Church
Member # 256
posted
Hmm, you're right, an average 20" LCD monitor has a pixel pitch of .25-something mm (versus the nearly identical .26-something of a 17"), so a 20-inch at 1600x1200 would be practically no denser or more straining to look at than a 17-inch at 1280x1024... but at 1600x1200 I still couldn't read anything without wearing magnifying goggles. Ngh.
Registered: Nov 1999
| IP: Logged
posted
What if you have a 15.4" WXGA screen on your laptop and want a higher refresh rate from an ATI Radeon 9600 running on 64MB of video RAM?
-------------------- "It speaks to some basic human needs: that there is a tomorrow, it's not all going to be over with a big splash and a bomb, that the human race is improving, that we have things to be proud of as humans." -Gene Roddenberry about Star Trek
Registered: May 1999
| IP: Logged
posted
Cartman: "Also, 19inch screens are usually optimal at 1280x1024 (same as 17inch)"
No, a 17" is not native in 1280, that's it's max-res. The max-res is never the native res for any monitor because the refresh rate sucks and the pixels are too dense.
My 20" has 1280 as native res, a 17" would do best in 1024.
I play games in 1024 nowadays, but only because the hardware allows it.
Michael T: If you have a Radeon 9600 you have excellent chances of getting high refresh rates, as far as the video card goes. I would recommend 800x600, the most comfortable res for a 15".
What model and year is your laptop, Michael? If it's from 2000-2002 or later, I think 100 Hz might be doable, which is as much refresh rate as anyone will ever need.
Seriously, people, if you can just get 85 Hz you are in the clear, anything higher is also good but not THAT necessary.
I once played "Space Quest III: Pirates of Pestulon" on my old Amiga 500 back in '92 for a whole day, on a 14" monitor whose refresh rate I'm not sure was even worth counting. I got such a headache I hallucinated and pulled my hair for hours to alleviate the pain.
I've had enough run-ins with Mr. Flicker to know what's good for me.
Registered: Aug 1999
| IP: Logged
Cartman
just made by the Presbyterian Church
Member # 256
posted
"What if you have a 15.4 WXGA screen on your laptop and want a higher refresh rate from a ATI Radeon 9600 running on 64MB of video ram?"
Laptop displays are LCD-based, so they don't have a "refresh rate" in the traditional (CRT) sense. If I understand you correctly, you want to increase the screen's responsiveness so it "ghosts" less, right? That isn't possible with LCDs, I'm afraid. Ghosting occurs because the crystal elements that make up the screen can't always reshape themselves quickly enough to keep pace with the signals fed to them by the graphics card. It's sucky, but unavoidable.
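To put numbers on the ghosting, compare how long the crystals need against how long a frame lasts (a sketch; the 25 ms figure is an assumed typical full-transition spec for panels of this era, not a measured value):
code:
response_ms = 25.0  # assumed typical rise+fall response time, ~2004 panels

for refresh_hz in (60, 75, 85):
    frame_ms = 1000.0 / refresh_hz
    verdict = "ghosts" if response_ms > frame_ms else "keeps up"
    print(f"{refresh_hz} Hz: {frame_ms:.1f} ms per frame vs {response_ms:.0f} ms response -> {verdict}")

At 60 Hz a frame lasts about 16.7 ms, so a 25 ms crystal is still mid-transition when the next frame arrives; that's the smear.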
Registered: Nov 1999
| IP: Logged
"It's like the resolution thing. People say that 800 x 600 is fine, and that going up a size makes everything too small. But force them to do it for a couple of days, and then get them to go back, and they'll complain that everything is now too small."
I tried going higher than 1024x768 on my 17" monitor for a while some time back. I had to change it back. All the squinting just to be able to read things was getting on my nerves.
Registered: Mar 1999
| IP: Logged
posted
And that was just as well; a 17" works best at 1024, like my 20" works best at 1280.
That's what's so funny: if you only change resolution when you get a bigger monitor, the image itself never changes size, just the borders. So the density of my screen is just the same as that of a 17" at 1024.
I don't exactly know the ratio, but this line is two inches on my screen:
quote:Originally posted by Nim: Cartman: "Also, 19inch screens are usually optimal at 1280x1024 (same as 17inch)"
No, a 17" is not native in 1280, that's it's max-res. The max-res is never the native res for any monitor because the refresh rate sucks and the pixels are too dense.
You're either confused or just plain incorrect. Read this as a start.
First, I posted that, not Cartman. Secondly, for the vast majority of LCD panels, and I'm almost willing to bet all TFT panels, the max-res IS the same as the native resolution. This is because an LCD physically has only as many pixels as its native resolution specifies.
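A minimal sketch of what that means for display modes (the panel size and messages here are hypothetical, not any particular driver's behaviour):
code:
NATIVE_W, NATIVE_H = 1280, 1024  # the panel's one and only physical pixel grid

def describe_mode(w, h):
    if (w, h) == (NATIVE_W, NATIVE_H):
        return "1:1 pixel mapping (sharp)"
    if w > NATIVE_W or h > NATIVE_H:
        return "not displayable: no extra pixels exist to light up"
    return "stretched/interpolated onto the native grid (soft)"

for mode in [(1280, 1024), (1024, 768), (1600, 1200)]:
    print(mode, "->", describe_mode(*mode))

So the highest mode the panel can genuinely display IS its native one, and anything lower has to be scaled, which is why an LCD only looks truly right at native.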
quote:Originally posted by PsyLiam: But it's normally the other way around for most people... they run their games at a lower resolution than their desktop, because they don't have the horsepower to do otherwise. I have a 2800+ XP and a Radeon 9600, and I tend to run games at 1024 x 768, whereas my desktop resolution is one step up from that.
That's weird; until the beginning of this month I was running a 2100+ XP with an ATI 9500 Pro, and I had no trouble running games at native resolution. After Best Buy started dumping 9800 Pros for 300 CDN (!), I'm actually maxing out the FSAA and AA levels there are to set, in KOTOR at least.
You make a decent point though. If you don't have a decent enough card to run games at the native resolution of an LCD with some room to spare, for the love of God don't buy an LCD. Of course, that usually maps directly to "for the love of God, don't play new games, period" anyway. If you can only play Counter-Strike at 1024x768 before your screen starts smoking, it's a safe bet you won't be playing Doom III at any resolution.
However, if you're STILL playing games as old as Counter-Strike, you could probably get by with something as old as a Radeon 7000 and still have no problem running at native.
quote:Originally posted by TSN: I tried going higher than 1024x768 on my 17" monitor for a while some time back. I had to change it back. All the squinting just to be able to read things was getting on my nerves.
That's normal; I only ran my old 17" CRT at 1152x864 because everything WAS too small any higher. The only reason you can go higher on a 17" LCD is that you essentially get an extra inch of viewable area, thanks to the way CRT sizes are measured.
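Numbers for that last bit (a sketch; the 16.0 inches of viewable area on a "17 inch" CRT is a typical figure, not a universal spec):
code:
import math

pixels_on_diagonal = math.hypot(1280, 1024)  # ~1639 px corner to corner

print(f"17in CRT, ~16.0in viewable: {pixels_on_diagonal / 16.0:.1f} ppi")
print(f"17in LCD,  17.0in viewable: {pixels_on_diagonal / 17.0:.1f} ppi")

Same 1280x1024 mode, but about 102 ppi on the CRT versus 96 on the LCD; that extra viewable inch is what makes the mode bearable.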
Registered: Mar 1999
| IP: Logged
posted
I run at 1280x1024 on a 17", tried to go back to 1024x768 and couldn't. For one thing, I would have had only enough space for three windows at the bottom of the screen, which isn't enough even for baseline stuff for me. And for another, I do a lot of graphics work (I'm a Computer Art major) so I basically have to keep it high-res. And thirdly, you mean you guys like seeing that much aliasing on your screen at 1024x768?
-------------------- Fell deeds await. Now for Wrath... Now for Ruin... and a Red Dawn... -Theoden, TTT
Lord Vorkosigan does not always get what he wants!
Registered: Feb 2004
| IP: Logged
quote:Originally posted by PsyLiam: I have a 2800+ XP and a Radeon 9600, and I tend to run games at 1024 x 768, whereas my desktop resolution is one step up from that.
That's weird; until the beginning of this month I was running a 2100+ XP with an ATI 9500 Pro, and I had no trouble running games at native resolution. After Best Buy started dumping 9800 Pros for 300 CDN (!), I'm actually maxing out the FSAA and AA levels there are to set, in KOTOR at least.
Granted, I could probably go up a resolution. The newest game I play is Freelancer anyway. But any sort of slowdown, or a drop below 25fps, annoys the hell out of me. And I'd much rather have it at 1024 x 768 with all the detail turned up than go up a resolution and have to turn some of it down.
And, for games, I'd say that a 9800 with a 2100+ probably beats a 9600 with a 2800+. Processors are stupidly fast nowadays anyway, and if you have at least a 2GHz-class chip, then your GPU is far more important than your CPU.
Besides, it's not even a 9600 Pro. I only get 6600 in 3DMark 2001. Which is still an improvement on my previous components, which were a 450MHz PIII and a GeForce2 MX. I'm happy, and I can save the stupidly fast graphics card for when I have lotsamoney, and when I want to play Half-Life 2.
-------------------- Yes, you're despicable, and... and picable... and... and you're definitely, definitely despicable. How a person can get so despicable in one lifetime is beyond me. It isn't as though I haven't met a lot of people. Goodness knows it isn't that. It isn't just that... it isn't... it's... it's despicable.
Registered: Mar 1999
| IP: Logged