I saw the exact same problem with both a 9100 and a 9600 Pro. Maybe a 9800 would fix it, but that's a pretty expensive way to find out. Maybe I could borrow one from a friend....


Don't forget that Sony have a 23" model:

http://www.dabs.com/uk/channels/hardware/m...?quicklinx=28PQ

Although gamers will not be satisfied with the 40ms response time, it suits my needs for development.

BTW, 1920 x 1200 is beyond the DVI single-link spec of 1600 x 1200, so you will need a dual-link graphics card and a dual-link cable. I have an MSI GeForce 5900 and it works a treat.


Actually, the bandwidth required to drive a monitor at 1920x1200 is not beyond the DVI spec. But in order to do it, the video card has to reduce the blanking interval to a shorter, non-standard period.
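To put rough numbers on that, here's a minimal sketch; the frame totals below are approximate CVT and reduced-blanking figures I'm assuming for illustration, not values quoted in this thread:

```python
# Rough sketch: why 1920x1200 @ 60 Hz only fits a single DVI link with
# reduced blanking. Timing totals are approximate CVT / CVT-RB values.

SINGLE_LINK_LIMIT_MHZ = 165  # max TMDS pixel clock per DVI link

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (incl. blanking) x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# Conventional (CVT) blanking: roughly 2592 x 1245 total pixels per frame
standard = pixel_clock_mhz(2592, 1245, 60)   # ~193.6 MHz -> needs dual link

# Reduced blanking (CVT-RB style): roughly 2080 x 1235 total pixels
reduced = pixel_clock_mhz(2080, 1235, 60)    # ~154.1 MHz -> fits single link

for name, clock in [("standard blanking", standard), ("reduced blanking", reduced)]:
    fits = "fits" if clock <= SINGLE_LINK_LIMIT_MHZ else "exceeds"
    print(f"{name}: {clock:.1f} MHz ({fits} the {SINGLE_LINK_LIMIT_MHZ} MHz single-link limit)")
```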

Joo

Don't forget that Sony have a 23" model: http://www.dabs.com/uk/channels/hardware/m...?quicklinx=28PQ

I looked at Sony and Apple's 23" LCDs. Sony's was poorly constructed. I might even go so far as to call it "cheap" looking. I seem to recall something about the cable hookups in back being really lame.... Apple's was, well, an Apple.... It's great if you've got a Mac, but on a PC it looks like a fish out of water. I found numerous bad pixels both on the Sony I surveyed at a local Circuit City and on the Apple unit they had at my local CompUSA. My Samsung has zero bad pixels. Maybe I should try a new DVI cable. Can anyone point me to a high quality "Dual Link" DVI cable?

Can anyone point me to a high quality "Dual Link" DVI cable?

IIRC, it's just 2 DVI cables running in parallel. The tricky part is having a video card with dual-DVI that supports "dual-link" or whatever they're calling it.


ddrueding,

IIRC, it's just 2 DVI cables running in parallel. The tricky part is having a video card with dual-DVI that supports "dual-link" or whatever they're calling it.

Exactly. The cards that do support such links are usually pricey workstation cards. I would look at Matrox first to see if their cheaper cards (the P650, perhaps) support the option before you go chasing something like this.

I think the term to search for is "Dual Link TMDS", by the way.

Do well.

Jonathan Guilbault.


I have had the Sony since June last year and wouldn't consider it to be poorly constructed; there's an internal metal chassis and the connectors seem robust enough.

I have a Lindy dual-link cable (http://www.lindy.com/); it's nice and sturdy.

As for graphics cards, I have always been a Matrox fan (I upgraded from the Millennium to the G200, to the G400, to the G550), but my quest for a dual-link capable card caused me to defect to Nvidia. Matrox just don't have the economies of scale anymore to keep up with the big boys. At the time, the Parhelia would not go over 1600 x 1200 on DVI, and I wanted to stay within the spec to avoid problems (I did not want to reduce the blanking interval or the refresh rate).

I initially had concerns about going with a 'generic Taiwanese' manufacturer such as MSI, especially after getting used to the solid support and drivers that Matrox provide. This proved to be misguided: I purchased an FX5900-128, dual-link DVI worked straight away, and the picture quality blew me away.

The only problem I had was the occasional message 'the graphics card is not receiving sufficient power'; it turned out my Molex connector had slight surface corrosion and needed a clean.


I have a 243T now and I had a 240T before that.

I did have to run the 240 at 52 Hz to get a digital signal at 1920x1200.

The 243T handles the same resolution at 60 Hz; however, my aging Radeon 9000 Pro refuses to do any higher than 59 Hz at that resolution. I use PowerStrip to get that refresh rate.
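If anyone wants to see roughly how much headroom dropping the refresh rate buys, here's a quick back-of-the-envelope sketch; the reduced-blanking frame total is an assumed figure, not something read from these cards:

```python
# Rough sketch: how the TMDS pixel clock scales with refresh rate at
# 1920x1200 with reduced blanking (~2080 x 1235 total pixels, an assumed
# CVT-RB-style figure). Lower clocks are easier on marginal DVI transmitters.

H_TOTAL, V_TOTAL = 2080, 1235  # assumed reduced-blanking frame size

for refresh_hz in (60, 59, 52):
    clock_mhz = H_TOTAL * V_TOTAL * refresh_hz / 1e6
    print(f"{refresh_hz} Hz -> ~{clock_mhz:.0f} MHz pixel clock")

# Roughly: 60 Hz ~154 MHz, 59 Hz ~152 MHz, 52 Hz ~134 MHz -- each step down
# gives an older transmitter a bit more headroom under the 165 MHz limit.
```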

Personally I find the 243T to be brighter, and it has richer colours than the 240T. Sometimes maybe even too rich.

If you are running high-res DVI, it is very important to use either a good-quality cable or a cheaper but very short one. 2 m is fine for el cheapo cables; anything longer than that requires a better cable. The 243T seems to be less sensitive to poor cabling.

The graphics card is also very important for high-res DVI. Most current-generation ATI cards do fine; most Nvidia cards don't do so well at 1920x1200.

Anyway, DVI is a MUST on the 240T and 243T. I would not use them without it.

BTW, the 240T is rated at a 25 ms response time but the 243T is rated at 30 ms.

I personally don't use it for gaming (if you don't count chess as gaming :) ).

For video playback... well, it works. But like all LCDs it shows every flaw in an MPEG stream. It doesn't look good in my opinion.

Regarding gaming and the odd aspect ratio: the 240T will actually display 1600x1200 at 4:3 with black borders on the sides, so that's not a problem.

The 243T stretches the picture instead :(

Also, neither of them supports a 1920x1080 resolution.
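To spell out the border arithmetic, here's a tiny sketch assuming the panel centres lower resolutions 1:1 without scaling (as the 240T does for 1600x1200):

```python
# Quick sketch: size of the black bars when a lower resolution is centred
# 1:1 (unscaled) on a 1920x1200 panel.

PANEL_W, PANEL_H = 1920, 1200

def bars(src_w, src_h):
    """Return (side bar, top/bottom bar) in pixels for a centred 1:1 image."""
    return (PANEL_W - src_w) // 2, (PANEL_H - src_h) // 2

print(bars(1600, 1200))  # (160, 0) -> 160 px pillarbox bars on each side
print(bars(1920, 1080))  # (0, 60)  -> 60 px letterbox bars top and bottom,
                         #             if the panel accepted 1920x1080 input
```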

I'm a little surprised that the cable that came with it isn't good enough

I would suspect the cable is plenty good. It's from the manufacturer and, more importantly, it is digital. Analog cabling is infinitely more "quality" sensitive.

Also, neither of them supports a 1920x1080 resolution.

I haven't tried it on mine, since I don't have an HDTV source, but the native resolution exceeds 1920x1080, and it has handled every resolution I've ever thrown at it. For what it's worth, this link states it *does* support HDTV at 1920x1080.


I'm also a proud owner of a 240T. I've used both a VisionTek Ti4600 and, currently, a Sapphire 9800 Pro. I too notice the "sparkles" with images on black backgrounds with my current 9800, but I never had this issue with my 4600. I remember reading an article on Anandtech (IIRC) about the differences in DVI interface quality between different vendors' ATI and Nvidia cards. Some graded great while others were just "passable". I guess the "sparkles" can be expected, considering that 1920 x 1200 is pushing the DVI spec anyway.


I bought a Belkin dual-link DVI cable at CompUSA for $80 and it dramatically reduced the incidence of the "sparkles". They almost never occur now. I was considering returning the Belkin cable, however, and trying to find one of the "Monster" dual-link cables. They look cooler :)


I would avoid the 243T if you plan to use DVI.

This display has serious video dynamic range issues that are not present when using analogue input. Like anyone is going to spend $2k to use analogue input! :rolleyes:

When an image dims down (fades to black), serious artifacts occur. This does not show up on my 213T nor on my notebook's Sharp WUXGA display. I've tried several different cables, display adapters, etc., with the same result. I've also tried several samples of the 243T and they all exhibit the same behaviour.

The easiest way to see this is to run 3DMark03 and watch the beginning of the Troll Slayer section, which starts dark and fades up. You will see this horrible artifacting.

It's not video card related or software related whatsoever. This is also present in movies as well as images when edited in PS.

This is totally unacceptable for a display of this calibre and I am sending all of mine back for a refund.

:angry:

Cheers!

I would avoid the 243T if you plan to use DVI. ... This is totally unacceptable for a display of this calibre and I am sending all of mine back for a refund.

Get your refund and trade up to the new 16 ms Apple displays. The 30" especially is a sight to behold. :o

Get your refund and trade up to the new 16 ms Apple displays. The 30" especially is a sight to behold.

This is a good idea; however, the only card that's reasonably priced (the GF 6800U DDL-DVI) is MAC ONLY! Quadro 2000FX and higher will drive it natively, but they are too expensive.

The 23" model is a good option, however, and it's cheaper than the 243T.

Cheers!

Was the 240T you used at 1920x1200 in digital mode?

Yes. 52 Hz on an old Radeon.

I find the picture on mine to be excellent.

Me too. However, the price, the non-standard resolution, and the response time are decidedly non-excellent.

Interestingly, 1920x1080 is the real resolution of "1080i" HDTV.

Yes, interesting, but does it matter?

I was responding to duraid's comment regarding the image quality. I don't see anything non-standard about the resolution, other than that it is higher than any other panel on the market (excluding some weird, ridiculously priced thing from IBM I recall hearing about a while ago). That the resolution exceeds 1080p HDTV means that, for those who are interested, this panel is among the very few that can display the best-quality HDTV possible. If HDTV image quality matters to you, then yes, it does matter.

1080p isn't being broadcast, only 1080i. And apparently HD camera manufacturers such as Sony are filtering out at around 1400 lines of resolution. So really, there isn't any content to watch at 1080p (with a possible exception for the WM9 "HD" DVDs).


Apparently the HP L2335 is around $600 cheaper and has more flexible input options than the Samsung.

