Unsolved
6 Posts
0
June 19th, 2006 23:00
2407WFP, DVI, and 1080i
Does the A02 revision of the 2407WFP, or will any future revision, allow viewing a 1080i signal connected via the DVI port? In other words, will it ever be possible to use this monitor to view material from a D-VHS VCR, HD-DVD player, HD set-top box, or other non-computer device in a mode other than scaled-up 720p?
Thanks.
samnmax
25 Posts
0
June 20th, 2006 01:00
My personal feeling is that if this is primarily for the computer, you're reasonably covered. 720p is quite good, and quite frankly I don't know why the powers-that-be decided to support 1080i when nearly all devices that show such a signal are progressive.
If you want to hook up many non-computer devices to your display, you might want to consider something that's primarily a television. You'll get better HD support, and you can often hook up more devices at once. You'll find most computer monitors don't support HDCP and don't have any inputs beyond DVI and VGA. I'm disappointed in the lack of real 1080i, since I want to hook up an HD set-top box, but for me it's not enough of an issue.
TalonDancer
4 Posts
0
June 22nd, 2006 04:00
I waited for the release of the 2407WFP so that I could use a single wide-aspect-ratio LCD display as both a computer monitor (PowerMac G4 - VGA) and an HDTV (DISH Network ViP622 satellite receiver - DVI-D w/ HDCP). I'm pretty bummed that the 2407 does not appear to support 1080i via DVI-D :(
Talon Dancer
gervaismfr
1 Message
0
June 22nd, 2006 12:00
I have the same problem: I previously owned a 2405, which unfortunately I sold, and it could display 1920x1080i at 50 Hz or 60 Hz without any problem.
I use a set-top box (NETGEM 7600 HD) to receive HD test broadcasts on French terrestrial DVB HD. It's a pity that a brand-new screen is inferior to the previous one.
One hope would be that my Netgem could output 1080p: that choice was not available in its on-screen menu, and I don't know whether it is supported or whether a firmware upgrade could add it.
Regards
samnmax
25 Posts
0
June 22nd, 2006 17:00
eidospsogos
9 Posts
0
July 3rd, 2006 02:00
samnmax
25 Posts
0
July 3rd, 2006 04:00
You mention using 'decoders' to handle the resolution. I imagine playing HD-DVD and Blu-ray through the computer probably won't be a problem, as it can handle deinterlacing. However, if you want to hook up a device that doesn't support outputting 1080p, your only options are either playing at a lower resolution (usually 720p) or using a hardware deinterlacer. Any such device is going to be quite expensive and hard to find, and I doubt there are *any* currently available that support HDCP.
samnmax
25 Posts
0
July 3rd, 2006 07:00
I think it's okay to buy a lower-resolution HD television, assuming you mean 720p. I don't think there is going to be a perceptible difference when viewing from a distance. What I don't understand is why people buy 'EDTVs', or TVs that only support 480p. If you are buying a 42" plasma, it's really stupid to skimp and not get a real HD TV. You'd be a lot better off either getting a smaller TV or switching to a cheaper technology (like DLP) that supports HD.
I find the missing features in this monitor somewhat annoying. They could have easily supported 1080i, but decided it wasn't important enough to spend the extra $10 a unit or whatever it would have cost to support that.
eidospsogos
9 Posts
0
July 3rd, 2006 07:00
What annoys me more than anything about the whole HD mass chaos that is ensuing is this: you pay thousands of dollars for a display that can accept both interlaced and progressive signals, but without the resolution to fully display them. And yet monitors that have had the resolution for years are handicapped in various other ways. It's almost as if the industry is purposefully trying to confuse people with its claims, just to make money, and couldn't care less whether the consumer is happy. It's like they're profiting from the chaos they created by setting such a ridiculous range of standards and then making products that never completely fulfill all of them.
For instance, I am sure many people bought the 2407 for the same reason I did: it claimed HDCP support (I was also looking for a bigger monitor, mind you, but the HDCP was a selling point). But at the same time, having a DVI port with HDCP means a lot less when it only supports the lesser of the two HD resolutions currently available. It is quite frustrating.
Not to mention all the video cards claiming HDCP support, bought specifically for that reason, that never actually had full HDCP support. Anyway, it's beginning to become enough to make me go buy an old tube TV (which had a superior picture anyway) and put a set of rabbit ears on it.
TalonDancer
4 Posts
0
July 3rd, 2006 11:00
* DVI-D -> produces an error message saying that the monitor is not capable of 1080i, and the 622 automatically reverts to 720p.
* Component -> works fine at either 720p or 1080i.
So I can use Component for HDTV and either VGA or DVI-D for the Mac. But some time in the future, Dish will probably turn on HDCP and I will have to drop back to 720p for any HDCP-protected content. Note that the Dish ViP622 HD/DVR receiver does not support 1080p.
Talon Dancer
gbrauer
6 Posts
0
July 3rd, 2006 16:00
I was referring to. This simple feature omission makes the 2407wfp not nearly as good a value as it could be for those who have HD equipment.
samnmax
25 Posts
0
July 3rd, 2006 17:00
So you know, when you use 1080i through component, you are actually losing half the lines, as the 2407 doesn't have a deinterlacer that can properly handle this resolution. You should therefore set your HD receiver to output 720p.
gbrauer
6 Posts
0
July 3rd, 2006 19:00
In theory, this is not true. Not having a de-interlacer would mean that you get visual artifacts when there is horizontal camera movement, but it should not reduce the number of lines displayed.
If the 2407wfp *does* have a simple de-interlacer that throws out half of the lines (better than no de-interlacer, IMO), you are still getting 1920 x 540 = 1,036,800 pixels, which is greater than the 1280 x 720 = 921,600 of 720p mode. So it would be preferable (from a resolution standpoint) to run at 1080i. In addition, the scaling that has to be done to convert any 1080i material to 720p will introduce a further softening of the image, resulting in a lower apparent resolution. The *only* reason you would want to have an HD source output at 720p is to avoid seeing interlacing artifacts. If the 2407wfp is in fact throwing out half of the lines on the component input, the image will still look better in 1080i mode.
Perhaps someone here who has one could create some test images and verify whether the 2407wfp does or does not have a simple deinterlacer that throws out half of the horizontal lines.
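For the curious, here is one way such a test image could be generated. This is only an illustrative sketch (it assumes Python with the Pillow imaging library, neither of which anyone in the thread mentions): an image of alternating one-pixel red and green scan lines. Shown over a 1080i connection, a deinterlacer that throws away every other line would collapse the pattern to a single solid color, while a proper deinterlacer would preserve the alternation.

```python
# Hypothetical test-pattern generator (not from the thread).
# Assumes Python with the Pillow library installed.
from PIL import Image

WIDTH, HEIGHT = 1920, 1080
img = Image.new("RGB", (WIDTH, HEIGHT))
pixels = img.load()

for y in range(HEIGHT):
    # even scan lines red, odd scan lines green
    color = (255, 0, 0) if y % 2 == 0 else (0, 255, 0)
    for x in range(WIDTH):
        pixels[x, y] = color

img.save("interlace_test.png")
```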
Again, these are reasons that the 2407wfp really isn't usable for high-quality HD viewing.
TalonDancer
4 Posts
0
July 3rd, 2006 20:00
I don't think it is quite that simple :( 720p has MORE LINES than 1080i improperly deinterlaced (i.e. 540), BUT it has FEWER PIXELS per line (1280 vs 1920). And those 1280 pixels may be interpolated to fit the 1920 width of the 2407WFP, depending on your settings.
Perhaps 720p viewed at 1:1, using roughly two-thirds of the 2407's screen, is better than 1080i done badly but at least filling the full width of the 2407. I'm looking for some obvious examples to test the difference. In this case a good video of known provenance may be worth thousands of words :)
Talon Dancer
samnmax
25 Posts
0
July 3rd, 2006 20:00
You can't show an interlaced signal on a progressive screen without deinterlacing. Technically, the 2407WFP does deinterlace 1080i through component. It just happens that its method of deinterlacing is one of the worst: it simply throws out half the lines. People have posted images on other forums comparing the Xbox 360 connected at 1080i and at 720p. It is clear that when using 1080i, there is visible pixelation.
The resolution of 1080i on the 2407 is worse than that of 720p. 1080i is normally 60 *fields* a second, which already halves each image to 1920x540. Dropping lines essentially drops a field, making it equivalent to 1920x540 at 30 frames a second. In each second, you are seeing 1920 x 540 x 30, roughly 31 million pixels. 720p, on the other hand, is 60 *frames* a second, so you are seeing 1280 x 720 x 60, roughly 55 million pixels per second.
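The rough arithmetic behind that comparison, written out as a small sketch (Python is used here purely for illustration; none of this comes from the thread):

```python
# Back-of-the-envelope pixel throughput, per the argument above.
# 1080i delivers 60 fields/s of 1920x540; if the monitor then drops every
# other field, you effectively see 1920x540 at 30 frames/s.
pixels_1080i_dropped = 1920 * 540 * 30   # 31,104,000 pixels per second
# 720p delivers 60 full frames/s of 1280x720.
pixels_720p = 1280 * 720 * 60            # 55,296,000 pixels per second

print(f"1080i with dropped fields: {pixels_1080i_dropped:,} px/s")
print(f"720p:                      {pixels_720p:,} px/s")
```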
This also doesn't take into account that it really looks bad when pixels are much taller than they are wide. Just having more pixels isn't helpful if they aren't relatively evenly spaced.
Also, while scaling the 1080 image to 720 could make it somewhat softer, particularly if a poor scaler is used, it is still likely to look better than going through the 2407WFP directly, due to its incredibly poor deinterlacer. Whatever device you hook up to the monitor will very likely have a better deinterlacer, and even with the necessary scaling 720p will look better. Hopefully, this won't be an issue once more devices support 1080p, though it may take some time for that to become widespread.
gbrauer
6 Posts
0
July 3rd, 2006 21:00
Ah, good point on the refresh rate. So my question is: is the 2407wfp's deinterlacer actually throwing out every other field, or is it showing every field but doubling the height of every pixel?
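For what it's worth, the difference between those two behaviors can be sketched in a few lines (illustrative only, written with Python and NumPy, which nobody in the thread actually used):

```python
import numpy as np

# A 1080i source arrives as alternating fields of 540 lines each.
# frame: stand-in for one 1920x1080 progressive image, for illustration.
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)
top_field = frame[0::2]      # even lines, 540x1920
bottom_field = frame[1::2]   # odd lines,  540x1920

# Behavior 1: throw out every other field entirely.
# Only the top field survives; half the vertical and temporal detail is gone,
# and the 540 remaining lines get scaled up to fill the panel.
dropped = top_field                                  # 540x1920

# Behavior 2: "bob" deinterlacing -- show every field, but double each line
# (each pixel becomes two lines tall) so each field fills the 1080-line panel.
bobbed_top = np.repeat(top_field, 2, axis=0)         # 1080x1920
bobbed_bottom = np.repeat(bottom_field, 2, axis=0)   # 1080x1920

print(dropped.shape, bobbed_top.shape, bobbed_bottom.shape)
```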
If the deinterlacer in the 2407wfp is really that bad, then I guess there is no point in holding out for support for 1080i via the DVI port. It sounds like the deinterlacer is more of a limiting factor in terms of image quality than the analog vs. digital connection issue.