Cable issue (PC to Philips LCD - text is blurry)

  • phAge 4 Mar 2013 22:41:18 24,371 posts
    Seen 2 hours ago
    Registered 11 years ago
    HELLO AGAIN.

    I've recently hooked up my PC to a 32" Philips LCD. The telly is 1080p and everything looks very spiffy indeed - except text of pretty much any sort, which is blurry, washed out and almost unreadable. Don't know why the issue specifically affects text, and enabling ClearType doesn't do anything to help.

    I've found several people online who've experienced the same problem with both Philips and Samsung monitors, and a couple of them have had good results using DVI cables instead.

    Before I go invest in new cables, however, I'd like to try a (much cheaper) HDMI to DVI converter (HDMI out from Radeon 6950) - but I'm not sure if simply using a converter will have any effect whatsoever, or if I need to go DVI to DVI in order for it to change anything?

    Will accept abuse for a solution.

    EDIT: In fact - why should changing from HDMI to DVI change anything? Isn't HDMI just DVI + audio?

    Edited by phAge at 22:46:23 04-03-2013
  • Roddles 4 Mar 2013 22:49:17 1,898 posts
    Seen 5 hours ago
    Registered 4 years ago
    Plug yourself into HDMI 1 and make sure the screen size on the TV is set to "Screen Fit" or something similar-sounding if you're using a Philips.

    Changing from HDMI to DVI won't matter - it's the same signal minus the audio, both in content and in how your TV will handle it.

    One other thing you can change is the colour setting to YCbCr444 instead of RGB.
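
    For what it's worth, a plausible reason the pixel format can matter for text: when a TV treats its input as video it may subsample the chroma channels, and that smears one-pixel colour detail such as ClearType's subpixel fringes. A rough Python sketch of the effect - illustrative only, using the standard BT.601 conversion, with a simple two-pixel average standing in for a real 4:2:2 pipeline:

        # Simulate 4:2:2 chroma subsampling: luma is kept per pixel, but the
        # colour (Cb/Cr) is averaged over each horizontal pixel pair, so
        # alternating red/blue pixels (like ClearType's subpixel fringes)
        # wash toward magenta. Conversion constants are BT.601.

        def rgb_to_ycbcr(r, g, b):
            y = 0.299 * r + 0.587 * g + 0.114 * b
            return y, 128 + 0.564 * (b - y), 128 + 0.713 * (r - y)

        def ycbcr_to_rgb(y, cb, cr):
            r = y + 1.403 * (cr - 128)
            g = y - 0.344 * (cb - 128) - 0.714 * (cr - 128)
            b = y + 1.773 * (cb - 128)
            return tuple(max(0, min(255, round(c))) for c in (r, g, b))

        row = [(255, 0, 0), (0, 0, 255)] * 4          # alternating red/blue pixels
        ycc = [rgb_to_ycbcr(*px) for px in row]

        out = []
        for (y0, cb0, cr0), (y1, cb1, cr1) in zip(ycc[::2], ycc[1::2]):
            cb, cr = (cb0 + cb1) / 2, (cr0 + cr1) / 2  # one chroma pair per two pixels
            out += [ycbcr_to_rgb(y0, cb, cr), ycbcr_to_rgb(y1, cb, cr)]

        print(out)  # every pixel comes back magenta-ish, not pure red or blue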
  • Whizzo 4 Mar 2013 22:53:21 43,131 posts
    Seen 1 hour ago
    Registered 13 years ago
    Are you running the TV at 1920x1080 resolution as a monitor, or have you added the HD TV 1080p resolution in the CCC and selected that? On my HD TV it certainly seems much happier set to an HD TV setting, even though the resolution is exactly the same.

    Also check whether you're scaling the image at all, which can potentially make text look a bit iffy as well.

    If you were using an LG telly I'd also make sure you've got the label on the HDMI socket set correctly on the inputs screen - it makes a big difference to how their TVs work with PCs, going by having to do this on a couple of dozen TVs at work! If Philips have something similar, check that.

    As for switching from HDMI to DVI, I can't see how that'll make much difference really, plus you'll miss out on having sound, which the AMD card does a decent job of.

    This space left intentionally blank.

  • Deleted user 4 March 2013 23:00:03
    Are you positive it's plugged in?
  • neilka 4 Mar 2013 23:52:01 15,956 posts
    Seen 20 minutes ago
    Registered 9 years ago
    Have you connected the red and white HDMI cables?

    A map is like comparing velocity and speed.

  • dominalien 4 Mar 2013 23:57:54 6,861 posts
    Seen 41 minutes ago
    Registered 8 years ago
    phAge wrote:
    EDIT: In fact - why should changing from HDMI to DVI change anything? Isn't HDMI just DVI + audio?
    Yes. A very simple DVI-to-HDMI cable for a pound or two will give you the same quality picture as a 100-pound rip-off. Unless it doesn't, in which case you'll see very visible artefacts on screen (personally I've experienced white or green dots all over the screen - you can't miss them).

    Make sure the TV is set to no overscan and try to set a "normal" image mode (not cinema or gaming). Make sure the PC's driver sets the correct resolution for your TV (1920x1080 in this case, I'm guessing), as any other res can and will result in degraded image quality.

    PSN: DonOsito

  • dominalien 4 Mar 2013 23:58:41 6,861 posts
    Seen 41 minutes ago
    Registered 8 years ago
    neilka wrote:
    Have you connected the red and white HDMI cables?
    I keep mixing those up. Is it the red from the red-green-blue set, or the red from the red-white set?

    PSN: DonOsito

  • Roddles 4 Mar 2013 23:59:31 1,898 posts
    Seen 5 hours ago
    Registered 4 years ago
    As mentioned above, changing the label of the HDMI input to DVI/PC or something similar changes the way your TV handles the signal. It's a well-known yet undocumented feature of Samsung tellies, at least.
  • Deleted user 5 March 2013 00:32:51
    @phAge

    The issue you are having might be as simple as mistakenly sending an interlaced signal to the monitor from the PC instead of progressive. But it is much more likely that the monitor doesn't correctly communicate its timing capabilities to the video card, resulting in the GPU picking a fallback mode rather than the optimal one, as is common with many smaller screens capable of native 1080p.

    As you will be well aware, the video card typically provides simple choices for configuring a resolution on the PC, such as:

    Horizontal pixels, vertical pixels
    Refresh rate (Hz)
    Colour depth
    Scan type: progressive/interlaced

    But along with this, it also holds other values for a complete description of the display mode timing, as listed below.

    Standard timing: automatic / GTF / DMT / CVT / CVT reduced blanking / manual

    For both horizontal and vertical you have:
    Active pixels
    Front porch (pixels)
    Sync width (pixels)
    Total pixels
    Polarity
    Refresh rate and the pixel clock frequency

    For the problem you are describing, trying the other standard timing methods will typically force it to pick optimal settings. However, on my 22" Sony TV I also needed to change everything manually in XP and Fedora (16 and earlier), after using a Linux console utility (that I can't remember) to detect the correct TV timings.

    Naturally, overriding these timing settings at random might damage your monitor, so check the back pages of the monitor manual for the available DVI modes and timing ranges - that should let you experiment within valid ranges and quickly find an optimal custom timing.
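
    To make the arithmetic behind those values concrete, here is a minimal Python sketch (the reduced-blanking figures for 1920x1080 are the published CVT ones; everything else is illustrative):

        # The pixel clock has to sweep the *total* raster (active + front
        # porch + sync + back porch) once per refresh, so:
        # clock = h_total * v_total * refresh.

        def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
            return h_total * v_total * refresh_hz / 1e6

        # CVT reduced-blanking 1920x1080:
        # horizontal: 1920 active + 48 front porch + 32 sync + 80 back porch = 2080 total
        # vertical:   1080 active +  3 front porch +  5 sync + 23 back porch = 1111 total
        print(pixel_clock_mhz(2080, 1111, 60))  # ~138.65 MHz

    (On Linux, "cvt -r 1920 1080" prints the same mode as a ready-made modeline, rounded to a 138.50 MHz clock, i.e. roughly 59.93 Hz actual refresh.)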
  • neilka 5 Mar 2013 00:36:52 15,956 posts
    Seen 20 minutes ago
    Registered 9 years ago
    #Hey everybody
    Vizzini's here
    To solve your troubles
    So have no fear!#

    A map is like comparing velocity and speed.

  • Dirtbox 5 Mar 2013 03:58:19 78,209 posts
    Seen 5 hours ago
    Registered 12 years ago
    Haha!

    What a load of irrelevant shite.

    +1 / Like / Tweet this post

  • mal 5 Mar 2013 04:32:44 22,569 posts
    Seen 18 minutes ago
    Registered 13 years ago
    I remember having to configure all that stuff back when we were still using VGA to hook up our screens...but these days?

    Cubby didn't know how to turn off sigs!

  • Deleted user 5 March 2013 06:51:15
    @mal

    Many smaller 1080p-native screens ship with bad or incomplete description information, so they still don't communicate their capabilities (over DVI/HDMI either). Newer operating systems (Windows 7/8, Fedora 17/18) get it right for most resolutions by using a hardware database to override known TVs/monitors with the correct settings. But I had this problem trying to use 1366x768@60 on a 22" screen on XP, Fedora 15 and Mac OS X within the last two years, and needed to change polarities and frequencies manually to get the image perfectly sharp.

    The problem is most likely by design, as TV manufacturers probably didn't want people using commodity HD TVs in place of expensive reference monitors. TV technology was changing so fast (contrast, colour gamut) that having a screen flatly report it didn't support a resolution would have been bad business - too obviously a software restriction. But having it pick the wrong settings and still show an image kept that other market afloat.
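
    For the curious, the "capabilities" being communicated here are the display's EDID block. A minimal sketch of decoding the preferred mode from a raw 128-byte EDID 1.3 blob, assuming Linux exposes it under /sys/class/drm (the connector name varies per machine):

        # Decode the first Detailed Timing Descriptor (bytes 54-71), which
        # normally describes the panel's preferred mode. A TV that ships a
        # wrong or missing descriptor here forces the GPU to guess.

        def preferred_mode(edid: bytes):
            assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID block"
            d = edid[54:72]
            clock_hz = (d[0] | d[1] << 8) * 10_000   # stored in 10 kHz units
            h_active = d[2] | (d[4] >> 4) << 8
            h_blank  = d[3] | (d[4] & 0x0F) << 8
            v_active = d[5] | (d[7] >> 4) << 8
            v_blank  = d[6] | (d[7] & 0x0F) << 8
            refresh = clock_hz / ((h_active + h_blank) * (v_active + v_blank))
            return h_active, v_active, round(refresh, 2)

        # Hypothetical usage (the path is machine-specific):
        # with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
        #     print(preferred_mode(f.read(128)))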
  • dominalien 5 Mar 2013 06:55:49 6,861 posts
    Seen 41 minutes ago
    Registered 8 years ago
    Yeah, that's what's going on. Whatever that is.

    /backs away slowly

    PSN: DonOsito

  • Dougs 5 Mar 2013 06:58:53 67,752 posts
    Seen 14 minutes ago
    Registered 11 years ago
    Roddles wrote:
    As mentioned above, changing the label of the HDMI input to DVI/PC or something similar changes the way your TV handles the signal. It's a well known yet undocumented feature of Samsung tellies at least.
    Yeah, this.
  • phAge 5 Mar 2013 08:45:41 24,371 posts
    Seen 2 hours ago
    Registered 11 years ago
    Cheers for the input, all.

    It's strange, because in "normal" mode (on the TV) text looks OK-ish (but still not great) yet everything takes on a slightly fuzzy look, whereas changing to "PC mode" (which requires a bit of tweaking in the over/underscan CCC panel) makes everything pin-sharp (as it should be) - yet makes text of any kind mega-blurry.

    I'll try to snap a couple of pictures of the screen when I get home to show the effect, but the best description I can give is that it looks like some sort of way-too-aggressive edge enhancement/anti-aliasing on text (of any size).

    Am I correct in my understanding that a simple HDMI-to-DVI adaptor attached to the end of the HDMI cable coming from the PC will have the same (possibly beneficial) effect as a ready-made HDMI-to-DVI cable? I'd hate to spend time and money re-cabling if it doesn't help...
  • Maturin 5 Mar 2013 08:53:39 3,006 posts
    Seen 2 hours ago
    Registered 5 years ago
    Could it be the Clear Type setting on your PC? If the pixels are arranged differently on your monitor to the TV screen it could be what looks good on one looks pish on the other?
  • Sharzam 5 Mar 2013 08:55:17 2,880 posts
    Seen 41 minutes ago
    Registered 6 years ago
    HDMI to DVI, or the other way around, should make no difference. Both carry the same standard and the same information.

    Similarly, as it is digital you will either get a picture or a total screw-up - no middle ground like with analogue VGA. I agree with the others: it sounds like a settings issue and not a hardware or technical fault.

    Edited by Sharzam at 08:56:40 05-03-2013

    Known as 'Sharzam' in 98.5% of games

  • Deleted user 5 March 2013 09:00:50
    Overscan on your TV. Turn it off - everything will look better after that.
  • dominalien 5 Mar 2013 09:10:52 6,861 posts
    Seen 41 minutes ago
    Registered 8 years ago
    @phAge

    An adaptor will be fine.

    PSN: DonOsito

  • phAge 5 Mar 2013 16:59:05 24,371 posts
    Seen 2 hours ago
    Registered 11 years ago
    Right - here are some pictures of the TV with and without PC mode activated, and finally a shot of my 1600 x 900 Samsung monitor (via VGA).

    TV - normal mode
    http://www.flickr.com/photos/92120303@N05/8530768829/in/photostream

    TV - PC mode
    http://www.flickr.com/photos/92120303@N05/8531878418/in/photostream/

    Monitor
    http://www.flickr.com/photos/92120303@N05/8531879556/in/photostream/

    As you can see, the TV shots look much, much worse than the monitor one - and I have no idea why. I've tried disabling overscan on the TV, but that doesn't do anything.

    Guess I'll just have to find a VGA cable and pray that works. Or squint a lot. :(

    Edited by phAge at 17:00:29 05-03-2013

    Edited by phAge at 17:05:23 05-03-2013
  • Roddles 5 Mar 2013 17:03:18 1,898 posts
    Seen 5 hours ago
    Registered 4 years ago
    Just to double-check, what's the precise model name/number of your Philips TV? I have a sneaking feeling it accepts 1080p but its native resolution is actually 1360x768, so it's downscaling the picture - which would explain the text in the proper PC mode photo above.

    It looks just like the picture I used to have on my old Samsung 1360x768 TV when feeding it a 1920x1080 signal in PC mode.
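
    A quick back-of-envelope sketch (Python, purely illustrative) of why that downscale murders text in particular:

        # 1920x1080 onto a 1366x768 panel: each panel pixel covers ~1.4
        # source pixels, so a one-pixel-wide glyph stem lands on fractional
        # pixel boundaries and gets blended with the background.

        src_w, src_h = 1920, 1080
        panel_w, panel_h = 1366, 768
        print(src_w / panel_w, src_h / panel_h)   # ~1.4056 and 1.40625

        # Where a 1-pixel-wide vertical stem at source column x lands on the panel:
        for x in (100, 101, 102):
            left, right = x * panel_w / src_w, (x + 1) * panel_w / src_w
            print(f"source column {x} -> panel columns {left:.2f}..{right:.2f}")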

    Edited by Roddles at 17:05:16 05-03-2013
  • Khanivor 5 Mar 2013 17:09:12 40,774 posts
    Seen 4 minutes ago
    Registered 13 years ago
    Oh come on, people!
  • Maturin 5 Mar 2013 17:14:43 3,006 posts
    Seen 2 hours ago
    Registered 5 years ago
    Some 1080p tellies (such as my Panny plasma) won't do 1080p on the PC input and will only do a lower resolution there. I have to use an HDMI cable to get it to do 1080p.
  • phAge 5 Mar 2013 17:18:49 24,371 posts
    Seen 2 hours ago
    Registered 11 years ago
    It's a 32PFL5404H. Just tried setting the res to 1368x768 from Windows (not CCC) and it still looks gash.
  • Roddles 5 Mar 2013 17:22:54 1,898 posts
    Seen 5 hours ago
    Registered 4 years ago
    Bingo. That's a 1366x768 television.

    Set your resolution to either 1360x768 or 1366x768.

    Edited by Roddles at 17:23:45 05-03-2013
  • Khanivor 5 Mar 2013 17:26:59 40,774 posts
    Seen 4 minutes ago
    Registered 13 years ago
    This forum is dead to me now.
  • Spectral 5 Mar 2013 17:31:17 4,991 posts
    Seen 6 hours ago
    Registered 11 years ago
    Also, it sounds stupid, but try turning the sharpness down. It often defaults too high.

    Edit: on the TV, not the PC

    Edited by Spectral at 17:32:00 05-03-2013
  • phAge 5 Mar 2013 17:32:51 24,371 posts
    Seen 2 hours ago
    Registered 11 years ago
    Hmm - the lower resolutions don't seem to look much better, so think I'll stick with the unsupported Full HD, even if that's a bit shit too. Still, thanks for the advice.

    Why does pretty much everything but text look good at 1920x1080, though? I've played quite a bit of BF3 and The Witcher 2 without noticing anything amiss - it was only when I started up Shogun 2, with all its itty-bitty text, that the problem became apparent.
  • nickthegun 5 Mar 2013 17:33:59 59,940 posts
    Seen 1 minute ago
    Registered 9 years ago
    phAge wrote:
    The telly is 1080p
    Roddles wrote:
    Bingo. That's a 1366x768 television.
    Christ on a bike, phage. Just give up on TVs and go and live in a cave somewhere.

    ---------------------------------------------------------
    someone say something funny
