Max Refresh Rate for 3440 x 1440 on MSI R7 240

Discussion in 'Hardware' started by Silverthunder, Dec 6, 2017.

  1. Silverthunder

    Silverthunder Sergeant

    I was searching the web but found some conflicting information on this question.

    1. I am not sure if the brand MSI plays into this or whether all of the brands have the same specs.
    2. I think there might be a 4GB version of this card (mine is 2 GB)

    I saw one post claiming it can support 4K at 60 Hz, but I am pretty sure that's inaccurate (unless they were talking about the 4 GB version).

    I did find this explanation, which I think would hold if my card is HDMI 1.4 (which I am guessing it is):
    It's because HDMI 1.4 tops out at 30 Hz at 3840×2160, not 4K at 60 Hz.

    In order to get to 60 Hz, you'd need to cut the pixel count roughly in half.

    4K = 3840×2160 ≈ 8.3 Mp

    So to get to 60 Hz, you'd need to drop down to about 4.15 Mp.

    3440×1440 ≈ 4.95 Mp. This is HIGHER than the 4.15 Mp that fits at 60 Hz, so your refresh rate is going to be under 60 Hz.

    Your maximum refresh rate is simple to compute:

    X = (60 Hz) × (4.15 Mp / 4.95 Mp) ≈ 50 Hz

    from
    https://hardforum.com/threads/minimum-video-card-for-3440x1440-resolution.1923926/#post-1042793256
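    For anyone who wants to check that arithmetic, here is a minimal Python sketch of the same estimate. It models HDMI 1.4's limit as a flat pixels-per-second budget derived from 4K @ 30 Hz, which is a rough assumption that ignores blanking intervals, color depth, and chroma subsampling:

        # Rough estimate of the max refresh rate a fixed pixel budget allows.
        # Simplifying assumption: HDMI 1.4's limit is treated as a flat
        # pixels-per-second budget taken from 4K @ 30 Hz; blanking intervals
        # and color format are ignored.

        PIXEL_BUDGET = 3840 * 2160 * 30  # ~248.8 Mp/s, from 4K @ 30 Hz

        def max_refresh_hz(width, height, budget=PIXEL_BUDGET):
            """Highest refresh rate the budget supports at this resolution."""
            return budget / (width * height)

        print(max_refresh_hz(3840, 2160))  # 30.0  -> 4K tops out at 30 Hz
        print(max_refresh_hz(3440, 1440))  # ~50.2 -> just over 50 Hz at 3440x1440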

    50 Hz is probably fine for desktop applications, right?
    With video subscription services, what would happen if I tried to watch a 4K movie?
     
  2. Digerati

    Digerati Major Geek Extraordinaire

    There are way too many variables and not enough specifics.

    The achievable refresh rate is determined by both the specific graphics card and the specific monitor. Other factors can also affect it, including the interface and cable used. If your monitor can only support 60 Hz but the card supports 75 Hz, you will be limited to 60 Hz.

    The old industry standard for a flicker-free display was 60 Hz. The new standard is 75 Hz, but that is primarily needed to ensure flicker-free "gaming" where extensive CGI (computer-generated imagery) is used.

    In your case, if you look at the specs for that specific card, it clearly shows:
    • DVI connector - Max resolution: 1920 x 1200 @ 60 Hz
    • D-Sub connector - Max resolution: 2048 x 1536 @ 60 Hz
    • HDMI connector - Max resolution: 3840 x 2160 @ 30 Hz, 4096 x 2160 @ 24 Hz
    Again, too many variables and not enough specifics. Understand that the refresh rate, and how it affects image quality, applies mostly to animated or moving images. If you are looking at a Word document or this web page, 50 Hz is way more than needed.

    Note that the industry standard for decades for "live action" video (with real people and objects moving in real time, not CGI) as seen in movie theaters is 24 frames per second (FPS). That is, it takes only 24 FPS for humans to see smooth movement. So if the refresh rate is the same or better, you will not see any flicker or jerky movements.
    Once again, too many variables and not enough specifics. Streaming is also impacted by the bandwidth of the streaming service provider and of your network/Internet service provider.
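
    As a rough illustration of that 24 FPS point, the Python sketch below divides a few refresh rates by the film frame rate. A whole-number result means every frame can be held for the same number of refresh cycles; a fractional result (as at 50 Hz or 60 Hz) means some frames are held slightly longer than others, which pulldown techniques normally smooth over:

        # How many refresh cycles each 24 FPS film frame spans at a given
        # refresh rate. A whole number means every frame is displayed for
        # equal time; a fraction means uneven frame durations.

        def cycles_per_frame(refresh_hz, fps=24.0):
            return refresh_hz / fps

        for hz in (24, 48, 50, 60, 72):
            print(f"{hz} Hz / 24 FPS = {cycles_per_frame(hz):.2f} cycles per frame")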
     
  3. Silverthunder

    Silverthunder Sergeant

    Does each video port dictate a certain max number of pixels and refresh rate? Obviously there are different tiers, where a port could "up" one side and "drop" the other. I mean port in the generic sense, not the port on video card A, B, C, or D.
    Phrased another way, can the video card manufacturers "turn the volume down" on what they are pushing? I am pretty sure they can for Hz.
     
  4. Digerati

    Digerati Major Geek Extraordinaire

    The number of pixels is determined by the resolutions supported. Refresh rate is different; it indicates how many times per second each pixel is told to do something.
    I suppose they could, but I see no reason why they would. The GPU is most likely the determining factor, and the GPU is not made by the card maker.

    MSI, for example, bought the GPU for your card from AMD, then built the card around that processor.
     
  5. Silverthunder

    Silverthunder Sergeant

    Say you have a card that supports 1440p.
    There are two resolutions that could be tied to: the ultrawide and the 16:9. Is 1440p just a vague term that can really refer to more than one "set" of pixel data?
    I have just always been confused when people talk about being able to "see" the pixels instead of complaining about the picture being too big for the screen and wasting real estate. In other words, when you have 1080p on a 32-inch monitor, I thought it would just show really big icons, but I guess people somehow scale it down?
     
  6. Digerati

    Digerati Major Geek Extraordinaire

    No, it is a very specific term. It means 1440 lines (rows of pixels) are drawn progressively, that is, one after the other all the way down the screen. If not "p", it would be "i" for interlaced, where the screen draws 720 lines down the screen with a space between each line, then goes back to the top and fills in the spaces with the remaining 720 lines.
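
    To make that drawing order concrete, here is a toy Python sketch using an 8-line screen in place of 1440 lines:

        # Scan-line order for progressive vs. interlaced drawing,
        # shown on a toy 8-line screen.

        def progressive(lines):
            # "p": every line in one pass, top to bottom
            return list(range(lines))

        def interlaced(lines):
            # "i": even lines in the first field, then odd lines in the second
            return list(range(0, lines, 2)) + list(range(1, lines, 2))

        print(progressive(8))  # [0, 1, 2, 3, 4, 5, 6, 7]
        print(interlaced(8))   # [0, 2, 4, 6, 1, 3, 5, 7]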

    You can "see" the individual pixels if you are close enough to the screen.
     
