Discrete graphics cards; why they will always exist

Discussion in 'Hardware' started by ChrisC, Jan 19, 2007.

Poll: Do you believe add-in graphics cards will be around ~10+ years?

  1. yes — 2 vote(s), 100.0%
  2. no — 0 vote(s), 0.0%
  1. ChrisC

    ChrisC Private First Class

    This is just a thought I've been contemplating, with all the speculation about AMD's Fusion (and Intel's secret work no doubt occurring), integrating GPUs into a CPU.
    I personally feel this is perfect for laptops and SFF systems, as they don't require massive horsepower and place more emphasis on power and size, both of which a combined CPU/GPU handles far better.
    But for desktops, I just can't see it taking over entirely (for budget and many OEM systems, yes), because it can *never* achieve the same performance as a standalone graphics card. Many people will point to the low latency (essentially 0) of an integrated CPU/GPU, but the reality is this can never compensate for its shortcomings. Basically, you're sacrificing silicon real estate from the CPU for the GPU (or vice versa, depending how you look at it), versus having two separate, identically large and complex chips. But even this *could* potentially be overcome (maybe not, but it doesn't matter...). The real issue is memory speed: while desktops are just reaching 10 GB/s, graphics cards are at over 100 GB/s, with R600 poised to reach nearly 200! This is the nail in the coffin; there's no way to have removable memory modules at those kinds of speeds. The traces need to be short, and so much care is taken to reduce interference. Going from a motherboard out to an add-in module, it's just too hard to do electrically; the soldered-on chips of a graphics card will always be way faster than any add-in RAM could ever be.
    For these reasons, I say long live the graphics card!
    Also, putting GPUs into Torrenza-style sockets so they're connected to the CPU directly: best idea ever. It's the next best thing to an integrated CPU/GPU as far as latency and bandwidth are concerned.
    Discuss :)
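The bandwidth gap ChrisC describes falls out of simple arithmetic: peak memory bandwidth is bus width (in bytes) times effective transfer rate. Here's a minimal sketch of that math using illustrative, period-typical early-2007 figures (dual-channel DDR2-800 for the desktop, a 384-bit GDDR3 card like the high-end parts of the day for the graphics side); the exact numbers are assumptions for illustration, not quoted specs.

```python
def bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: int) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes x mega-transfers/s."""
    return bus_width_bits / 8 * transfer_rate_mt_s / 1000

# Desktop: dual-channel DDR2-800 = 2 x 64-bit channels at 800 MT/s
desktop = bandwidth_gb_s(128, 800)     # 12.8 GB/s

# Graphics card: 384-bit bus, GDDR3 at ~1800 MT/s effective
graphics = bandwidth_gb_s(384, 1800)   # 86.4 GB/s

print(f"desktop: {desktop:.1f} GB/s, graphics card: {graphics:.1f} GB/s")
```

The roughly 7x gap comes mostly from the bus width: a 384- or 512-bit bus is only practical with memory chips soldered close to the GPU, which is exactly the electrical argument in the post.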
     
  2. BCGray

    BCGray Guest

    The same discussion points you brought up were used when the chip manufacturers integrated the math processor, L1/L2 cache, DMA, and many other features into present CPUs. Add-in cards will be with us for as long as the open architecture of the MS system is allowed to exist, and the most-used features will find their way into the CPU. Size is not an issue; hey, we're talking four processors on a single CPU chip with room to spare.

    Bottom line: the chip manufacturers need to sell chips to stay in business, so they always have to offer you, the consumer, something they figure you need and are willing to pay for.
     
