# Graphics Card Slows LR 2?



## Jim Mohundro (Sep 16, 2008)

Regarding graphics cards: a recent post on a well-respected photo-related forum reports not only that graphics cards are unnecessary for most post-processing programs (e.g., LR 2), but that they may actually slow execution of the PP program by demanding more of the computer's basic RAM to run the card; i.e., the greater the card's RAM (say 128, 256 or 512 MB), the more of the computer's RAM is diverted to run the card and the less can be used for the PP program.  While I recognize that graphics cards are unnecessary for LR 2, are those cards actually an impediment to the efficient running of LR 2 and, logically, should they be removed from machines that are to run LR 2 (assuming that the motherboard contains its own video infrastructure and circuitry)?


----------



## Jim Mohundro (Sep 16, 2008)

A followup to my post: isn't the role of the graphics card to take on the task of providing the appropriate video signal to the monitor, relieving system RAM of being diverted to that job (creating just the opposite of the situation suggested in my original post)?


----------



## Braders (Sep 16, 2008)

Video card still important for most photographers though, as it is what is most important to run the high-end 30" monitors we use for viewing the images. That might be the paradox.


----------



## Jim Mohundro (Sep 16, 2008)

I see the importance of the card in that context, but does the card "burden" the system's own RAM in proportion to the nominal RAM (e.g., 256 MB) associated with the graphics processor?


----------



## Brad Snyder (Sep 16, 2008)

Jim, Welcome to LR Forums.  

I believe you will find that it's the on-board graphic adapter solutions that are using system RAM. Look at a Dell ad, and you'll usually find an asterisk footnote about 'shared' memory, system vs graphics. (There must have been a class action suit from people who felt they were stiffed on their RAM.)
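That 'shared' memory footnote amounts to simple subtraction: the on-board adapter carves its memory out of installed RAM, so the OS sees less. A minimal sketch with illustrative numbers (not figures from this thread):

```python
# Hypothetical shared-memory accounting for an integrated graphics adapter.
# The adapter reserves a chunk of installed RAM, so the operating system
# reports less memory than what was advertised on the box.

installed_mb = 1024             # RAM as advertised
reserved_for_graphics_mb = 256  # carved out by the on-board adapter
usable_mb = installed_mb - reserved_for_graphics_mb

print(f"usable system RAM: {usable_mb} MB")  # usable system RAM: 768 MB
```

A discrete card with its own memory avoids this subtraction entirely, which is the point Brad makes below.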

Low end graphic adapters fundamentally provide memory to hold the pixel map of what your screen is to look like, and the electronic circuitry to convert that to the video signal that your monitor is expecting to see.

Higher end GPU solutions provide more memory to (more or less) hold more than one screen at a time so that screen animation is faster. In high frame rate motion graphics, i.e. games, the moving 3-D objects are rendered on the fly. This means that the images are not processed as pixels so much as they are geometric data descriptions of the objects, which are turned into pixels via mathematical calculations. The more GPU power, the simpler/faster the instruction stream to the external graphics pipeline can be, thus lowering the load on the system memory and processor.
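The framebuffer arithmetic behind the two paragraphs above can be sketched quickly. Assuming 32-bit color (4 bytes per pixel) and an illustrative 1600x1200 desktop, here is roughly what "holding the pixel map" and "holding more than one screen" cost in memory:

```python
# Memory needed just to hold full-screen pixel maps.
# A basic adapter stores one; double/triple buffering (for smooth
# animation) stores two or three.

def framebuffer_bytes(width, height, bytes_per_pixel=4, buffers=1):
    """Bytes to hold `buffers` full-screen pixel maps (4 bytes = 32-bit color)."""
    return width * height * bytes_per_pixel * buffers

single = framebuffer_bytes(1600, 1200)             # one screen
triple = framebuffer_bytes(1600, 1200, buffers=3)  # triple-buffered

print(f"single buffer: {single / 2**20:.1f} MB")   # single buffer: 7.3 MB
print(f"triple buffer: {triple / 2**20:.1f} MB")   # triple buffer: 22.0 MB
```

On a shared-memory on-board adapter those megabytes come out of system RAM; on a discrete card they live in the card's own memory, which is why a dedicated card 'unburdens' the system rather than taxing it.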

So, no, a good graphics card actually 'unburdens' the CPU system memory. Although there are probably tuning/compatibility issues involved in any specific combination.  

In the instance of LR2, there 'seems' to be an issue with a specific generation(s) of adapters from a specific manufacturer, which 'seems' to be having a huge negative impact on performance, which again 'seems' to be specific to some of the 'optimized' hardware rendering of 3D aspects of the images. Since there really are no 3D acceleration aspects to LR rendering, this is counterintuitive. But the problem seems to be quite real. I believe engineers from the two companies are investigating. 

At this point, I wouldn't abandon my GPU for the motherboard adapter. But at the same time, if you're suffering performance issues, it might not be a waste of time to give it a try and empirically test the results.  As I said, there are some goofy things going on with LR in this area, which have not yet been completely characterized.


----------



## Jim Mohundro (Sep 16, 2008)

Thanks, Brad

Your post coincides with what I've thought up to the point where I've purchased LR2 and am having a new computer built to go with it.  I've read a lot about the (and I'll name it) NVIDIA cards (which I have in my current, about-to-retire computer), which may actually relate to the NVIDIA desktop manager.  In any event, I've been planning, based on the currently-expressed concerns, to put an ATI graphics card in my system if it is not likely to negatively affect system RAM.  If I notice any real difficulties with LR2, I'll have the card removed and use the system graphics, which should, logically, divert system RAM to some extent.


----------



## Brad Snyder (Sep 16, 2008)

Reasonable plan, I think.  I've also seen Matrox cards recommended for photo editing purposes.


----------



## joemontana57 (Sep 19, 2008)

I'm building a system around Christmastime. I plan on putting a fairly high-end ATI card in it. As Brad (was it Brad?) said, it's integrated graphics that take up system memory.
I have never heard what you've said, only that a graphics card with more memory on it frees up the CPU to do more. 
I'm curious where you got your information regarding graphics cards. I'd like to look at it; perhaps there is a kernel of truth in it, ya never know.

Joe
Kalispell, MT
Photoshop CS3, LR 2.0
Canon 40D, 70-200 f/2.8 IS L and several other lenses and accessories.



Jim Mohundro said:


> Thanks, Brad
> 
> I've read a lot about the…


----------



## joemontana57 (Sep 19, 2008)

Just an FYI, I found this post by borez when I googled for Nvidia and LR2
-------------------------------------------------------------------------
borez (Junior Member, Posts: 1, Join Date: May 2008)

*Solved: Lightroom sluggishness with Nvidia drivers* (05-24-2008, 07:25 PM)

Hi all,

Thought I might want to post this for future reference:

I was previously using an ATI X1600 card on LR 1.4.1, and everything worked beautifully. 

I upgraded my card to an Nvidia 9600GT yesterday (running Forceware 175.16) and realised that LR was running sluggishly when loading and refreshing images. As I did not install nView, that wasn't the cause.

However, I realised that there was a setting that caused the issue (might need to show advanced settings):

3D Settings > Manage 3D Settings > Global Settings > Multi-display/mixed GPU Acceleration

By default it should be multiple display performance mode. Change this to a single display performance mode (I only have 1 screen anyway). 

This worked for me beautifully.




Jim Mohundro said:


> I've read a lot about the (and I'll name it) NVIDIA cards (which I have in my current, about-to-retire computer), which may actually relate to the NVIDIA desktop manager.  In any event, I've been planning, based on the currently-expressed concerns, to put an ATI graphics card in my system if it is not likely to negatively affect system RAM.  If I notice any real difficulties with LR2, I'll have the card removed and use the system graphics, which should, logically, divert system RAM to some extent.


----------



## Brad Snyder (Sep 19, 2008)

For the most recent stuff I'm aware of re: nVidia, try here.

For the information concerning video adapter memory, that's just general lore I've picked up over almost 35 years of experience. I can't give you specific references, but the web abounds with sites dedicated to video performance junkies, overclockers and the like. Maybe start your research here.


----------

