# GPUs for Lightroom



## alaios (May 3, 2019)

Hi all,
does Lightroom (and to a similar extent Photoshop) benefit from fast GPUs?

For example I am thinking this card
GTX 1660ti  

will that be exploited nicely by Lightroom?
Regards
Alex


----------



## Roelof Moorlag (May 3, 2019)

This is the information that can be found at Adobe:

https://helpx.adobe.com/lightroom/kb/lightroom-gpu-faq.html


----------



## PhilBurton (May 4, 2019)

Roelof Moorlag said:


> This is the information that can be found at Adobe:
> https://helpx.adobe.com/lightroom/kb/lightroom-gpu-faq.html


This list is out of date for NVIDIA cards.  For the 1000 series, or even for the earlier 900 series, at what point does spending more money for a faster card, or a card with more VRAM, reach the point of diminishing returns?  That is, what card would provide such fast screen display times that there is no point in spending more money for a faster card?

Phil Burton


----------



## Johan Elzenga (May 4, 2019)

Screen display speed is only a very minor part of it in the case of Lightroom. Lightroom is not a game with super-fast screen action. The real reason to want a fast GPU for Lightroom is that more and more edit calculations are sent off to the GPU rather than being carried out by the CPU (if that option is turned on in the preferences, that is).


----------



## Zenon (May 4, 2019)

Roelof Moorlag said:


> This is the information that can be found at Adobe:
> https://helpx.adobe.com/lightroom/kb/lightroom-gpu-faq.html



I have tried to do this with my iMac without success. I have read that Apple updates the driver versions automatically whenever there is an update. Is that accurate?


----------



## Zenon (May 4, 2019)

I had a chat with Apple support.


----------



## PhilBurton (May 5, 2019)

Johan Elzenga said:


> (stuff trimmed out)
> The real reason to want a fast GPU for Lightroom is that more and more edit calculations are sent off to the GPU rather than being carried out by the CPU (if that option is turned on in the preferences, that is).


Johan,

Given what you just said, what kinds or amounts of edits on an image would tip the balance towards a faster GPU?

Are you also saying that generating previews in LIBRARY does not involve the GPU?

Phil Burton


----------



## Zenon (May 5, 2019)

I'm getting a new iMac in about 3 months so I'm interested in this as well.


----------



## Johan Elzenga (May 5, 2019)

PhilBurton said:


> Given what you just said, what kinds or amounts of edits on an image would tip the balance towards a faster GPU?
> Are you also saying that generating previews in LIBRARY does not involve the GPU?


I don’t know enough of the details to answer that with much authority. It’s a combination of things, so you see the difference more if you work on a 4K or 5K screen. The bigger the screen, the more pixels need to be re-rendered with each edit. One example where the effect is very clear is Enhance Details. That is really slow if you don’t have a fast GPU; the CPU does not seem to do much at all when you create an Enhance Details DNG. The Library module is also using the GPU nowadays, so I would think that generating previews might also be a bit speedier with a fast GPU. The biggest bottleneck there would be disk speed, however, because rendering a lot of previews means reading a lot of raw data from disk.
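To put rough numbers on the "bigger screen, more pixels" point, here is a back-of-the-envelope sketch (just pixel arithmetic, not Lightroom's actual rendering pipeline):

```python
# Approximate pixel counts for common display resolutions. Each edit
# forces the visible image to be re-rendered at (up to) this size.
displays = {
    "1080p (Full HD)": (1920, 1080),
    "4K (UHD)":        (3840, 2160),
    "5K (iMac)":       (5120, 2880),
}

for name, (w, h) in displays.items():
    print(f"{name}: {w * h / 1e6:.1f} megapixels")

# 5K vs 1080p: roughly 7x the pixels, hence roughly 7x the
# per-pixel work for every slider move.
ratio = (5120 * 2880) / (1920 * 1080)
print(f"5K / 1080p ratio: {ratio:.1f}x")
```

So a 5K iMac pushes about seven times the pixels of a Full HD laptop screen, which is why the same edit can feel so much heavier on a big display.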


----------



## alaios (May 5, 2019)

Related to the graphics card question: I have an NVIDIA GeForce GTX 750. Do you guys see a reason to upgrade to a newer card? If yes, to which one?
I also play games, so I will buy a card for both purposes; still, it would be good to pick a card that Lightroom and Photoshop can benefit from.
Alex


----------



## Linwood Ferguson (May 7, 2019)

PhilBurton said:


> Johan,
> 
> Given what you just said, what kinds or amounts of edits on an image would tip the balance towards a faster GPU?
> 
> ...


Generally speaking it is just for display purposes like moving a slider, though Adobe has started to add more.  The new Enhance Details feature, I think, was the first place they did non-live work on a GPU (and there I think they allow multiple GPUs to be used).   I have not heard that previews use it. Yet.  Maybe they will, maybe they never will.

Buying a GPU is partly an exercise in speculation: how soon, and in which direction, will Adobe make use of it?  If you do not need one for general use, my suggestion is to delay buying a very high-end GPU and get a relatively cheap mid-range card.  If Adobe makes the GPU much more useful, you are not out much to buy another.  In fact, if they follow the trends in video editing, they might let you just add one (or two or three).


----------



## PhilBurton (May 7, 2019)

Ferguson said:


> Generally speaking it is just for display purposes like moving a slider, though Adobe has started to add more.  The new Enhance Details feature, I think, was the first place they did non-live work on a GPU (and there I think they allow multiple GPUs to be used).   I have not heard that previews use it. Yet.  Maybe they will, maybe they never will.
> 
> Buying a GPU is partly an exercise in speculation: how soon, and in which direction, will Adobe make use of it?  If you do not need one for general use, my suggestion is to delay buying a very high-end GPU and get a relatively cheap mid-range card.  If Adobe makes the GPU much more useful, you are not out much to buy another.  In fact, if they follow the trends in video editing, they might let you just add one (or two or three).


If/when I do buy another GPU, I will want one that supports Enhance Details.  I read somewhere that the NVIDIA 1600 series does not support this feature.

Multiple GPUs just for editing in Lightroom?  At some point, for a non-gamer the money might better be spent on a faster CPU or maybe more RAM, or a bigger/faster SSD.

Phil


----------



## Linwood Ferguson (May 7, 2019)

PhilBurton said:


> Multiple GPUs just for editing in Lightroom?  At some point, for a non-gamer the money might better be spent on a faster CPU or maybe more RAM, or a bigger/faster SSD.


That's absolutely true today.  Whether it will be in Adobe's future, who knows.  There's a lot to like about the parallelism in GPUs.  At their core (pun intended), CPUs are meant for sequential calculations and struggle to do things efficiently in parallel.  GPUs are the reverse.  One wonders whether, if there were a real standard for GPUs so they could all run the same code, we would not be a lot further down this pike.  Instead, the Adobes of the world have to try to stay compatible with several competing mechanisms, INCLUDING systems with no usable GPU at all.
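To make the sequential-vs-parallel point concrete, here is a toy sketch (my own illustration, not Adobe's code): a per-pixel edit such as an exposure adjustment touches each pixel independently, which is exactly the "embarrassingly parallel" shape of work GPUs are built for. In NumPy terms:

```python
import numpy as np

def adjust_exposure(image, stops):
    """Per-pixel exposure adjustment: scale linear pixel values by
    2**stops, clipped to [0, 1]. Every output pixel depends only on
    the corresponding input pixel, so all pixels can be computed
    independently -- the data-parallel pattern a GPU exploits."""
    return np.clip(image * (2.0 ** stops), 0.0, 1.0)

def adjust_exposure_sequential(image, stops):
    """The same edit written as an explicit loop, the way a single
    CPU core would naively walk the image, one pixel at a time."""
    out = np.empty_like(image)
    flat_in, flat_out = image.ravel(), out.ravel()  # views into the arrays
    for i in range(flat_in.size):
        flat_out[i] = min(max(flat_in[i] * (2.0 ** stops), 0.0), 1.0)
    return out

# Both produce identical results; only the execution strategy differs.
img = np.array([[0.1, 0.4], [0.6, 0.9]])
assert np.allclose(adjust_exposure(img, 1.0),
                   adjust_exposure_sequential(img, 1.0))
```

The vectorized form is the shape a GPU driver can spread across thousands of cores at once, while the loop is stuck doing one pixel per step; that gap is where the GPU wins grow with image and screen size.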

Will one day Adobe cross a divide and say "we require a GPU compatible with X or the product will not run at all"?  It would certainly substantially simplify their coding, and I think we would see substantially faster progress. 

Now... people just stay frustrated trying to find the right answer of hardware, drivers, settings, etc.


----------



## PhilBurton (May 7, 2019)

Ferguson said:


> That's absolutely true today.  Whether it will be in Adobe's future, who knows.  There's a lot to like about the parallelism in GPUs.  At their core (pun intended), CPUs are meant for sequential calculations and struggle to do things efficiently in parallel.  GPUs are the reverse.  One wonders whether, if there were a real standard for GPUs so they could all run the same code, we would not be a lot further down this pike.  Instead, the Adobes of the world have to try to stay compatible with several competing mechanisms, INCLUDING systems with no usable GPU at all.
> 
> Will one day Adobe cross a divide and say "we require a GPU compatible with X or the product will not run at all"?  It would certainly substantially simplify their coding, and I think we would see substantially faster progress.
> 
> Now... people just stay frustrated trying to find the right answer of hardware, drivers, settings, etc.


All good points you make here.  However, I doubt that there will be a real standard for GPUs unless AMD/ATI goes out of business.  Considering how much more powerful GPUs get with every generation, I'm going to guess that sooner or later, a $200 GPU will be sufficiently powerful for significant performance improvement, and then Adobe might start to require one in a target system.  

Today Adobe recommends 12 GB of RAM for Lightroom, but an awful lot of laptops are sold with only 4 or 8 GB.  We are currently shopping for a laptop for my wife, and for her needs, which do not include LR, a machine with 8 GB of RAM and 256 GB of SSD storage is more than adequate.  So if you want to use a laptop for Lightroom, you should look for machines with 16 GB (or more).  They will easily cost $300, and maybe $500, more than the machine my wife will be buying.


----------



## Linwood Ferguson (May 7, 2019)

PhilBurton said:


> They will easily cost $300, and maybe $500, more than the machine my wife will be buying.



So what percentage is $300 of all the camera equipment, lenses, bags, tripods, and similar you own?


----------



## PhilBurton (May 8, 2019)

Ferguson said:


> So what percentage is $300 of all the camera equipment, lenses, bags, tripods, and similar you own?


If I count it all up, $300 is just a small percentage.  That's rational thinking.  But prices can also be very "emotional," as in, "How much am I planning to spend on this new (or upgraded) computer?"  Or, "I can't spend more than $1500 total on this new computer, because my significant other will think I'm being extravagant."  If you look at the Puget Systems website, their Lightroom workstations cost a lot more than $1500.


----------

