# Graphics cards in the time of Lightroom 10 (and going forward)?



## dkperez (Oct 28, 2020)

I read some topics in here from early this year, but figured rather than starting up the 6-month-old threads again, I'd try a new one.

My current Windows 10 Pro system has a GTX 970 card. And Lightroom 10 SAYS “your system provides full acceleration” or whatever it says when you stick it in Auto. In truth, performance in Lightroom 10 is “poor” (more polite than saying it SUCKS). And in Photoshop 2021 it’s even worse. Scrubby zoom won’t work in ANY mode, and most tools are so laggy I just reinstalled PS 2020 because of the slowness.  I'm on the bubble about going back to Lightroom 2020, but I'm holding off for the moment.

Like a lot of people (I suspect), I just want a graphics card that WORKS but doesn’t cost more than my first car.  That’ll actually WORK in both Lightroom Classic and Photoshop. No messing around, no picking "auto" or "custom" in Lightroom and seeing WHICH operations are fast or slow - just shove it into "Auto" and it WORKS.  Same in Photoshop - turn on Graphics Acceleration, put it in Advanced and it WORKS.  And Scrubby Zoom WORKS.  No more laggy clone tool, magic wand, brush, etc. in PS.  No slow, laggy performance in Lightroom doing spot healing, or dragging the adjustment brush around a 45 megapixel D850 image, and so on.

A card that when the next version of Lightroom comes out, STILL works. That as Adobe increases the number of things that benefit from AI and graphics acceleration, just WORKS.

It would be a nice bonus if whatever I use would work in Helicon Focus so it's actually faster with the graphics acceleration TURNED ON than leaving it off. HF is a graphics card memory hog and they recommend at least 8GB to be adequate, so HOPEFULLY anything that works optimally in LR and PS will also work well with Helicon.

I haven’t seen anything from Adobe on what will actually work "adequately" or "optimally" as opposed to "minimum requirements", which my current card, I believe, meets.  If Adobe HAS put out information on this, can someone point me there?

It doesn’t have to be future proof for the next decade, but it would be nice if the GPU was still working optimally in 2 or 3 years.

Looking at Puget Systems’ testing of Photoshop and Lightroom, the differences between the onboard Intel 630 and the 2080, 3080, 3090 class of cards show NOT what I think of as a significant difference in performance. A little improvement between the onboard Intel 630 and the $1300+ cards, but NOT anything that looked worthwhile. The “overall score” for the 630 was 969 and the 2080 Ti was 1092 – about 12% higher… I was expecting something 10 or 20 or even 50 *TIMES* faster, not 12%.
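Those percentages are easy to sanity-check. A quick sketch using only the two overall scores quoted above (nothing here beyond the numbers from the post):

```python
# Overall Puget scores quoted above.
intel_630 = 969      # onboard Intel 630
rtx_2080_ti = 1092   # RTX 2080 Ti

# Relative improvement of the discrete card over the integrated graphics.
improvement = (rtx_2080_ti - intel_630) / intel_630
print(f"{improvement:.1%}")  # prints 12.7% - nowhere near 10x, let alone 50x
```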

Directly from their text:
*The GPU score is calculated based on the performance for the Rotate, Smart Sharpen, Field Blur, Tilt-Shift Blur, and Iris Blur tests - all of which are able to utilize the GPU to improve performance.
Unfortunately, even if we only look at these specific tests, there is still no meaningful difference between each of the discrete video cards. The only results that may be outside the margin of error are the AMD Radeon Vega 64 and 5700XT, but even those cards are only ~5% slower than the fastest score, which is going to be extremely difficult to notice in day-to-day work.*

For Lightroom Classic, I think they used LR 2019, so things may be very different with Lightroom 10.  The RTX 2080 was only about 8 – 9% faster than the RX 5700 XT and only about 5% faster than the onboard Intel 630 (I’m not sure HOW the onboard 630 was faster than anything, but it’s Puget Systems’ chart).

I DON’T see me going out and buying a GeForce RTX 3090 24 GB card or anything close to the $500-and-up group of cards. And I have to believe the law of diminishing returns for Lightroom and Photoshop would kick in LONG BEFORE the $500 mark, and a far more REASONABLE card would work extremely well for the foreseeable future…

So, WHICH cards ARE that?  And HOW MUCH has Lightroom 10 CHANGED the requirements for a graphics card?


----------



## Paul_DS256 (Oct 28, 2020)

I'd suggest you need to understand where the performance issue is. It's not always the GPU. You can get an idea of the split between CPU and GPU usage  by watching the Windows Task Manager as you perform different tasks in LR or PS.
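A hypothetical way to make that split concrete: jot down a few CPU and GPU utilization readings from Task Manager while repeating the slow operation, then see which component is pegged. The function name and the 85% threshold below are made up for illustration; this is a sketch of the reasoning, not a real monitoring tool.

```python
def likely_bottleneck(cpu_samples, gpu_samples, threshold=85.0):
    """Guess which component is saturated, given utilization percentages
    sampled (e.g. from Task Manager) while repeating a slow LR/PS task."""
    avg_cpu = sum(cpu_samples) / len(cpu_samples)
    avg_gpu = sum(gpu_samples) / len(gpu_samples)
    if avg_cpu >= threshold and avg_gpu >= threshold:
        return "both"
    if avg_cpu >= threshold:
        return "CPU"
    if avg_gpu >= threshold:
        return "GPU"
    return "neither - look at disk or memory instead"

# CPU pinned near 95%, GPU mostly idle: a faster card won't help here.
print(likely_bottleneck([95, 92, 97], [20, 25, 18]))  # prints CPU
```

The point of the sketch: if the CPU is the saturated component, a new graphics card changes nothing.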

There is a performance chapter in Victoria's The Missing FAQ that may be a good starting point. Also make sure all your drivers are up to date.


----------



## dkperez (Oct 28, 2020)

My drivers are up to date.  And I've watched the Task Manager and Resource Monitor.  Which is why I'm asking the question I did.  In the hope that I can get recommendations, preferably reasonable ones, from knowledgeable folks in here about what kind of hardware to be looking at as Lightroom and Photoshop continue to evolve.


----------



## dkperez (Oct 29, 2020)

So, other than buying what appears to be a $30 book and reading a chapter, does anyone have actual recommendations for reasonably priced (<$500) graphics cards that'll work optimally with LR10 and going forward, and PS 2021 and going forward for a reasonable life expectancy?


----------



## Gnits (Oct 29, 2020)

I was hoping your post would generate some specifics.  My plan is to put in a basic card, such as a GTX 1660, and upgrade to something more powerful when there is a real possibility of matching price and performance. My custom build will have a motherboard with space for a GPU, capable of delivering power to the GPU and its spinning fans, and of removing heat from the CPU, GPU, M.2 drives, etc.  I will already get a big performance boost from having upgraded the CPU, memory and M.2 drives.  Happy to deal with picking a better GPU further down the road.

I understand the whole question of GPU performance is a complex set of interacting elements, both hardware and software, and everything needs to be optimised to get performance, including the CPU, memory, bus speed, and the speed of the GPU and its memory. This also has to be matched by sophisticated software that knows how to decide what should be done on the CPU, what needs to be done on the GPU, and how long it will take to ship large blocks of data between all these hardware components.


----------



## dkperez (Oct 29, 2020)

That's also my hope.  Which is why I looked at what Puget Systems has tested, and what nVidia recommends, and read some other topics in here BEFORE asking... 
I looked at the AMD 6xxx stuff last night, but it's not helpful - everything is about frame rate on their favorite game.  If there was some way to relate that directly to getting Lightroom 10 to step through 1:1 previews in the Library significantly faster it might be useful, but otherwise, it's just noise.

Hopefully, some useful recommendations with their reasoning why they're good choices will come out...


----------



## Roelof Moorlag (Oct 30, 2020)

Recently I upgraded my 4+ year old system, taking into account Puget's system recommendations:

4.5-year-old system:
Intel i7 4790 / 4 GHz / 8 MB cache
GeForce GTX 750 1 GB
32 GB DDR3 1600 MHz
OS (Windows 10) and LrC catalog on Samsung 850 EVO 500 GB SSD
Puget Benchmark score (LrC 9.2) - 573.5

New system:
AMD Ryzen 9 3900X / 3.8 GHz / 12 cores
GeForce RTX 2070 Super / 8 GB GDDR6
64 GB DDR4 2666 MHz
OS (Windows 10) and LrC catalog on Samsung 970 EVO NVMe
Puget Benchmark does not run...

I was hoping for some noticeable effect on performance in Lightroom Classic, but I could not 'feel' any. Lightroom is laggy as always.
The Puget benchmark does not run on my system, so I can't make it concrete.

I'm afraid to upgrade to LrC v10 with all the (performance) problems mentioned.


----------



## LRList001 (Oct 30, 2020)

I am having the same query.  I am running a system with a GPU one revision below the minimum recommended, and when I work the software hard (it isn't LR, BTW), the CPU barely breaks out of idle, but the GPU is maxing out.  However, when I look at GPU specs, of which there are myriad options, I have no information as to which are worthwhile upgrades.  Adding more system RAM will have no effect at all, I suspect, as I already have 20GB free.  I use a very fast SSD, so it comes down to the GPU - and then, just what in a GPU makes it go faster?  I can get a card with more cores, more memory, faster memory, yep, all of that, but will it make a difference that justifies the cost?  I'm not sure enough that it will, so anyone who can shed light is most welcome to chip in.


----------



## PhilBurton (Oct 31, 2020)

I also have this quandary, since I'm running a 9 (?) year old Nvidia 660 Ti.  (I am almost embarrassed to say this.)  When I upgraded the CPU/motherboard/RAM earlier this year, I was all set to get an AMD 5600-series card or thereabouts, until I started to read way too many forum posts that went, "I have all these issues with crashes," or freezeups, etc.  Also, posts saying that AMD can never be truly successful until they invest more in their drivers.
So it's back to "team Green," Nvidia.  I have read (possibly unsubstantiated) posts that claim that Adobe favors Nvidia products over AMD products.

No real decisions, but some ruminations.  Right now I'm waiting for real product reviews of cards from both vendors.   Until the dealers actually have products in stock, I can't actually buy anything.  That said, I am thinking of the Nvidia 3060, which is rumored to be announced in mid-November.  Even that card is probably overkill for the current Adobe products, but I'm assuming that Adobe will leverage its AI more and more, which requires a good graphics card.  I'm also very interested in Topaz DeNoise, which even now makes heavy use of the graphics card.

Since my new motherboard supports PCI-E 4.0, I'd like to get a graphics card that also supports this specification.


----------



## Bob_B (Nov 1, 2020)

As one who is in the process of upgrading to a new computer, I greatly appreciate the comments in this thread, which brings me to ask about noise levels arising from high-powered CPUs and GPUs. As I put together my computer, I insist on it generating as little fan noise as possible. (I also use my computer to record live audio from my sax via my DAW.) What's the verdict on noise in these systems? Which ones offer the quietest environment?


----------



## Gnits (Nov 1, 2020)

I asked a custom builder for advice on this, as I have the same concerns. I specifically queried whether the rig would be quieter water-cooled or air-cooled. I did not get a definitive response, as there are too many variables.

I am specifically not going to overclock cpu or gpu and plan to try and configure various fan speeds and thresholds to keep fans off or at minimal levels. Various motherboards may provide sophisticated tools to manage fan speeds, etc.
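Those fan thresholds are usually expressed as a fan curve: a handful of (temperature, duty) points that the motherboard utility interpolates between. A minimal sketch of how such a curve behaves; the points are invented for illustration, not recommended values.

```python
# (temperature in C, fan duty in percent), linearly interpolated in between.
# These points are made up for illustration - tune to your own hardware.
CURVE = [(30, 0), (50, 25), (70, 60), (85, 100)]

def fan_duty(temp_c, curve=CURVE):
    if temp_c <= curve[0][0]:
        return curve[0][1]      # cool enough: fan off (silent idle)
    if temp_c >= curve[-1][0]:
        return curve[-1][1]     # hot: full speed regardless
    # Find the two surrounding points and interpolate linearly between them.
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(60))  # prints 42.5 (halfway between the 50C and 70C points)
```

Keeping the first point's duty at 0 is what lets the fans stay completely off at idle, which is the biggest single win for a quiet rig.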


----------



## Bob_B (Nov 1, 2020)

Thanks Gnits. Indeed, I will compromise on performance for quietness when building my new computer. If anyone has opinions on which MBs or GPUs are the quietest, please offer your thoughts, and thanks.


----------



## Gnits (Nov 1, 2020)

I do not think you will get definite answers for specific models. CPUs and GPUs are a lot like car engines. There will be a minimum tickover level, based on the specifics of the various hardware and software components used. Then as you press the accelerator (e.g. a batch Import or Export in Lr, rendering videos, etc.) the revs go up, the temperature increases, more power is consumed and the fans work harder.  This is likely to be unique to your workflow and hardware/software combo.

My plan is to get a high spec rig built. I can pick high end components for most things, but most worried about what gpu to purchase.  

Once I get delivery of my new build, I will try it out for a few weeks to get some insight into actual real-world performance, and try to identify specifically what is slower than I want it to be or noisier than I want it to be.  I will tweak whatever I can (revised config or higher-specified hardware, tweaking how fans are triggered, etc.) until I get where I need to be.

There may be no perfect answer.  It may be your spinning disks that are the noisiest.


----------



## mcasan (Nov 1, 2020)

Quietness is the sum of the noise-damping capabilities of the case versus the noise from case fans (watch fan reviews), power supplies, CPU cooling fans, and GPU cooling fans.   So if you want a very quiet machine, check the reviews and select a case and cooling fans that are among the quieter models.   Magazines such as Maximum PC will have reviews of systems and components, with noise being one of the items covered.


----------



## dkperez (Nov 5, 2020)

Unfortunately, the original discussion of graphics cards is largely academic at present, and likely for the future at least into and through Q1 of next year.  There are essentially NO 3000 series cards available, and not even much in the way of 2000 series cards.  There are some on Ebay, but it seems like Covid 19 has astronomically increased the demand for graphics cards.  And if you DO find one of the cards it's either used or they're trying to get at LEAST 200% of list.

I WAS going to go with a 2060 Super because I need 8GB for some tools.  At MOST, it should run $399 (last March they were available for $330).  I haven't found ANY at any price NEAR $399, and as of yesterday, even pcpartpicker.com didn't show any for less than $500+.  At $500 you might as well buy a 3070.  Except, of course, there ARE no 3070s, and from places I talked to, if they DO get any stock in, it's already pre-ordered.

The expectation is that when AMD releases the 6nnn series, the same thing is going to happen...  

I'm hoping an AMD 5nnn XT with 8GB will provide better performance for the reasonable future.


----------



## Bob_B (Nov 6, 2020)

Seems that there has been some hoarding of the AMD 5000 series CPUs as well. Hopefully, stocks of both GPUs and CPUs will return shortly.


----------



## Paul_DS256 (Nov 6, 2020)

Bob_B said:


> Seems that there has been some hoarding


Not sure that's the issue or supply chain interruptions. I know people getting home renovations done during Covid that have had delays caused by supply availability.


----------



## dkperez (Nov 6, 2020)

It may be both...  nVidia may not have been able to produce as many cards as they wanted, but if you go over to the nVidia site or any of several that talk about graphics cards, the howls of outrage are deafening...  There appear to be a LOT of people that are very unhappy that the graphics card(s) they wanted aren't available.  And they're blaming it squarely on nVidia.  They don't appear to be nearly as unhappy with the scumbags that are scalping them at ridiculous prices.


----------



## LRList001 (Nov 7, 2020)

Looking at this website and *assuming* GPU performance is the key driver:
https://www.videocardbenchmark.net/directCompute.html
then the RTX 3070 seems to be sitting at the top of the price/performance chart.  It is listed at around GBP £550, though as already mentioned, no stock, no dates for when stock is due.

Be aware that the RTX 3070 consumes a lot of power, around 250W.
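The "price/performance chart" idea is just benchmark score divided by price. A sketch of the ranking; the scores and prices below are invented placeholders, not real benchmark data (real numbers would come from the benchmark site and current listings):

```python
# Illustrative placeholder data - NOT real benchmark scores or prices.
cards = {
    "RTX 3070": {"score": 16000, "price_gbp": 550},
    "RX 5700 XT": {"score": 11000, "price_gbp": 400},
    "RTX 2060 Super": {"score": 10000, "price_gbp": 380},
}

# Rank by benchmark points per pound spent, best value first.
ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["score"] / kv[1]["price_gbp"],
                reverse=True)
for name, c in ranked:
    print(f"{name}: {c['score'] / c['price_gbp']:.1f} points/GBP")
```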


----------



## Paul_DS256 (Nov 7, 2020)

LRList001 said:


> Looking at this website and *assuming *GPU performance is the key driver


I don't know whether that is true for LR or not. Yes, for certain functions, though I don't know which those are. While monitoring LR, the GPU is not always in play.


----------



## dkperez (Nov 7, 2020)

Unless it was a considerable "bargain" - as in they were still screwing everyone to the wall on the 2060 Super - the 3070 is DRASTIC overkill for anything in Lightroom or Photoshop as far as I know...  So, unless they were selling at the $499 price and the 2060 and 1080 and all those were still ridiculously overpriced - which is never gonna happen - it would be silly to spend the money on the 3070.

At the moment, before the next insane feeding frenzy starts, the AMD cards still aren't stupid expensive.  And places are actually doing rebates and discounts on 5xxx series cards in expectation of the 6xxx ones...  So, I'll get a 5700 XT and that should work fine for a while.  Hopefully.


----------



## Gnits (Nov 7, 2020)

Very useful list of cards. 

Remember, you may have the fastest and most expensive GPU, but you need correspondingly high-end components for the GPU to be able to perform, i.e. a motherboard with PCIe 4.0 bus speeds, a fast CPU to deliver image packets to the GPU, fast memory, fast disk I/O for batch file operations, and smart software that can leverage all of this.


----------



## dkperez (Nov 7, 2020)

Yeah, as with race cars, they're only as fast as the slowest component and as strong as the weakest...
By and large, I'm OK for CPU (5820 O/C to 4.4 GHz) and memory (32GB).  And everything that's doing anything (O/S, applications, cache, temp, images, catalogs) is on SSDs, and MOST are on SEPARATE SSDs.

Yes, one of these days I'll put a new system together with a current motherboard, memory and CPU (generally 1 generation behind or at least 1 step below the absolute fastest one, 'cause the Law of Diminishing Returns usually kicks in big time with CPUs and such).  But I'm lazy, so I'll start with the graphics card.


----------



## PhilBurton (Nov 7, 2020)

dkperez said:


> Unless it was a considerable "bargain" - as in they were still screwing everyone to the wall on the 2060 Super, the 3070 is DRASTIC overkill for anything in Lightroom or Photoshop as far as I know...  So, unless they were selling at the $499 price and the 2060 and 1080 and all those were still ridiculously overpriced - which is never gonna happen, it would be silly to spend the money for the 3070.
> 
> At the moment, before the next insane feeding frenzy starts, the AMD cards still aren't stupid expensive.  And places are actually doing rebates and discounts on 5xxx series cards in expectation of the 6xxx ones...  So, I'll get a 5700 XT and that should work fine for a while.  Hopefully.


When I was planning my new desktop system and purchasing components this past May, I was planning to get an AMD 5600XT or 5700XT.  Doing my due diligence on the components I had selected, I found an alarming number of web posts about driver instability, crashes and freezeups.  I also found posts saying that AMD could never compete effectively until they improved the quality of their drivers.  So I reluctantly decided to wait until the Nvidia 3000 cards were available.

I really do hope that AMD invests some of its profits from the CPUs into graphics card driver development.  But I value system stability and reliability over price/performance.  My current plan is to wait until I can get an Nvidia *3060 *card, which is supposed to be launched in mid-November according to what I have read.

How long I have to wait, I don't know. In the meantime, I'll monitor reports of early users of the new AMD "Big Navi" cards for reports of driver stability.  If those reports are favorable, then I will get an AMD card.

Phil Burton


----------



## dkperez (Nov 8, 2020)

I saw some of those same topics...  And I may wind up doing the same thing - waiting several months 'til this frenzy ends isn't going to work.  So, I ordered a 5700 XT.  Hopefully, the reports of AMDs ineptitude are overblown.


----------






## PhilBurton (Nov 8, 2020)

Gnits said:


> Very useful list of cards.
> 
> Remember, you may have the fastest and most expensive gpu, but you need corresponding high end components for the gou to be able to perform, ie motherboard with V4 bus speeds, fast cpu to deliver image packets to gpu, fast memory and fast disk i/o for batch file operations and smart software that can leverage all of this.


All of which I have in my new desktop build, except for the fast GPU.

ASUS ROG X570 Strix-E motherboard, which supports PCIe 4.0.
AMD 3900X CPU
32 GB of 3600-rated Ballistix RAM.

But I'm still using a dinky Nvidia 660 Ti GPU from, I think, about 2012 or 2013.


----------



## PhilBurton (Nov 8, 2020)

dkperez said:


> I saw some of those same topics...  And I may wind up doing the same thing - waiting several months 'til this frenzy ends isn't going to work.  So, I ordered a 5700 XT.  Hopefully, the reports of AMDs ineptitude are overblown.


Well, keep us posted.  My heart says to get an AMD card.  My head says to stick with Nvidia.


----------



## dkperez (Nov 15, 2020)

SO FAR - and note the STRONG "SO FAR" - the 5700 XT has worked very well and made LR 10 and PS 2021 usable...  The significant lag is fixed and things are working.  In Helicon Focus, the benchmark time has gone from 47+ seconds with my GTX 970 to UNDER 4 seconds with the 5700 XT.  Some other things are significantly faster as well.  Save in Photoshop still takes the same time (expected), but some operations seem better.

SO FAR - no problems with drivers.  I started with the ASUS drivers (card brand), then the upgrade popped up to go to the latest AMD driver, which I did.  There doesn't APPEAR to be any bad behavior...  And my modest 3840x1600 ultrawide monitor works fine.

Time will tell.  I think I'll throw the 970 on Ebay and see what it'll bring.


----------



## robashcroftwales (Nov 15, 2020)

I have an NVIDIA GTX 1050. I recently upgraded my rather low-spec desktop Windows PC. I also upgraded the RAM from 8 to 16 GB. Because of the more powerful card I had to upgrade the power supply in the desktop at the same time. It worked much better than my old setup, but then I noticed that NVIDIA do two driver versions, one for gaming and the other, which they call 'Studio', for creative purposes. I was running the gaming version, and as I don't do games I downloaded the Studio version and it did seem to make things a little better. I don't have any problems in LR or PS now, no jerkiness or lagging when using various tools and functions. I dare say if I had a top-spec computer it could be better - but do I need it?

If anyone is using NVIDIA it is useful to install the GeForce Experience app, which keeps your driver (and your account with them) up to date.

Shots below are my card settings in LR and PS.

LR settings
(screenshot)

PS settings
(screenshot)

Lightroom Classic version: 10.0 [ 202010011851-ef6045e0 ]
License: Creative Cloud
Language setting: en
Operating system: Windows 10 - Home Premium Edition
Version: 10.0.19041
Application architecture: x64
System architecture: x64
Logical processor count: 4
Processor speed: 3.4 GHz
SqLite Version: 3.30.1
Built-in memory: 16330.9 MB
Real memory available to Lightroom: 16330.9 MB
Real memory used by Lightroom: 1255.7 MB (7.6%)
Virtual memory used by Lightroom: 1362.8 MB
GDI objects count: 794
USER objects count: 2433
Process handles count: 1918
Memory cache size: 450.0MB
Internal Camera Raw version: 13.0 [ 610 ]
Maximum thread count used by Camera Raw: 3
Camera Raw SIMD optimization: SSE2,AVX,AVX2
Camera Raw virtual memory: 395MB / 8165MB (4%)
Camera Raw real memory: 413MB / 16330MB (2%)
System DPI setting: 216 DPI (high DPI mode)
Desktop composition enabled: Yes
Displays: 1) 3840x2160
Input types: Multitouch: No, Integrated touch: No, Integrated pen: Yes, External touch: No, External pen: Yes, Keyboard: No

Graphics Processor Info:
DirectX: NVIDIA GeForce GTX 1050 (26.21.14.4292)

Application folder: C:\Program Files\Adobe\Adobe Lightroom Classic
Library Path: C:\Lightroom\LR desktop Catalog-v10.lrcat
Settings Folder: C:\Users\Desktop PC\AppData\Roaming\Adobe\Lightroom

Installed Plugins:
1) AdobeStock
2) Facebook
3) Flickr
4) HDR Efex Pro 2
5) Helicon Focus Export
6) Nikon Tether Plugin


----------

