# GPU Change, GT640 to GTX970



## Linwood Ferguson (Aug 12, 2016)

So I decided to invest $300 in trying to make Lightroom run faster, and upgraded my quiet old video card to a relatively modern one (there's a new 10-series, but those cards don't have dual DVI ports, so I got the 970 instead of the 1060).  That's about 10 times the CUDA cores and hugely more bandwidth.

I tried going through a series of steps: enter Develop after resetting settings, zoom to 100%, then move each slider in the Basic panel, at about the speed it can keep up with, from left to right and back to center, one by one, then zoom out and exit.

And tried to time it.

The short version is "maybe faster".  

The longer version is that I can't time it well enough to call it quantitative.  I got anywhere from about 30 to 61 seconds with the GPU off.  I pretty consistently got 30-40 with the old slow card, and about 25-35 with the new GPU.
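For what it's worth, with timings this noisy, the honest comparison is several runs reduced to a median and spread, not single stopwatch numbers.  A quick sketch of what I mean (the numbers are only illustrative placeholders echoing the rough ranges above, not real measurements):

```python
from statistics import median, quantiles

def summarize(label, runs):
    """Report the median and interquartile range of repeated timings;
    with hand-timed runs this noisy, a single number means little."""
    q1, _, q3 = quantiles(runs, n=4)
    return f"{label}: median {median(runs):.1f}s, IQR {q1:.1f}-{q3:.1f}s"

# Illustrative placeholder timings only, echoing the ranges above.
for label, runs in [
    ("GPU off", [30, 45, 61, 38, 52]),
    ("GT640",   [30, 35, 40, 33, 37]),
    ("GTX970",  [25, 30, 35, 28, 32]),
]:
    print(summarize(label, runs))
```

Overlapping spreads like these would mean the card change is lost in the noise, which matches what I am seeing.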

Subjectively I think the sliders move more smoothly now, but only a bit.  The brush is no different (I can't recall if brushes are accelerated or not). 

But only a bit.  Definitely not worth investing in a new card in this case.  I plan to build a new computer soon so I think of it as an early piece for it.  

I spec'd out a new computer, and here's the real issue -- speed benchmarks on a new processor are only slightly faster than what I have.  Memory (if I don't overclock) is 50% faster, but no one seems to feel that is a huge difference (I'm not so sure). And I've already got 3 decent SSDs in use.

I bet if we all took a poll, 80% of the LR users would vote to have Adobe do nothing at all for a full year but work on performance and bugs.  I for one will gladly pay my $120 over that time and expect zero new features if they would just make it run faster.


----------



## PhilBurton (Aug 12, 2016)

Kind of discouraging to see that $300 gets you only marginal speed improvements.  And if this new system has something like a 4K display you could actually lose performance.

I think the idea of a poll is a good one.  If you are serious PM me and I'll try to contribute some ideas that might make Adobe pay attention.

Phil


----------



## Cerianthus (Aug 12, 2016)

GPU acceleration works with my onboard Intel HD4600 and is officially supported. I was contemplating getting a faster separate card but now I doubt that will do anything. 


Sent from my iPhone using Tapatalk


----------



## Linwood Ferguson (Aug 12, 2016)

Cerianthus said:


> GPU acceleration works with my onboard Intel HD4600 and is officially supported. I was contemplating getting a faster separate card but now I doubt that will do anything.



I do hasten to emphasize that the testing was very subjective.  In particular, I tried to move each slider only as fast as it could keep up (we've all, I assume, dragged a slider only to see it sit there and snap to near the mouse a second later).  But that's not at all rigorous in terms of timing.

What I would really like is to try to find some way to automate a series of events that are GPU accelerated so you could do 1000x something, over a decent period of time, without human intervention and see how fast it is.

I'm just at a loss for what that would be.  Even if you get Windows testing software that can script mouse movements, LR updates are asynchronous -- how do you script "wait until the screen update is finished"?


----------



## Linwood Ferguson (Aug 12, 2016)

PhilBurton said:


> I think the idea of a poll is a good one.  If you are serious PM me and I'll try to contribute some ideas that might make Adobe pay attention.


I've been watching with interest the "Open letter to Adobe" in their forum, and all the comments.  Despite a few tangents, it was a clear message that a lot of people are unhappy and feel Adobe has lost its way; how many "a lot" is remains nebulous, as the readership of those postings is probably in the third or fourth decimal place of a percent of customers.  But it was enough that Adobe responded briefly, made some commitment to return and interview people and respond and... silence.

I think most people have just given up on Adobe actually caring.  They are a near monopoly, milking a cash cow.  Why change?  Why invest? 

So no, my comment was a descriptive lament not volunteering.  I see no point in trying to talk to a wall.  I spend a bit of time every month looking at alternatives to see when/if someone will cover all the key areas Lightroom does.  So far they are lightyears away, but I have a lot more hope in the competitive arena than in getting Adobe to actually listen.

At least until there is a competitor -- then Adobe will be all ears.  

Anyway... I _*AM*_ interested if someone has an idea for how to quantitatively test performance; I'm willing to write some code, even.  I think it would be very interesting for all of us suffering with performance to find out more objectively what makes a difference in the Develop module.  Preview builds and exports are easy to test, but frankly of minimal concern to me -- the big deal is my hand hovering over a mouse, trying to complete the next step and move on, so it is the slowness of interactive steps in Develop that matters.


----------



## tspear (Aug 12, 2016)

Ferguson,

To test the performance you need a fair amount of rigor, and you need to limit the variables:
1. A defined state of the data, including catalog, previews, and images.
2. Defined hardware, software, and all patch levels.
3. An automated testing tool. An example of a web testing tool is Selenium (Web Browser Automation); you actually need one for Windows and/or Mac, depending on the platform you are testing. There is also one from Microsoft (I have not used it in years): Testing tools | Visual Studio.

Then you time the test a few times, change a single variable, and see what happens.


----------



## tspear (Aug 12, 2016)

Ferguson said:


> So I decided to invest $300 in trying to make Lightroom run faster, and upgraded my quiet old video card to a relatively modern one (there's a new 10-series, but those cards don't have dual DVI ports, so I got the 970 instead of the 1060).  That's about 10 times the CUDA cores and hugely more bandwidth.
> 
> I tried going through a series of steps: enter Develop after resetting settings, zoom to 100%, then move each slider in the Basic panel, at about the speed it can keep up with, from left to right and back to center, one by one, then zoom out and exit.
> 
> ...



Before getting a new card and/or computer, start by using monitoring tools to see where the current computer has a bottleneck that is limiting performance.
You might just be I/O bound on the motherboard, or memory constrained...


----------



## Linwood Ferguson (Aug 12, 2016)

tspear said:


> An automated testing tool. An example of a web testing tool is Selenium (Web Browser Automation); you actually need one for Windows and/or Mac, depending on the platform you are testing. There is also one from Microsoft (I have not used it in years): Testing tools | Visual Studio.
> 
> Then you time the test a few times, change a single variable, and see what happens.



I'm familiar with the technology, and it is precisely in the technology that the problem lies.  Testing tools work best when they are looking to test a result, as in regression testing -- do x, y, and z, and then see if the result comes up as A.  They also work well for testing performance when the thing tested is equally definitive: do x, y, z and time how long it takes for A to appear.

Here's the problem with Lightroom.  Take something like the brush: say a largish brush, painting it on.  Do it by hand and move fast -- the brushed-on area comes up slowly behind you.  Move slowly; it comes up behind you as well.  Compare the two at a detail level, and you'll find the number of updates for the fast-moving hand is lower than for the slow-moving hand.  Move fast, and it actually does less work.  Move slowly enough, and the number of updates keeps pace (to human perception) and it works smoothly.
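You can see why in a toy model: mouse positions get coalesced and the renderer only draws the latest one, so the redraw count scales with how long the stroke takes, not how far it goes.  A sketch (the render cost and hand speeds are made-up numbers, not measured Lightroom behavior):

```python
def updates_for_drag(distance_px, speed_px_per_s, render_time_s=0.05):
    """Toy model of a coalescing renderer: intermediate mouse positions
    are dropped and only the latest is drawn, so the number of redraws
    is roughly the drag duration divided by the per-redraw cost."""
    duration_s = distance_px / speed_px_per_s
    return max(1, round(duration_s / render_time_s))

# Same 500 px stroke, two hand speeds:
fast_flick = updates_for_drag(500, 2000)  # finishes in 0.25 s
slow_drag = updates_for_drag(500, 200)    # takes 2.5 s
```

Same stroke, ten times the hand speed, a tenth the redraws -- which is the "fewer updates for the fast-moving hand" effect.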

Now I could absolutely program a benchmark that painted (say) a big circle brush on an image.  It's just a bunch of mouse events, send them as fast as you can.  And time it until the result became stable (harder, but let's say you can). 

That's not meaningful (IMO), as a human wants to wait for the feedback of the updates -- but (a) how close behind does it have to be, and (b) how do you measure THAT?  You have to notice not "after stable for 2 seconds" but "after update complete, move mouse".  When is it complete?   The problem with Lightroom is that it has a pile of calculations running asynchronously, and updates tend to trickle in with no on-screen indication of "all done".

GPU benchmark tests get around this problem by having the test run itself be synchronous with the timing; they wait for certain things to actually complete.  With LR it's always running, always updating in the background.  To time something, you need to know, clearly and in real time, that it's done.

To test the actual performance of something like this, the program itself needs to be instrumented, with clear events being logged somewhere with metrics so you can time them.  Presumably Adobe has such instrumentation, but they aren't sharing.  If I had a real-time "done" indicator, you could move the mouse a fixed amount, wait for done, and repeat.  We could argue about how big that amount should be, but at least it would be quantifiable.
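The shape of the harness, if that hook existed, is trivial.  A sketch; `nudge_slider` and the `render_done` event are hypothetical stand-ins for instrumentation Lightroom does not expose, with a background timer faking the asynchronous render:

```python
import threading
import time

# Hypothetical stand-in: a real harness would get this event from
# application instrumentation. Here a timer fakes the async render.
render_done = threading.Event()

def nudge_slider():
    """Pretend to send one small slider movement; the 'render' signals
    completion about 10 ms later, as an instrumented app would."""
    render_done.clear()
    threading.Timer(0.01, render_done.set).start()

def timed_drag(steps=20, timeout=1.0):
    """Move a fixed amount, wait for 'done', repeat; time the whole drag."""
    start = time.perf_counter()
    for _ in range(steps):
        nudge_slider()
        if not render_done.wait(timeout):
            raise TimeoutError("render never signalled completion")
    return time.perf_counter() - start

print(f"{timed_drag():.3f}s for 20 synchronised steps")
```

The timeout matters: a hung render should fail the test loudly rather than inflate the timing quietly.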



tspear said:


> Before getting a new card and/or computer, start by using monitoring tools to see where the current computer has a bottleneck that is limiting performance.
> You might just be I/O bound on the motherboard, or memory constrained...



Yes, indeed.  Except...

Figuring out the bottleneck of a steadily running program (say, preview builds) is fairly straightforward, as there is adequate sample time to see how busy the easily measurable components are.  The normal culprits show up as CPU or disk or memory size (faulting), with certain combinations, e.g. memory speed can affect CPU (a lot) or disk (a little, for fast disks).  And for interactive, screen-oriented stuff there are GPU issues (also involving bus and memory speeds on both sides), as well as GPU offloading.  No network in my case.  But you can see this only if the events are steady.

In something like the brush example, each event (e.g. each update) lasts tiny fractions of a second. Editing a photo is made up of dozens or hundreds of small steps made by (mostly) mouse movements, each resulting in a flurry of activity.  Some will be CPU limited, some GPU, some maybe memory constrained, some few maybe disk I/O -- all occurring too fast for any external tools to measure.

Now all that said... if (big if) we could automate the process of editing an image in some fashion that simulates a human (including wait-for-update time), you could run it repeatedly and change a single variable.  At some level I do not care what precise component is the problem, but if experimentally I could come up with which components work best, I'd be a happy camper.  

But where this all fails is a reproducible, human-relevant test scenario.

PS. For those interested in things like preview build speed, I think this is much easier.  Or export speed.  It's just not relevant, particularly, to the human satisfaction during editing, which is what makes me nuts.  So long as my export speed keeps up with my internet to deliver the images, I'm good.


----------



## tspear (Aug 12, 2016)

Ferguson,

Nice write-up. I was not thinking so much of the brush; I tend to see more lag on global adjustments or application of tonal changes in radial/graduated filters. Transforms have the largest lag for me, especially when zoomed in to 1:1.
This was a long time ago, but using the Rational testing tool we had a similar issue determining when the application had completed processing (the UI required the use of a graphic stopwatch followed by a stoplight; do not ask, not my design). The answer was to watch for a change in the graphic and then wait for a stable image for X seconds; as long as the display window had changes, the process was still running. The same method could be used today to handle the majority of the tests.
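The stable-image trick is easy to sketch in code.  Something like this, with `grab` standing in for whatever screen capture the test tool provides (the demo simulates frames; nothing here is Lightroom-specific):

```python
import hashlib
import time

def frame_hash(frame_bytes):
    """Cheap fingerprint of one captured frame (a screen grab in real use)."""
    return hashlib.md5(frame_bytes).hexdigest()

def wait_until_stable(grab, stable_for=0.2, timeout=5.0, poll=0.01):
    """Watch for changes, then declare done once no frame has changed
    for `stable_for` seconds. `grab` is any zero-argument callable
    returning the current frame as bytes; returns total elapsed time."""
    start = time.monotonic()
    last = frame_hash(grab())
    last_change = start
    while time.monotonic() - start < timeout:
        current = frame_hash(grab())
        now = time.monotonic()
        if current != last:                    # still updating
            last, last_change = current, now
        elif now - last_change >= stable_for:  # quiet long enough: done
            return now - start
        time.sleep(poll)
    raise TimeoutError("display never settled")

# Demo with a simulated capture: three busy frames, then a settled one.
frames = iter([b"busy1", b"busy2", b"busy3"] + [b"settled"] * 500)
t = wait_until_stable(lambda: next(frames), stable_for=0.05)
print(f"settled after {t:.2f}s")
```

Picking `stable_for` is the judgment call: too short and a pause in processing looks like completion, too long and every measurement gets padded by that amount.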

Lastly, the ultimate subjective test is you use one of these tools and have a human watch it and do the comparison after. 

Good luck


----------



## Linwood Ferguson (Aug 12, 2016)

tspear said:


> Ferguson,
> 
> Nice write-up. I was not thinking so much of the brush; I tend to see more lag on global adjustments or application of tonal changes in radial/graduated filters. Transforms have the largest lag for me, especially when zoomed in to 1:1.
> This was a long time ago, but using the Rational testing tool we had a similar issue determining when the application had completed processing (the UI required the use of a graphic stopwatch followed by a stoplight; do not ask, not my design). The answer was to watch for a change in the graphic and then wait for a stable image for X seconds; as long as the display window had changes, the process was still running. The same method could be used today to handle the majority of the tests.
> ...



Yes, though the issue is that the performance is not independent of such wait times.  Consider any moving tool.  As you move, it gets more mouse events and so accumulates a queue of calculations to do.  It also, separately but not completely independently, decides when it can update the display.  If you delay each mouse event until an update completes, you are going to force a lot more updates, as some (probably a lot) normally get skipped while a human is moving the mouse -- which we see as delays.

My guess is the best you can do is pick some rate of movement that is "fast human" and just move at that rate, regardless of updates as you go, then wait for final stability after the last move.  That's not what we want (we want the feedback in apparent real time), but maybe it would make testing viable.  I no longer have those tools (and I am not sure any are free either); I used to, when I led a development group.  Now I just do stuff for fun, so spending the price of a good PC on a testing tool is not on the agenda.


----------



## tspear (Aug 12, 2016)

I still lead development teams, but all server-side and web-based, so no tools either. I will ask a few testers I know if they know of any.
I actually have not developed a "thick" client application since the mid-90s; I have overseen a few small projects here and there but was not really active on them. Hard to believe it has been twenty years.


----------



## Linwood Ferguson (Aug 12, 2016)

I'm more than game to do some experimenting, help try to build up a database of controlled tests if we can come up with a way.  So yes, please.


----------



## PhilBurton (Aug 13, 2016)

Ferguson said:


> I'm more than game to do some experimenting, help try to build up a database of controlled tests if we can come up with a way.  So yes, please.


And I would be happy to install test tools and run tests.  I'm on LR 6 perpetual, with a GTX 660 graphics card.


----------



## Linwood Ferguson (Aug 15, 2016)

A brief update: I've done two "real" shoots now, coming back with a lot of photos and needing to edit and publish to the web quickly.

I don't think the new card helps at all.  In fact, in the last set, I really was coming to the conclusion it may be worse.  But definitely not better.

All still subjective, but at least this time I was less focused on some arbitrary test and more on getting through the photos, and it just did not speed that up.

I'm now thinking new computer; that's all that's left to try, other than adjusting my expectations, and those are much harder to adjust.


----------



## tspear (Aug 15, 2016)

Here is the tool suggested by three testers. All are a mix of tester/coder, so I'm not sure this is the best solution:
Home - AutoIt


----------



## Linwood Ferguson (Aug 24, 2016)

AutoIt: I looked at it briefly without trying it (I'm in the middle of a computer rebuild from h... purgatory and don't want a distraction).

Did you try it? 

In looking at the docs, I did not see a lot there in terms of how one would "wait for up to X seconds and determine if a portion of the screen has changed".
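From a second skim of the docs, AutoIt does appear to have a PixelChecksum() function that fingerprints a rectangular screen region, which would be the building block.  The loop around it is simple in any tool; here is a sketch of the idea, with `checksum` standing in for whatever region fingerprint is available (the demo just simulates one):

```python
import time

def wait_for_change(checksum, timeout=10.0, poll=0.05):
    """Wait up to `timeout` seconds for a watched region's fingerprint
    to change. `checksum` is a zero-argument callable (AutoIt's
    PixelChecksum would play this role there). Returns the seconds
    waited, or None if nothing changed before the timeout."""
    baseline = checksum()
    start = time.monotonic()
    while time.monotonic() - start < timeout:
        if checksum() != baseline:
            return time.monotonic() - start
        time.sleep(poll)
    return None

# Demo: the simulated 'region' changes on the fourth sample.
values = iter([111, 111, 111, 222, 222])
waited = wait_for_change(lambda: next(values), timeout=1.0, poll=0.01)
```

Pair this with a wait-for-stable-image loop and you would have both halves: detect that an update started, then detect that it finished.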


----------



## tspear (Aug 24, 2016)

Nope, no time right now. I have not touched Lr since last week; I am in the middle of a few large home projects and a new software release at work (while trying to get my kids ready for the school year...)


----------

