# New updates October 2020



## PCee (Oct 22, 2020)

I have been happily paying for the Photography subscription for a few years. For me it was a dream come true to be able to use the full versions of Photoshop and Lightroom at an affordable cost. However, both my iMac and MacBook Pro are no longer upgradeable from High Sierra, despite functioning quite well at the moment. If I had £2000 to spend on a new Mac, it would preferably be spent on a new camera rather than on upgrading my ability to change my sky or colour grading. I understand that the limitations are probably due to the graphics cards, so sadly Adobe and I will eventually have to part company.
My question is: would the new Camera Raw update be installable on my machines, enabling the Lightroom and Photoshop I have to be used with a new camera?
Many thanks, and apologies if this question seems stupid, but I would hate to try it and maybe wreck what I have in the process.


----------



## Linwood Ferguson (Oct 22, 2020)

Short answer: No.

Slightly longer answer is that the DNG Converter, a standalone program, is generally used for this -- feed it your new camera's raw files, and it outputs DNGs that are backward compatible with older Photoshop/Lightroom.
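For anyone who wants to script that step, the DNG Converter also has a command-line mode; here is a rough sketch, where the install path, source folder, and `.CR3` extension are assumptions you would adapt to your own setup:

```shell
#!/bin/sh
# Sketch: batch-convert new-camera raws to backward-compatible DNGs using
# Adobe DNG Converter's command-line flags. The app path, source folder,
# and .CR3 extension below are assumptions -- adjust them for your setup.
CONVERTER="/Applications/Adobe DNG Converter.app/Contents/MacOS/Adobe DNG Converter"
SRC="$HOME/Pictures/import"
DEST="$HOME/Pictures/dng"

if [ -x "$CONVERTER" ]; then
  mkdir -p "$DEST"
  for f in "$SRC"/*.CR3; do
    [ -e "$f" ] || continue          # skip when the glob matches nothing
    # -c     = lossless compressed DNG
    # -cr7.1 = limit features for Camera Raw 7.1-and-later compatibility
    # -d     = output directory
    "$CONVERTER" -c -cr7.1 -d "$DEST" "$f"
  done
else
  echo "DNG Converter not found at: $CONVERTER"
fi
```

An older Lightroom or Photoshop can then import the DNGs from the destination folder as if they came from a supported camera.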

It's not possible to use the Camera Raw update itself with Lightroom; while it is the same ACR engine, it is actually built inside LR, so you cannot mix and match at all. I know less about Photoshop and whether the new ACR itself could somehow act as a front end for old Photoshop. I THINK that might work if the new ACR would run at all; not sure it would. You might have to use Bridge to produce a TIFF to get it into Photoshop, or some such.

Now even if you use the DNG Converter, a question I have never heard answered is whether, with the subscription program, Adobe ever forces an update. Anyone know? Or can you stay on 9.4 forever (assuming you keep paying -- you need to keep paying to keep editing)?


----------



## Jim Wilde (Oct 22, 2020)

Ferguson said:


> Now even if you use the DNG converter I guess a question I have never heard answered is whether, with the subscription program, Adobe ever forces an update.  Anyone know?   Or can you stay on 9.4 forever (assuming you keep paying -- you need to keep paying to keep editing).


I don't see why not... there are subscribers out there still running 7.x or 8.x, because their OS no longer supports the current version. I'm pretty sure we've also encountered a couple of subscribers still running CC2015.14, and come to think of it I've still got that installed (it appears in my Creative Cloud app list, and still runs OK). Basically, if the OS is no longer supported for the current version, the update is simply not offered and the user can just carry on regardless, using their currently installed version. I don't know if it will work that way forever, but at the minute it's doable.


----------



## Paul_DS256 (Oct 22, 2020)

PCee said:


> However, both my iMac and MacBook Pro are no longer upgradeable from High Sierra, despite functioning quite well at the moment. If I had £2000 to spend on a new Mac, it would preferably be spent on a new camera rather than on upgrading my ability to change my sky or colour grading. I understand that the limitations are probably due to the graphics cards, so sadly Adobe and I will eventually have to part company.


I feel for your problem, PCee. I've noticed that there seems to be a lack of backward hardware compatibility in general for both Mac and Windows. It's as if the component/software manufacturers decide to start using new OS features that aren't available past a certain older release, without providing backward support. What this does is drive new hardware sales.

Like you, I have older Mac laptop and Windows desktop hardware. I've already hit the limit of the 2009 Mac, but it still runs LR6, which is good enough for travelling, which is what I use it for. I just spent several months resolving a problem with an upgrade to Windows 10 on my 2014 Dell that was only solved by purchasing a new disk drive, even though the original reported no errors. Windows didn't like it.

I hate throwing out old tech that works just because of an unsupported compatibility issue.

When I hit what you hit I will be examining Adobe alternatives that may have a longer life.

Sorry for the long soapbox.


----------



## Linwood Ferguson (Oct 22, 2020)

Jim Wilde said:


> I don't see why not....there are subscribers out there still running 7.x or 8.x, because their OS no longer supports the current version. I'm pretty sure we've also encountered a couple of subscribers still running CC2015.14, and come to think of it I've still got that installed (and it appears in my Creative Cloud app list, and still runs OK). Basically, if the OS is no longer supported for the current version, the update is simply not offered and the user can just carry on regardless, using their currently installed version. I don't know if it will work that way forever, but at the minute it's doable.


Well, I said I wasn't sure. Microsoft's aggressive moves to force people off older Windows versions tend to make one paranoid. I think vendors fear being held liable, whether legally or by the social media judges, for security failures in old code.


----------



## clee01l (Oct 22, 2020)

Paul_DS256 said:


> When I hit what you hit I will be examining Adobe alternatives that may have a longer life.


The problem is not limited to Adobe. As technology evolves, apps are updated to take advantage of the newer technology. By doing so, the obsolete technology becomes incompatible. I believe I have seen it stated recently that Adobe will support three OS versions back. OS manufacturers also limit the number of OS versions to be supported.
Alternatives to Adobe will also be faced with this same dilemma. Legacy operating systems cannot be supported in perpetuity by any software developer.


----------



## Zenon (Oct 22, 2020)

If you get a new camera that was released after August of 2020, version 9.4 won't be able to open its files. August was the last update for version 9, and it included the Canon R5 and R6; there were probably more. As stated, if using an older version you have to convert to DNG, which I didn't like. As Cletus says, hardware eventually just gets too old. I had the same issue and had to replace my 8-year-old iMac last year. 8 years was a good run.


----------



## PCee (Oct 23, 2020)

Thank you all for the replies. My Adobe contract runs until June 2021, so I guess this gives me plenty of time to try some free trials of the alternatives; at the moment they still run on High Sierra. By June 2021, Apple, and probably everybody else, will have made so many other applications redundant that I may be forced to either bite the bullet and buy a new Mac, or revert to my old copy of Lightroom 6 and PS essentials. I guess that's the price we pay for continually wanting things to get better and better.


----------



## Paul_DS256 (Oct 23, 2020)

clee01l said:


> obsolete technology becomes incompatible.


Agreed, to a point, Cletus. However, there is no one measuring when something becomes obsolete except the vendors of those products, and they benefit from forcing consumers to purchase new products. I can understand the uptake of new features, but I don't understand why that needs to mean something that was working stops working, even though the features it uses are still there. I also understand that backward compatibility cannot be maintained forever, but this is not reflected with any sort of granularity in software products. The vendors are making it all or nothing, when some new features could simply be made not to work on older platforms.

At the end of the day, I hate to throw out anything that still works. Is it just me, or is the window before something becomes obsolete shrinking, making us upgrade sooner?


----------



## clee01l (Oct 23, 2020)

Paul_DS256 said:


> However there is no one measuring when something becomes obsolete except the vendors of those products.


Not really true. A vendor develops a product, and it does the job intended. However, customers expect new features, and if there is no reason to upgrade, customers won't pay more money and the company goes out of business. Competition offers the same product but with new features. The vendor needs to match and improve on the competition's product to retain customers. Faster and better means taking advantage of new operating systems. Remember, we have gone from 8-bit architecture to 16, 32 and now 64 in the space of 50 years. Customers demand ever-increasing performance. The operating system manufacturers and the vendors comply. The world is never static.


----------



## Paul_DS256 (Oct 23, 2020)

clee01l said:


> ... if there is no reason to upgrade, customers won't pay more money and the company goes out of business.


Agreed, but all you are doing then is talking about a different business model. Look at the shift from perpetual to subscription based software.


clee01l said:


> Remember, we have gone from 8-bit architecture to 16, 32 and now 64 in the space of 50 years


Oh yes, I've lived through that, including the CP/M vs DOS wars. I was working for DEC at the time and their PCs could run both. For the longest time after 64-bit came out, machines would still run 32-bit apps. Apple now only supports 64-bit, but for the longest time it ran 32-bit too. It's that compatibility window I feel is shrinking.


----------



## Linwood Ferguson (Oct 23, 2020)

Paul_DS256 said:


> Agreed to a point Cletus. However there is no one measuring when something becomes obsolete except the vendors of those products. They benefit from forcing consumers to purchase new products. I can understand the up take of new features but don't understand why that needs to mean something, that was working stops working, though the features it uses are still there. I also understand that backward compatibility cannot be maintained forever but this is not reflected with any sort of granularity in software products. The vendors are making it all or nothing where some new features could be made not to work on older platforms.
> 
> At the end of the day, I hate to throw out anything that still works. Is it me or is the window where something becomes obsolete becoming smaller making us upgrade sooner?


My real life, as opposed to hobby, is technology, and it is a sad fact of life that computers and all their related "stuff" become rapidly obsolete.

While it is easy to blame vendors and a vast conspiracy aimed at forcing us to buy new repeatedly, vendors tend to be caught up in this also (sometimes without complaint, but still caught up). A software vendor already has to have quite an array of hardware and OS versions to test their products against, including each new version; if they did not decline to support old versions, it would become ruinously expensive. Hardware vendors need to maintain repair parts as designs improve, with similar cost and inventory issues; they consider it better to label a computer obsolete than to either be unable to fix it or charge outrageous prices to maintain parts forever. All of them then have legal staff whispering in their ear that if they do "allow" customers to run old software, they may be liable for security breaches (with social media shouting in their other ear when a breach occurs -- a huge percentage of the time from bugs fixed years earlier in current software or hardware).

It is not so much a conspiracy as a shared nightmare of co-dependency, largely driven by the availability of newer/faster/cheaper technology.

You can fight it or complain -- but you cannot change it.


----------



## Philippe Coudé du Foresto (Oct 23, 2020)

That's why Microsoft, when they stop public support for a product, can still support it for companies... but at the companies' expense.

Also, no one forces you to upgrade. You can always stay with the hardware and software that works without upgrading (as long as the hardware doesn't fail). You won't benefit from the new functionality, of course, but if you don't need it...


----------



## Paul_DS256 (Oct 23, 2020)

Ferguson said:


> If they did not decline to support old versions it would become ruinously expensive.


Oh, I know, Ferguson. I worked for a number of software vendors. One PM once talked about the 'certification matrix of death'. Customers would come with all sorts of combinations/versions of hardware and software, asking for support on our product. It can be a challenge.

However, that doesn't preclude still maintaining version X for OS/hardware version Y while a newer version Z supports the latest. Yes, users would miss out on some features, but at least they would still be supported. This is likely what Cletus was talking about in terms of back versions of the OS being supported.



Ferguson said:


> Hardware vendors need to maintain repair parts as designs improve, with similar cost and inventory issues; they consider it better to label a computer obsolete than to either be unable to fix it, or charge outrageous prices to maintain parts forever.


This is more of a commercial than a consumer issue, but agreed. When I was working for DEC, and they built their computer systems into things like nuclear reactors, they had to sign that parts and repairs would be covered for X years. Likely there was more of an issue with repair skills than parts.

For consumer products, yes, I'd say when a part fails it is likely time to upgrade, since it is likely the start of other failures. Sort of like when my wife's car, which had over 300K kms on it, needed a new strut: I took that as the sign to replace it after 20 years, even though it was still working well. They gave me a $300 trade-in. At least they didn't charge me to take it off my hands.

My point is, the length of time something is supported, hardware or software, seems to be getting shorter -- or maybe it's just my advancing years and perception of time.


----------



## Paul_DS256 (Oct 23, 2020)

Philippe Coudé du Foresto said:


> You can always stay with the hardware and software that works without upgrading (as long as the HW doesn't fail),


Well, Philippe, I don't think this is always the case anymore.

I upgraded my 2014 Dell 8700 from W7 to W10 successfully, but then ran into problems. Through a complicated process I got W10 reinstalled, but with an interesting quirk: several times through the day, the main disk activity would jump to 100% but show no I/O. Others have had this problem. Multiple calls to MS support did not solve it. I ran the Seagate diagnostics on the disk, which showed it was OK. There were no errors in the Event Log, except for a period where 'disk resets' were being issued during the two-minute 'brownout' periods. The system would continue normally after that.

Finally, I purchased a new disk, cloned the original to it, and haven't experienced the brownouts since.

So, it's not just hardware failures today, but the combination of HW and SW in some cases.

P.S. And I am replying to this from my 2014 Dell desktop. Like I said, I hate throwing things out that can be repaired.


----------



## Linwood Ferguson (Oct 23, 2020)

Paul_DS256 said:


> My point is, the length of time something is supported, hardware or software, seems to getting shorter or maybe it's just my advancing years and perception of time.


I thought it was just me. I went to update the little mini running my player piano and found its Linux was so old that I could not even upgrade -- Ubuntu had disabled the upgrade process -- so I had to rebuild. When stuff just sits and works, we tend to ignore it, and as I get older my "ignore" skills have improved a lot.


----------



## Zenon (Oct 23, 2020)

Zenon said:


> If you get a new camera that was released after August of 2020, version 9.4 won't be able to open its files. August was the last update for version 9, and it included the Canon R5 and R6; there were probably more. As stated, if using an older version you have to convert to DNG, which I didn't like. As Cletus says, hardware eventually just gets too old. I had the same issue and had to replace my 8-year-old iMac last year. 8 years was a good run.



I just wanted to point out that you need to watch competitors as well. DxO and C1 Pro (to mention a few) are on an annual upgrade cycle. Each October/November a new version is released and they stop supporting the previous one; even if you bought it in August/September of that year (depending on the company), it won't be supported. Adobe (and C1 Pro) released raw support for the new Canon R5 and R6 bodies within 3 weeks of their release. DxO PL3 held out until they recently released PL4, and now you have to pay if you want that support. So that's almost 2 months of not being able to use a new camera with software you paid for.


----------



## Linwood Ferguson (Oct 23, 2020)

Zenon said:


> I just wanted to point out that you need to watch competitors as well. DxO and C1 Pro (to mention a few) are on an annual upgrade cycle. Each October/November a new version is released and they stop supporting the previous one; even if you bought it in August/September of that year (depending on the company), it won't be supported. Adobe (and C1 Pro) released raw support for the new Canon R5 and R6 bodies within 3 weeks of their release. DxO PL3 held out until they recently released PL4, and now you have to pay if you want that support. So that's almost 2 months of not being able to use a new camera with software you paid for.


But see, there's the rub... those who want to just sit still and not change anything need to... well, not change anything.

Decide you won't change Lightroom or your Mac or PC, but then buy a new camera -- you broke your part of that plan.

(Note I'm not suggesting that DXO's delay was a good thing; I'm commenting on the original issue).


----------



## Conrad Chavez (Oct 24, 2020)

PCee said:


> I understand that the limitations are probably due to the graphics cards


Not necessarily just that. Many companies are declaring shorter support cycles for multiple reasons.

The depth of legacy support is one already mentioned. The more years to support, the more the total surface area of possible application + OS + hardware combinations to verify against so that you can say you support that. It wasn't so bad the first few years of personal computing, but now that many major tech companies have a long tail of 10 or more years of products, that total surface area can become overwhelming.

The pace of change continues to be fast for reasons we did not anticipate several years ago, including:

- *Mobile devices.* These have caused Apple, Microsoft, Google, Adobe, etc. to rewrite entire product systems to let mobile devices participate on a more or less equal footing with desktops, based on people spending more time on mobile devices than desktop devices.
- *Cloud integration.* This is driven by everything from needing seamless mobile/desktop connections, to needing seamless ways to work from home with everyone else working at home.
- *Machine learning/AI.* This drives up system requirements like GPU advancements do, leaving older machines behind faster.
- *GPU advancements*, as we know. GPUs have advanced more than CPUs lately, so more attention is being paid to using the GPU effectively.
- *Security.* This is a big one. Some of the changes that have annoyed people the most in macOS and Windows are inconveniences driven by Apple and Microsoft trying to find the right balance between convenience and security, against hackers who can and do attack remotely. The effort needed to keep up with security updates is part of why Apple has limited how far back they’re willing to provide security updates.

Of course, some users who aren’t interested in things like mobile, cloud, security, etc. feel like they’re being forced to go along with basic changes they didn't ask for that are driven by those factors.

For hardware, Apple maintains support for seven years before ceasing hardware parts and service availability. But for macOS, it’s much tighter. Apple typically provides software and security updates for the current version plus two versions back. Right now that means macOS 10.15, 10.14, and 10.13 are getting updates. But when macOS 11 Big Sur comes out soon, everything is going to advance one version so that macOS 11, 10.15, and 10.14 will get updates, and 10.13 will become unsupported for updates.

Adobe announced a similar policy where they currently support the current version and one back. As of last week’s release of Creative Cloud 2021 applications, Creative Cloud 2019 applications have disappeared from the CC desktop app installer, because support is now only for the 2021 and 2020 applications. At least Apple still lets you install those older OSs.

This is really tough if you have equipment more than a few years old. Sure, one issue is that you might not be able to install the newest upgrade, just keep using the older one. But the much bigger issue is that one day upgrades come out and the last supported OS or application version for your hardware rolls off the Supported list and can no longer get updates, or maybe its cloud-only installer disappears.

Put this all together and it’s a picture of it being increasingly difficult to run a computer the same way for 5 to 8 years like we used to. The shortening of support cycles means that if you want to run a computer with pro-level software and stay reasonably current, it’s got to be in your budget to replace everything more often. That is a little easier for a business that can expense the cost against revenues, but a challenge for everyone else.


----------



## PhilBurton (Oct 24, 2020)

Conrad Chavez said:


> *GPU advancements*, as we know. GPUs have advanced more than CPUs lately, so more attention is being paid to using the GPU effectively.
> Of course, some users who aren’t interested in things like mobile, cloud, security, etc. feel like they’re being forced to go along with basic changes they didn't ask for that are driven by those factors.
> 
> 
> ...


For laptops, the graphics are always built onto the motherboard, if not into the CPU itself. However, for desktop systems the GPU card should be an add-on, and therefore a reasonable year-on-year cost to upgrade.

Phil Burton


----------



## Gnits (Oct 24, 2020)

Real world experience.

I bought a high-end Dell laptop, with max memory, an M.2 drive, and the extra graphics card option with the maximum graphics memory of 4GB.
I noticed lots of issues relating to the graphics card and had the most depressing 3-month period of interaction with Dell technical support.
I became an expert on graphics cards in laptops and acquired various utilities to establish which card was in use (motherboard or dedicated GPU) by different applications.
I had to turn off the graphics card option in Microsoft Office because it was the only way to resolve erratic behavior in Office apps.

The bottom line is that the graphics card was there, and was configured as per the OS... but was never used. I had Dell tech support remote into the machine.

The answer from Dell. 

"I mentioned we have raised this to our product engineers and made investigation and testing, their final response to us is that it's working as designed and under normal settings it will use the intel video to save power and jump to nvidia for high end tasks that require more gpu power. The programs like MS Office, Photoshop and Lightroom are considered low end application and so it will use always the Intel Graphics. "

So Dell regards Lightroom and Photoshop as low-end graphics apps and does not use the dedicated GPU for them.

My conclusion..... Dell marketing of Graphics cards in their laptops (or this model at least) is a total con job.

So... buyer beware when purchasing a GPU option in a laptop. The ability to select and install a graphics card of choice in a Windows desktop provides the best option for Windows users, as the graphics card's ports will be external and you can decide which ports to use for your display devices. I cannot comment on graphics cards in Mac machines. [This is a Dell-specific experience; I am not commenting on other brands.]

I am finally about to purchase a custom-built PC, with the current fastest Ryzen CPU, 64/132GB of fast memory, and 3 of the latest M.2 NVMe drives for the OS/apps and the Lightroom catalog/Adobe cache folders. Images will be on spinning disks. It will have a medium-spec Nvidia graphics card until upgrading to a high-end graphics card makes sense for my Lr and Ps use.


----------



## Linwood Ferguson (Oct 24, 2020)

Gnits said:


> So Dell regards Lightroom and Photoshop as low end graphics apps and do not use them for these apps.
> 
> My conclusion..... Dell marketing of Graphics cards in their laptops (or this model at least) is a total con job.


Dell's support organization has some smart people, but they are isolated and insulated by layer after layer of inexperienced, poorly trained staff buried in offshore call centers, whose primary mission is low wage cost, with no real access to people who actually know anything.

I would not draw any grand conclusions from pronouncements by them.


----------



## Gnits (Oct 24, 2020)

My main issue is that I paid a premium to have an Nvidia GPU option as well as the standard GPU on the motherboard. I opted for this because the screen was 4K AdobeRGB and I wanted to take advantage of the GPU processing in Lightroom and Photoshop. The Nvidia GPU was designed NOT to work for my Adobe Lightroom and Photoshop apps. Also, Office apps hang when I click the Office parameter to use the GPU. Dell never told me which apps would actually cause the Nvidia GPU to kick in.

The saga is filled with unbelievably poor responses to my well-documented support requests. After several months and several escalations, I eventually dealt with an individual who had the ability and authority to deal with the issue. The final solution is that they have agreed to give me a full refund. That is not the result I wanted, as I had spent considerable effort setting up the laptop and re-architecting my data workflows.

The reason I outlined this issue is to warn people that the extra non-motherboard GPU options in laptops may not provide the GPU processing they think they are buying.


----------



## Linwood Ferguson (Oct 24, 2020)

Is the issue that you want to just tell it to use the Nvidia? Or are you saying the Nvidia won't work?

If I recall, Windows 10 lets you assign GPUs to specific applications.
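For reference, that per-app preference lives under a registry key, so it can also be set from an elevated prompt; a sketch, where the Lightroom Classic path is an assumed install location:

```shell
# Windows per-app GPU preference (same switch as Settings > System > Display >
# Graphics settings). "GpuPreference=2;" means high performance (the discrete GPU);
# the Lightroom Classic path below is an assumption -- point it at your own .exe.
reg add "HKCU\Software\Microsoft\DirectX\UserGpuPreferences" /v "C:\Program Files\Adobe\Adobe Lightroom Classic\Lightroom.exe" /t REG_SZ /d "GpuPreference=2;" /f
```

Restart the app for it to take effect; a vendor-customised driver may still override the choice.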


----------



## Gnits (Oct 24, 2020)

Sorry for not making the problem clear. The installed extra (non-motherboard) graphics card (which happens to be an Nvidia) would only work with what Dell regarded as 'graphics intensive applications'. Dell would not tell me what counted as a graphics intensive application, but they told me specifically that Lightroom and Photoshop were not regarded as such. The Nvidia card in my new Dell laptop would not work with these apps. This is a Dell issue, not an Nvidia issue.

*If I recall Windows 10 lets you assign GPU's to specific applications.*
Yes... your memory is very good. I tried to use the Windows config options to assign Lr and Ps to the graphics card, but it did not work (and Dell told me it would not work). I expect it would work on a basic Windows desktop where the graphics card is installed in a normal slot... but the Dell implementation in their laptop is obviously proprietary (I am being kind). I also used a whole bunch of utilities to test the graphics card, etc., and found that its performance was on par with equivalents installed in other machines. These utilities obviously worked at lower levels in the hardware stack.

So all the apps on my laptop are working and are using the GPU on the motherboard (with the exception of the Office apps, which crash from time to time). The expensive Nvidia GPU add-on option is taking up space inside my laptop and doing nothing.


----------




## Linwood Ferguson (Oct 25, 2020)

Ouch. I actually have an XPS15 that has an Nvidia card. I'm in the middle of stuff now, but will try it when I get a bit of time.


----------



## Paul_DS256 (Oct 25, 2020)

Gnits said:


> The reason I outlined this issue is to warn people that non motherboard extra gpu options in laptops may not provide the gpu processing they think they are buying.


Thanks for the insight Gnits. I had been toying with desktop vs laptop for my next major upgrade. Will definitely look for more proof points in the future.


----------



## Gnits (Oct 25, 2020)

Here are some model numbers. [attached screenshots not preserved]

The card is an Nvidia GTX 1650, which was a purchase option.


----------



## Gnits (Oct 25, 2020)

There are also other variables to consider... namely, if using an external monitor, which port is in use (HDMI or Thunderbolt).

It is past midnight here and the clocks change tonight as well, so I will look at some of the details I submitted to Dell tomorrow.


----------



## Linwood Ferguson (Oct 25, 2020)

So I have an XPS15 with an Nvidia GTX 1050 and an Intel HD 630 (this laptop dates from about 2017).

I don't have any Adobe products on it, but I have an astronomy planetarium program that uses the GPU. I set it up as the Nvidia and then the Intel, one after the other, and ran the program, and it switched as directed.

Are you running a Dell version of Windows, i.e. from their media with all the bloatware in it? I always do a clean install from Microsoft, then install only the Dell drivers I need. So besides our versions being slightly different, I may not be running some Dell crapware you may have?

What happens when you use that option to assign a GPU -- will it not assign, or does it simply not shift to that GPU?

I wonder what happens if you disable the Intel GPU entirely in Device Manager? (That's not a recommendation -- I have no idea whether it would then work at all.)

PS. If you really think this is Adobe-specific, I could put Photoshop on it?


----------



## Gnits (Oct 25, 2020)

Thanks for the info. I did not even think of doing a clean install of Windows when I purchased it, but I had horrendous difficulties with everything at the start.

I am fairly sure it is not an Adobe issue. Dell told me that Ps and Lr would NOT trigger the Nvidia GPU into action. I tried to assign the GPU to the screen using the Nvidia utility, but nothing happens. I also tried to allocate apps to the card; I cannot remember the exact message now, but it did not work.

I am not spending any more time on this. It is going back to Dell. Also, Office apps still hang and my cursor disappears from time to time, so it is not fit for purpose. Dell’s reputation is in shreds as far as I am concerned.


----------



## Linwood Ferguson (Oct 25, 2020)

Just in the FWIW department, I've been buying Dell laptops (for myself and companies I managed) for decades. Their hardware tends to be good; their software is awful. Wiping and doing a clean install from a Microsoft-provided kit (they are free to download and put on a thumb drive -- the Microsoft Media Creation Tool does it) is what I always do. Dell used to complain it was not an official Dell install if you asked for support, but I have not heard them say that in years. Of course, I never call them for software support, so take that as a caveat.

This presumes of course you are comfortable doing such an install, which not everyone is. 

This is not a Dell-specific problem ... every laptop vendor fills the factory image with their own bloatware -- trial copies, "easier" setup tools of their own authorship in place of the manufacturer's, branded replacements just so the logos show up, spyware (Lenovo is well known for it, if I recall), adware, tracking software... it's a mess you cannot escape by changing vendors.   But you generally can escape it by immediately erasing the image and starting clean.  Well, as much as a "clean" Microsoft install is free of bloatware, but at least it's only theirs.


----------



## Gnits (Oct 25, 2020)

I deployed massive numbers of Dell laptops and desktops in several large corporate environments.  We used our own image for each project, so it worked well.  I also asked Dell to tender for an enterprise datacentre solution, which was a total shambles.  So, mixed experiences with Dell.

I will think about doing a clean install before sending it back. I am comfortable in that world, but I do not want to risk losing my warranty return or wasting too much time on this. There is no pressure to have a laptop in the current climate; I will not be doing a lot of travelling for a while.

Building a high-spec desktop rig will be my next priority, delayed by six months because I got waylaid by the whole Dell laptop experience.  My MacBook Air died, I needed a laptop, and I was keen to see whether a laptop could replace my desktop.

Thanks for all your feedback.


----------



## Paul_DS256 (Oct 25, 2020)

The above comments by Gnits on limitations imposed by Dell make my original quote from a Product Manager about the "certification matrix of death" seem trivial now. How is a consumer supposed to know whether they will get the most out of their software products even when the hardware specs match?


----------



## clee01l (Oct 25, 2020)

Gnits said:


> I am not spending anymore time on this. It is going back to Dell. Also, Office apps still hang and my cursor disappears from time to time so it is not fit for purpose. Dell’s reputation in shreds as far as I am concerned.


I think Linwood is on target with his suggestion of a clean, bare Windows install.  I have never been in favor of the bloatware that most manufacturers ship with their machines; removing it was always my first approach to a new off-the-shelf Windows machine.   Dell is a local company here in Texas.  Before I retired, I saw a lot of their machines from the inside over the years, since they can almost always outbid everyone else on corporate purchases.   Everything inside a Dell is off-the-shelf components that you could buy anywhere (I don't know about today's motherboards, as I have not looked at Dell in 12 years).   The nice thing about Dell is that all of the hardware components have been matched and tested to run together well.   The problem with Windows in general is that there are so many permutations and combinations of hardware and drivers that it is very easy to assemble a combination of components that have conflicts.   This is something to consider before shipping that Dell back and starting over with a different combination of components.


----------



## Linwood Ferguson (Oct 25, 2020)

Gnits said:


> I will think about doing a clean install before sending it back. I am comfortable in that world, but do not want risk losing my warranty return or wasting too much time on this.


I always figured at worst I can get Dell's recovery image and put it back on the system if they give me grief over warranty service, or if I need to return.  

But "wasting too much time" ... it's a computer.  Their entire design goal is to waste your time.  You may THINK they are there to save us time, and for things like computing the at-bat statistics of each red-headed, left-handed player on a rainy Tuesday in baseball, they do.  But for most people they save time about as well as they delivered the "paperless office".


----------



## Paul_DS256 (Oct 25, 2020)

clee01l said:


> I have never been in favor of the bloatware that most Mfgs. ship with their machines. Removing it was always my first approach to a new off the shelf Windows machine.


That's funny, because that's the way I feel about Windows 10 in general: lots of bloatware regardless of manufacturer, native MS included. I use Windows10Debloater on all the family machines I take care of.


----------



## dpirazzi (Oct 26, 2020)

Sorry to hear this. I've just upgraded from a Dell XPS 15 9550 (purchased in 2015). Except for the first six months of constant driver problems (Dell Hell), it's been a great machine and has consistently run LR on the Nvidia GPU without fail. The XPS 17 that replaced it also runs LRC and PS on its RTX 2060 GPU without problems. The performance difference is significant.


----------



## dpirazzi (Oct 26, 2020)

Just adding that my XPS 15 did get a clean Windows install right out of the box; I'm not sure if that influenced LR running on the GPU. I'm probably not going to do the same with the XPS 17. It is running well after uninstalling the Dell/McAfee crap.


----------



## rjwilner (Oct 26, 2020)

Paul_DS256 said:


> Thanks for the insight Gnits. I had been toying with desktop vs laptop for my next major upgrade. Will definitely look for more proof points in the future.


I'm in that same boat: a Dell XPS desktop that's getting rather long in the tooth but continues to run LR/PS 'adequately'. No speed demon by any stretch, but it gets the job done without me feeling that any workflow slowdowns are machine-related rather than the nut behind the wheel.

So, I've also started eyeing current offerings to get a bead on which direction I might head when this thing cries uncle. Historically, I've been staunchly set on having a desktop PC with as fast a processor, as much RAM, and the best GPU I can afford for a photo-editing machine. In general, that has meant my eventual purchases were never 'bleeding edge', but a step or two below.

But I'm wondering if I need to stick to the 'desktop or bust' mentality at this point. I do have a small, but much newer laptop that I use mainly to have some editing capabilities when traveling. And it seems to handle LR/PS fine.

Lately, I've begun to wonder if even the concept of a 'desktop' PC (tower or otherwise) has a rather limited future. That idea was reinforced last week when I had occasion to visit Best Buy for other reasons, but I can't help making a pass through the computer area whenever I'm there, and I was a bit surprised at how slim their 'desktop' PC offerings were. They had what appeared to be a nice gaming PC or two, but the only other desktop options were two or three fairly low-end machines that weren't even displayed in the main 'computer room'. They were in an aisle outside that area that seemed almost an afterthought.

Now, I realize Best Buy is looking to move volume and not be a full-service 'computer store' these days, but the fact that desktops apparently aren't selling in sufficient volume for BB to carry more than a slim selection was a bit surprising, and it made me wonder whether that says something about the general future of such devices. I would have to believe 'specialty' computer builders will lengthen the plank that desktop PCs may be walking to a significant degree, but I would guess that also means increasing expense as those willing to build them become fewer and fewer.

All in all, I've decided to not totally ignore the possibility of moving to a laptop as my next photo editing machine. My biggest need is the ability to run multiple monitors...which laptops seem to have no problem accomplishing...at least no more so than a desktop.


----------



## Linwood Ferguson (Oct 27, 2020)

rjwilner said:


> All in all, I've decided to not totally ignore the possibility of moving to a laptop as my next photo editing machine. My biggest need is the ability to run multiple monitors...which laptops seem to have no problem accomplishing...at least no more so than a desktop.


This subject came up in an astro photo processing thread also.  My 2 cents for the Windows crowd (I don't understand the Mac ecosystem enough to have an opinion):

I think most people who do serious work on a computer, e.g. spend a lot of time photo processing, ought to have both.

I think you need a capable but small, light, portable laptop you can use in the field when needed.  All the things that make it more convenient to travel with today tend to work against serious, long-hour use: smallish keyboard, touchpad, a smaller monitor optimized for movies (i.e. hard to calibrate), and power management that gives longer battery life by also giving slower processing at times.

Then you need a desktop (my recommendation is an under-desk tower): great big, color-accurate monitor(s), a comfortable mouse and keyboard to best survive hours of abusing your body with unnatural positions, added horsepower that often requires added cooling and benefits from big roomy towers rather than tiny cramped cases, and added reliability from redundant disks (if you are so inclined; now you have room).

Big roomy tower systems, especially home builds, run much cooler (which means they are more reliable -- electronics hate heat), have room to work in if you want to add disks or memory or a faster CPU in a year or three, and can have big, low-noise fans that cool better without keeping you awake (if you happen to have it near where you nap or sleep).   And in a tower/desktop computer, $X almost always buys you more horsepower.

Personally, I think the laptop/desktop choice is a bad question.  Most serious photographers are spending thousands on lenses and cameras; the added cost of an inexpensive laptop for when you really need it, plus a really good desktop that's pleasant to spend hours editing on, gets lost in the noise over its lifetime.  Doing that will let you spend less total time editing and more shooting, and less time over weird errors, because good desktops are more reliable than good laptops, plus the time you spend will be more pleasant.


----------



## Paul_DS256 (Oct 27, 2020)

Ferguson said:


> Personally I think the laptop/desktop choice is a bad question. Most serious photographers are spending thousands on lenses and cameras, the added cost for an inexpensive laptop for when you really need it, and a really good desktop that's pleasant to spend hours on editing, gets lost in the noise over its lifetime


I would agree, Ferguson, if funding were not an issue. Like rjwilner, I'm going to need to decide at some point in the future on one or the other. That's why the problems Gnits had with Dell laptops are important.


----------



## clee01l (Oct 27, 2020)

Ferguson said:


> I think most people who do serious work on a computer, e.g. spend a lot of time photo processing, ought to have both.


I agree.  However, with the mobile versions of Lightroom that can tie back to LrC on the desktop,  I find a large tablet to be sufficient to replace any Lightroom functionality that I use when in the field.   I replaced my 13" rMBP with a 12.9" iPadPro and it works well.


----------



## Gnits (Oct 27, 2020)

Just to put my situation in perspective.

I had planned to have a custom-built, high-performance (under-the-desk tower) PC this autumn.  However, the demise of my MacBook Air (when I was due to make a presentation) prompted the purchase of a new laptop.  I agonized over what spec laptop to go for.  Processing A7R3 images on the MacBook Air was such a painful experience that I decided to try a laptop with a decent spec, a decent screen, and a dedicated GPU.  The idea was to try this out, more than likely kick procurement of a high-spec desktop further down the road, and maybe get a better idea of what GPU to buy.

Running into the GPU issues on my laptop was a really big disaster.  Having significantly escalated my issue within the Dell organisation and finally reached someone with the authority and knowledge to deal with it, I was gobsmacked that Dell admitted the Nvidia card would be left idle (regardless of how I configured it) for my Lr and Ps apps.

It may be that no one else in the world will have the same issue, but because I had this confirmed by a senior Dell individual, I just wanted to warn people to be aware of it.

I am back to square one now in that I need to have my custom rig built and need a new laptop.

(Ps. I agree with Ferguson's comments above re laptop / desktop combo).


----------



## dpirazzi (Oct 27, 2020)

If I can get ~95% of desktop speed/productivity on a similarly priced docked laptop driving external calibrated monitors then I'm happy. It's definitely subjective, but it feels like we're there now with the higher spec'd laptops. Best of all, I'm not tied to my desk.

These are personal choices. If photography was my profession then my choices would likely be different.


----------



## Linwood Ferguson (Oct 27, 2020)

dpirazzi said:


> If I can get ~95% of desktop speed/productivity on a similarly priced docked laptop driving external calibrated monitors then I'm happy. It's definitely subjective, but it feels like we're there now with the higher spec'd laptops. Best of all, I'm not tied to my desk.


If that's net productivity including human time -- maybe.

If that's processing speed -- that's tough, depending on how high you aim for performance.  AMD in particular is really pushing that envelope with the Threadripper CPUs, and with LR still not using the GPU for many things, CPU speed is still important.  Another consideration is that if, in the medium term (say 2-4 years), GPUs become more important to LR or whatever tools you use, it's trivial to swap them out in a desktop and very difficult in a laptop.  That applies to most components of a desktop, though swapping matters less for CPUs since Intel and AMD both change socket requirements too frequently; GPUs are generally very easy to swap, at least in the mid-range (which is high-end for photo work).


----------



## dpirazzi (Oct 27, 2020)

Ferguson said:


> If that's net productivity including human time -- maybe.



Yes, I'm talking productivity. I agree with you that a laptop will not be able to compete with a similarly priced desktop on raw speed on, for example, batch processing large numbers of images. A laptop is probably not a good tradeoff for a wedding photog. And agree with you regarding GPU upgrades on a desktop, big advantage.

But for my workflow, and I'm guessing that of many other hobbyists, I don't think we would see much of a difference in productivity between the XPS 17 (10th-gen i7, NVMe drives, Nvidia 2060 GPU) and a $2000 Puget Systems desktop. Edits in the Develop module are near-instantaneous on Sony A7R3 files.


----------



## rjwilner (Oct 28, 2020)

Ferguson said:


> This subject came up in an astro photo processing thread also.  My 2 cents for the Windows crowd (I don't understand the Mac ecosystem enough to have an opinion):
> 
> I think most people who do serious work on a computer, e.g. spend a lot of time photo processing, ought to have both.
> 
> ...



I would agree with your assessment of ideal computing hardware as things stand *today*. My comment was more a question of how long what is available today will continue to be available. My crystal ball is beginning to hint that 'desktop' PCs (or under-desk towers) may have a shorter future than some would believe. They're obviously not in danger of joining any endangered-species list in the next few years... beyond that, who knows.

"...spend less total time editing and more shooting, less time over weird errors because good desktops are more reliable than good laptops, plus the time you spend will be more pleasant...."

I hear essentially the same all the time, and don't get it. I have *no* trouble getting in all the shooting I want today. The computer I use to edit is totally irrelevant to me shooting more. There isn't a computer made that would save me sufficient time to allow me to shoot any more than I do currently.

In addition, I'm a little perplexed at how that's become such a broadly repeated mantra in the photographic community. When I started in photography shooting film, I spent *WAY* more time developing and printing one 36-shot roll of film than I would processing the 200+ images I capture digitally in a single session.


----------



## Linwood Ferguson (Oct 28, 2020)

rjwilner said:


> "...spend less total time editing and more shooting, less time over weird errors because good desktops are more reliable than good laptops, plus the time you spend will be more pleasant...."
> 
> I hear essentially the same all the time, and don't get it. I have *no* trouble getting in all the shooting I want today. The computer I use to edit is totally irrelevant to me shooting more. There isn't a computer made that would save me sufficient time to allow me to shoot any more than I do currently.
> 
> In addition, I'm a little perplexed how that's become such a broadly repeated mantra in the photographic community. When I started in photography shooting film, I spent *WAY* more time developing and printing one 36 shot roll of film than I would processing the 200+ images I captured digitally in  a single session.


I think it depends a lot on the type of work you do, but I think it is also a mis-stated way of saying "spending time waiting on the computer is painful and I want to do it less".  Despite having said it, I do not find myself editing instead of shooting.  But I frequently find myself editing instead of sleeping (night game; I need to cull and process all the shots as fast as I can so people can use them that night, and I do not get home until 10pm).

But I also have many friends who shoot little enough that processing speed is bound to be lost in the noise entirely, as it appears to be for you.


----------



## PhilBurton (Oct 28, 2020)

Ferguson said:


> This subject came up in an astro photo processing thread also.  My 2 cents for the Windows crowd (I don't understand the Mac ecosystem enough to have an opinion):
> 
> I think most people who do serious work on a computer, e.g. spend a lot of time photo processing, ought to have both.
> 
> ...


+1.  As usual, Ferguson's observations are spot-on.

Phil


----------






## Mickey (Aug 9, 2021)

I realize this is an old thread, but I'm wondering whether Gnits ever found a solution to forcing the Dell XPS 15 laptop to use the NVidia card with Lightroom.  I stumbled across this thread while searching for something else and was interested since I have the same laptop (2018 XPS 15 9570) and purposely specified the NVidia card as well.  Lightroom Preferences shows that card, and I have graphics acceleration disabled at the moment.  Does anyone know why Lightroom would show that card when it's really using the integrated Intel graphics?

I wish I'd known that Dell prevented Lightroom from using this card before I wasted my money.

Based on my reading in the Dell support forums, it appears this is controlled through software.  Does anyone know of a way to remove it?  I'm not going to waste my time calling Dell tech support.


----------



## Gnits (Aug 9, 2021)

After extensive interaction and escalation, Dell's final comment was that the machine was working as per spec.  After further discussion they agreed to give me a full refund. I then built a Ryzen-based desktop workstation with the highest performance specs possible without getting into the extreme exotic, but installed a gen-1 GPU, as GPU pricing at the time (and probably still) was crazy.   I am happy with my desktop build.   I specifically made sure the motherboard could cater for high-performance GPUs (in terms of available slots and the ability to handle the power and heat requirements), with PCIe Ver 4.  I installed two M.2 drives rated at 5000 MB/s; I note 7000 MB/s drives are now available at the same price I paid for the 5000 MB/s ones.  My current year's images are on the M.2 SSD, and my image library (125,000 images) resides in a Thunderbolt 3 enclosure.
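As a rough back-of-the-envelope on those drive specs (file size is illustrative; real Lightroom work is rarely limited by sequential raw reads), the 5000 vs 7000 MB/s difference per file is tiny:

```python
# Back-of-the-envelope: sequential read time for one raw file at the
# two NVMe speeds mentioned above. Figures are illustrative only.
FILE_MB = 80  # ballpark size of a single uncompressed A7R-class raw file

for speed_mb_s in (5000, 7000):
    ms = FILE_MB / speed_mb_s * 1000  # milliseconds to read one file
    print(f"{speed_mb_s} MB/s -> {ms:.1f} ms per file")
```

At these speeds the per-file gap is a few milliseconds, which supports the choice of not paying extra for the faster drives.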

With Covid, I have not been doing any travelling, so I was not under pressure to replace my laptop. However, I did purchase a well-specified Lenovo laptop with a decent CPU and a 4K AdobeRGB screen for a trip in the spring, and it works fine for my travel needs.   If I were buying a laptop today, I would probably wait for the next-gen Apple Mx MacBook Pro and make a call then.   My preference is Windows for workstations, but I am agnostic re laptops; other than the issues with the function keys and the butterfly keyboard, Apple laptops are well built.

The initial Dell laptop purchase was an experiment to check whether I could replace my desktop workstation with a laptop.  As far as I am concerned, that failed. The Lenovo laptop I have now is heavier and bulkier than I would like, but it is fine for my travel needs.  If I were travelling frequently (as I used to for business and personal reasons), I would have looked for a slimmer, lighter laptop.


----------



## Gnits (Aug 9, 2021)

Most of the Dell support people I talked to did not understand the problem. I became familiar with most of the GPU utilities around, which should have allowed me to configure the GPU to work with specific apps, but I could never get them to work.  My deep suspicion is that the configuration was hardwired in such a way that it could not be changed later by software.  I could not believe that a Dell engineer confirmed to me that they did not regard Photoshop or AutoCAD as graphics-intensive apps.  I know that the gaming world is totally different, but Photoshop, AutoCAD, etc. now make increasing use of the GPU.


----------



## Mickey (Aug 9, 2021)

Thanks, Gnits.  My ancient desktop recently died, so I've been using this laptop exclusively for a month or so while I debate getting another desktop or just sticking with the laptop and avoiding the issues of keeping things in sync.  For a desktop I've considered defecting to Apple, but I know Windows pretty well and am comfortable with troubleshooting, as well as updating/upgrading/repairing a desktop or laptop when needed, while my only exposure to Apple and iOS has been my iPad.

Anyway, I did find some documentation on the Dell forums that the integrated graphics is programmed to always be primary.  From the Dell forums (which you've probably seen): "... be sure you understand how your system functions.  It does not have a true, discrete GPU.  It has a software-controlled hybrid setup in which the Intel GPU is permanently wired as primary and the nVidia GPU is solely a co-processor for the Intel GPU.  As such, ALL video data passes through the Intel GPU on its way to the screen.  ONLY the Intel GPU has a physical connection to the display panel.

It can be difficult to tell whether the nVidia GPU is active because Windows will see the Intel GPU as primary.

On some newer XPS systems, the nVidia GPU is hardwired to the thunderbolt port and will drive the external screen without the Intel GPU -- but not the internal one."

I'm in the process of buying a new monitor and will see whether the Thunderbolt port triggers the nVidia GPU.  We'll see.

Mickey


----------



## dpirazzi (Aug 10, 2021)

Mickey said:


> Thanks, Gnits.  My ancient desktop recently died, so I've been using this laptop exclusively for a month or so while I debate getting another desktop or just sticking with the laptop and avoiding the issues of keeping things in sync.  For a desktop I've considered defecting to Apple, but I know Windows pretty well and am comfortable with troubleshooting, as well as updating/upgrading/repairing a desktop or laptop when needed, while my only exposure to Apple and iOS has been my iPad.
> 
> Anyway, I did find some documentation on the Dell forums that the integrated graphics is programmed to always be primary.  From the Dell forums (which you've probably seen): "... be sure you understand how your system functions.  It does not have a true, discrete GPU.  It has a software-controlled hybrid setup in which the Intel GPU is permanently wired as primary and the nVidia GPU is solely a co-processor for the Intel GPU.  As such, ALL video data passes through the Intel GPU on its way to the screen.  ONLY the Intel GPU has a physical connection to the display panel.
> 
> ...



In Win10, go into Display -> Graphics Settings -> Graphics performance preference, select the app you wish to run on the dedicated GPU, then click Options and select that GPU rather than "Let Windows decide".




If you have the Nvidia Control Panel app installed and running on your machine, it can show you which applications are running on the dedicated GPU.
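For reference, the per-app choice that the Graphics Settings page records is stored in the registry under `HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences`, one string value per executable path. A minimal sketch of the value format (the helper name and the Lightroom path are my own illustrations, not anything from Adobe or Dell):

```python
# Windows 10 stores per-app GPU choices as string values under
# HKCU\Software\Microsoft\DirectX\UserGpuPreferences, keyed by the full
# path of the executable. "GpuPreference=1;" means power saving (usually
# the integrated GPU); "GpuPreference=2;" means high performance
# (usually the discrete GPU).

def gpu_preference_value(high_performance: bool) -> str:
    """Compose the registry data string for a per-app GPU preference."""
    return f"GpuPreference={2 if high_performance else 1};"

# Example: the value you would write for Lightroom (path is illustrative).
app = r"C:\Program Files\Adobe\Adobe Lightroom Classic\Lightroom.exe"
print(app, "->", gpu_preference_value(high_performance=True))
```

Note that on a hybrid laptop like the XPS this only records a preference; as the Dell quote above explains, the output may still be routed through the Intel GPU.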


----------



## Mickey (Aug 10, 2021)

Thanks for the suggestion.  Based on what I've researched I'm not sure this will work, but I'll give it a try and post back.  According to Dell, the additional graphics card in these laptops is a co-processor and not a totally discrete graphics option.


----------

