When you have a monolithic system of any significant size, it’s likely that different parts of it require different types of resources to function optimally. Some parts are CPU intensive, while others are RAM intensive. It’s possible that at least one part would greatly benefit from using GPUs for computationally intensive processing.
There are valid reasons most developers don’t like working with old or outdated technology.
Waterfall has clearly defined stages of software development: requirements → architecture → design → implementation, and so on. (As a side note, some people don’t differentiate between architecture and design, but I consider them separate: architecture is strategic, while design is tactical.)
Then along came Agile. Most people came to consider Waterfall evil and began opposing everything about it. They threw the baby out with the bathwater; by that I mean essential practices like architecting got discarded along with Waterfall.