We may differ on which came first, abandoning it or shutting it down, but the process is documented on Phabricator.
What came first were the regular production incidents caused by OCG, which would have required a rewrite (and to some extent a rearchitecting) to operate smoothly. That would have been a major project, plus the service had a constant maintenance cost (it was a node.js service, and node is relatively maintenance-heavy), and the WMF did not want to maintain two different renderers forever.
We might also have a different view on priorities, but a Foundation with 100 million dollars in a vault can pay for someone to solve this issue, no doubt.
Yes, or the money (probably a quarter-year of work for a team, at least, so that might be something like $300K?) can be used on something else. There are a huge number of things to spend money on, and IMO it's hard to argue for the strategic importance of PDF book rendering. It wasn't used much, it would have been work-intensive to maintain (every new wikitext feature would have required special handling for the LaTeX transformation, and there are all kinds of wikitext/HTML constructs which are not easy to express in LaTeX), and there isn't much value in a PDF of Wikipedia articles when the originals are freely available over the internet (and for people with difficulties accessing the internet, there are better alternatives).
(Personally, I don't think Proton was worth the investment, either - it doesn't give much value beyond the PDF generation that most browsers are already capable of doing.)
By the way: the Proton PDF renderer also fails if the article contains a gallery. But no one cares about it. It used to work, then it broke, and no one was responsible for the failure.
The drawback of Proton is that since it uses a headless browser for PDF rendering, there isn't much room to influence how the rendering goes (beyond CSS tweaks or upstream bug reports), so issues like that might not be easily fixed. (OTOH it at least displays galleries, which OCG didn't.)
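To illustrate the kind of CSS-level control that does remain available with a headless-browser renderer: print stylesheets can hide screen-only UI or adjust page-break behavior. This is a hypothetical sketch; the selectors below are illustrative and not necessarily the class names MediaWiki or Proton actually use.

```css
/* Hypothetical print-stylesheet tweaks for headless-browser PDF output.
   Selector names are illustrative assumptions, not confirmed MediaWiki classes. */
@media print {
  /* Keep a gallery together instead of letting it split across PDF pages */
  .gallery {
    break-inside: avoid;
  }

  /* Hide interactive UI that makes no sense in a static PDF */
  .mw-editsection {
    display: none;
  }
}
```

Anything deeper than this (layout engine behavior, image decoding, pagination bugs in the browser itself) can only be addressed via upstream bug reports, which is exactly the limitation described above.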