A-ha! I have a dual GPU laptop and was able to reproduce that issue by forcing the game to run on the discrete (nVidia) GPU. It works fine on the integrated Intel GPU. I'm on fc34 using the Nouveau driver. If you're also using Nouveau, it might be worth trying the proprietary nVidia driver. If I recall correctly, Mint has some kind of integrated installer to switch the video drivers.
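For reference, on a Mesa/Nouveau setup the usual way to push a single program onto the discrete card is the DRI_PRIME environment variable. A rough sketch -- the launch script name is just a placeholder, and glxinfo has to be installed (mesa-utils on Mint/Ubuntu, glx-utils on Fedora):

    # which GPU Mesa renders on by default
    glxinfo | grep "OpenGL renderer"
    # same check, but asking Mesa to offload this one process to the discrete GPU
    DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
    # launch the game the same way
    DRI_PRIME=1 ./Game.sh

The proprietary nVidia stack handles offloading differently, so that trick only applies while you're on the open drivers.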
Thanks for chiming in. I'll take any help I can get -- and I DO appreciate it.
My card is an NVIDIA Quadro, so I'm running the proprietary nvidia-340 driver (340.108-0ubuntu5.20.04.2), not the open-source xserver-xorg-video-nouveau (1:1.0.16-1) Nouveau driver for the Xorg X server. The 340 driver was recommended by all the gaming sites, so that was the one I switched to. Even so, I occasionally get video driver issues, since the Quadro card I'm using is something like six or eight years old. Another issue I get in other games, some Unity and others Ren'Py, is a DirectX 11 warning insisting I install DirectX 11. I DID install DirectX 11. It's an old GPU with less than 1 GB of DDR3 VRAM. I've been tempted to install DirectX 12 in Wine, but cringe away, since it smacks of doze-10. There, I'll admit, using the Xorg Nouveau driver might work, since that path usually employs OpenGL. However, there are many games that won't run, or won't run well, if I use the Xorg drivers and force OpenGL. Then I'd have to swap drivers in and out with each game.
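In case it helps anyone else hitting that same warning: the usual way to pull the DirectX 11 runtime pieces into a Wine prefix is winetricks. A rough sketch, assuming winetricks is installed and you point it at the game's prefix:

    # install the D3D compiler and D3DX11 helper DLLs many Windows builds check for
    WINEPREFIX=~/.wine winetricks d3dcompiler_47 d3dx11_43
    # DXVK would translate D3D10/11 to Vulkan instead, but it needs a Vulkan-capable
    # driver, which the 340 series is not, so I can't use it on this card:
    # winetricks dxvk

~/.wine is just the default prefix; swap in whatever prefix the game actually uses.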
Are you saying I should try the Nouveau driver, since the NVIDIA driver isn't cutting it?
We were able to get the game running both in HTML5 and under Wine once the dev disabled a plug-in that allows for flashing screen images, like when you get into the dungeon with the hypno hell traps. I'll admit the audio was lost when the plug-in was disabled, but the video issues were overcome and the game was playable in Wine. With the HTML5 version, we lost the ability to continue a game at startup. We had to enter the game first, before we could get at the saves, which meant creating a character every time we started the game. The dev provided a work-around for that, assuming you had a save that would allow you to continue. The issue is getting it running natively in Linux, since they were good enough to provide a native Linux build. Unfortunately, I think the issues are my machine, not the game, since no one else seems to have them. I really don't think the driver is the issue here. I think it's my Quadro card not living up to its end of the bargain.
Back when I was helping with beta play-testing the Privateer remake, I had an atypical and out-of-date AGP -- I think it was AGP -- video card that required the game executable be re-configured so it would work with that brand of card. I think it's the same thing here. However, with a Quadro, something almost no one uses unless they're doing video rendering almost exclusively, I think it unreasonable to try to get the game to work for just one card -- an outdated card at that -- that's almost never found in the average home. Game players -- I won't say gamers, as they're a breed apart -- aren't going to get a Quadro to play games; they're going to purchase GPUs designed for playing games and doing spreadsheets. I got the systems I have as refurbs. Right now, getting even a reasonable GPU is unreasonable, due to cost, thanks to Bitcoin miners. A used, five-year-old 1050 Ti is going for almost twice what it cost new. Three hundred for a card that's already out of date and only sold for a hundred fifty when new is completely unreasonable. I can do a hundred for something as capable as the old Quadro I have, not three.
Unlike you, I don't have a choice in video chips. The MoBo in my HP Z600 does not have integrated video support, since it's an industrial board designed for workstations, and the processors designed to fit in it don't have video support built in. I have two of them, since they're refurbs and I got a great price on the pair. I paid less for two complete dual quad-core, hyper-threading CPU machines I could combine into one nice machine than I would have for a more conventional, single-CPU quad-core machine with less RAM and HDD space. Now I'm paying for being frugal -- or maybe being a skinflint, if you must.
Most standard consumer intel chips that come in pre-built and configured systems are actually APUs of a sort, not plain CPUs. Unlike AMD, whose chips are usually straight CPUs, not APUs. Certain select AMD APUs and Radeon GPU boards can actually work in a CrossFire configuration. intel refuses to let its integrated graphics pair up with a discrete GPU that way; they force you to purchase a second discrete GPU to use an SLI configuration. It's a big selling point with them and doze.
Ready for a trip down Memory Lane?
Corporations LIKE making you spend money. I was one of the invitees for the roll-out of doze 2k. Their biggest selling point was how 2k was going to force the consumers to upgrade their hardware, thereby increasing MY profits as an IT pro, with increased hardware sales, just so they could run the newest and latest doze. That was what had me REALLY looking at Linux instead of doze.
As I said before, I'm no newbie to Linux. I ran Red Hat 5.2, back when Red Hat was a consumer distro, on my old 486 DX2-66, but that was more as a hobbyist, not as my everyday driver. Back then, you had to WANT to tinker with your system if you ran Linux. Configuring things was a major endeavour, with how the various packages always seemed to have conflicting dependencies. By the time they came out with 2k, I was using Mandrake 6.2, I think, since that was about the time Red Hat went corporate with their distro and became more of an enterprise solution than something for private home use. The 2k roll-out is what sold me on Linux and soured me on doze, even if 2k WAS more stable than most other operating environments in the doze lineup back then. You'll note I call doze an environment, not a system. It's a conglomeration of bits, pieces and parts taken from everywhere, slapped together, and pushed out the door, and everyone gets to live through update hell as they TRY to fix the bugs. Linux is a system, with how everyone gets together and does their best to integrate things seamlessly. At least it doesn't eat itself, the way doze does ... or did.
Now then, back to the here and now.
My procs are Xeons, without GPU support, unlike most standard consumer intel chips in pre-built and configured systems, which have GPU support built in, often in the CPU itself, although the GPU can also be added to the MoBo. The Xeon is considered an industrial chip, designed for servers and workstations, not a home chip. Therefore it, like the MoBos designed for chips like it, is stripped down. Corporations buy thousands of systems, not just a few, so they don't want to pay two bucks a board to include video output on the MoBos if they don't need to; same for the CPUs. That can run them thousands they don't need, or want, to spend. Why add in the extra silicon, or the work of configuring it, if you don't need it, or if it's going to sit there and moulder while your workers need and use the Quadro to do their work?
The system came with the Quadro. It's the only kind of discrete PCI-e graphics card I have -- I actually have three of the things, all of them identical. The other cards I have would REALLY be a step backward, being standard PCI format boards, or the format before PCI-e but after PCI; what was it ... AGP? I THINK I have an ancient AGP card somewhere, but as I said, assuming I could even squeeze the card into the MoBo, which I can't, if I were to run them, they'd have something like 128 MB or 256 MB of VRAM, DDR or DDR2. I'll grant you, my card only has 512 MB of DDR3 VRAM, but DDR3 is a long way above plain VRAM.
I will admit, there are MoBos out there that have specific NVIDIA or Radeon GPU chipsets on them, and it's a major selling point for the board manufacturers, so I'm not going to say intel ONLY makes APUs for consumers. I'm not.
As for driver swaps ... yeah, Mint has the same basic driver-swap interface as Ubuntu. It's a different desktop, but with the same base internals as Ubuntu, only with a few tweaks. That IS what it's based on, after all. However, I think that running the proprietary NVIDIA driver would be best overall. Don't you think?
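If you'd rather check from a terminal than the Driver Manager GUI, the same mechanism sits behind the ubuntu-drivers tool on Mint and Ubuntu; something like:

    # list the detected hardware and the driver packages the distro knows about for it
    ubuntu-drivers devices
    # confirm which kernel module is actually loaded right now
    lsmod | grep -E 'nvidia|nouveau'

That at least makes it easy to verify the 340 driver really is the one in use before blaming the card.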