> I mean, this is what happens: game engines evolve to take advantage of the latest technology. A GTX 660 with 2-4GB of VRAM from a decade ago can't run the latest games.

Naw, that's just shitty devs. Even the resource hog that is UE5 can run games in 8GB of VRAM. Unless you run at 4K, which most people don't (1080p and 1440p are the most popular resolutions), a game isn't going to need more than 7GB. 8GB is fine as long as the game is optimized for the hardware it's running on, but most big studios are told to ship a broken product and fix it later, and to skip certain kinds of optimization, regardless of the devs' wishes.
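To put rough numbers on the resolution point: every screen-sized buffer scales with pixel count, so 4K pays four times what 1080p does before a single texture even loads. A minimal back-of-the-envelope sketch; the buffer list and formats here are hypothetical, a stand-in for a typical deferred G-buffer:

```python
# Back-of-the-envelope render-target VRAM by resolution. The target list
# is hypothetical (a stand-in for a typical deferred G-buffer); real
# engines add shadow maps, history buffers, and a texture streaming
# pool on top of this.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

TARGETS = [          # (name, bytes per pixel)
    ("scene color RGBA16F", 8),
    ("gbuffer A RGBA8",     4),
    ("gbuffer B RGBA8",     4),
    ("gbuffer C RGBA8",     4),
    ("depth D32F",          4),
]

for label, (w, h) in RESOLUTIONS.items():
    total = sum(w * h * bpp for _name, bpp in TARGETS)
    print(f"{label}: {total / 2**20:5.0f} MiB in render targets")
```

That prints roughly 47 MiB at 1080p versus 190 MiB at 4K, and the same 4x multiplier hits every other screen-sized buffer (and usually a bigger streaming pool too). The resolution, not the engine, is where the extra VRAM goes.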
Indie devs usually don't know what they're doing and are learning as they go, so their games tend to perform badly, but if they learn, or stick to a lower-powered game, 8GB is fine.
Sure, moving the standard to 12GB would give us some wiggle room, but if devs optimized well instead, we could make 8GB last at least another five years. But naw, the less work devs have to do, the better in their eyes. Why do you think things like Nanite were so loved by devs? Now they don't have to worry about model LODs and polygon budgets. Same with ray tracing for lighting: why do the hard work when you can press a button in the UE5 menu?
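And "a button" barely exaggerates. A rough sketch of what turning on hardware ray tracing amounts to in a project's Config/DefaultEngine.ini, going by the UE docs as I understand them; exact cvars vary by engine version, so treat this as illustrative:

```ini
[/Script/Engine.RendererSettings]
; Turn on the hardware ray tracing pipeline (needs a DX12 / RT-capable GPU)
r.RayTracing=True
; Ray tracing needs the GPU skin cache for skeletal meshes
r.SkinCache.CompileShaders=True
```

Two lines in a config file, versus weeks of baking lightmaps and placing probes the old way. Easy to see why nobody asks what it costs the player's GPU.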
Can you still make a game that requires minimal graphics hardware? Sure, Rocket League runs fine on almost anything. But can a AAA title that pushes the latest graphics out the pipe do the same? Probably not.
8GB cards are reaching the point of obsolescence for AAA titles. It's not going to be just this game; it's going to be most of the upcoming games that push high-end graphics. I've already got a bunch in my Steam library where you can basically forget about it unless you turn every setting down to minimum, and even then they still suck on an 8GB card.