because it wasn't caused by AA; in fact, AA should be reducing any noise and jitter, not increasing it.
Dithering and artifacting are directly caused by AA, or more precisely by the forms of AA that various mod authors decided to use. Contrary to any wiki, AA does not magically make pixels into perfect spheres. It is visual dithering, and like
ANY form of interpolation, it will produce smearing and artifacts no matter how fast the frametime and framerate are.
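To make that concrete, here's a tiny, deliberately simplified sketch in plain Python (my own illustration, not any actual game or driver AA path): box-filter supersampling of a hard 1-D edge. The "anti-aliased" pixel on the edge is just an averaged, in-between grey, and that in-between value is exactly the smear/dither being described.

```python
# Deliberately simplified sketch (not any actual game/driver AA path):
# box-filter supersampling of a hard 1-D edge. "Anti-aliasing" here is just
# averaging sub-pixel coverage, so the pixel that straddles the edge becomes
# an in-between grey: a value that exists nowhere in the source signal.

EDGE = 4.25  # where the hard black/white transition falls, in pixel units

def hard_edge(x):
    """Ideal signal: 0.0 (black) left of the edge, 1.0 (white) right of it."""
    return 1.0 if x >= EDGE else 0.0

def supersampled_pixel(px, samples=8):
    """Average `samples` evenly spaced sub-pixel samples inside pixel `px`."""
    return sum(hard_edge(px + (s + 0.5) / samples) for s in range(samples)) / samples

row_no_aa = [hard_edge(px + 0.5) for px in range(8)]      # one sample per pixel centre
row_aa    = [supersampled_pixel(px) for px in range(8)]   # box-filtered ("AA'd")

print(row_no_aa)  # [0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0] -> hard, aliased step
print(row_aa)     # pixel 4 becomes 0.75: an interpolated grey not in the source
```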
6+2 is not 8-bit color, and 8+2 is not 10-bit color, because dithering does not and cannot replace actual visual data; all it can do is trick your eye into an approximation of a perceived upgrade in quality. Buying an 8+2 monitor is fine for gaming. Buying an 8+2 monitor for print, VFX, and visual work is going to earn you a one-way trip to the Visual Lead's office for a come-to-Jesus meeting.
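For anyone unsure what the "+2" means in practice, here's a rough sketch of the general idea behind temporal dithering / FRC (my own illustration, not any panel vendor's actual algorithm): the panel flickers between two adjacent 6-bit levels so the time average lands near the requested 8-bit level, but no single frame ever contains the real value.

```python
# Rough sketch of the general idea behind "+2" temporal dithering / FRC
# (my own illustration, not any panel vendor's actual algorithm): a 6-bit
# panel alternates between two adjacent 6-bit levels so that the *time
# average* approximates the requested 8-bit level. No individual frame
# ever contains the true 8-bit value.

def frc_frames(value_8bit, num_frames=4):
    """Approximate an 8-bit level (0-255) on a 6-bit panel (0-63) over a few frames."""
    low  = value_8bit >> 2      # nearest 6-bit level at or below the target
    frac = value_8bit & 0b11    # the 2 bits the panel physically cannot show
    # Show the next level up in `frac` out of every 4 frames, the low level otherwise.
    return [min(low + 1, 63) if f < frac else low for f in range(num_frames)]

target = 130                            # 8-bit level we want the viewer to perceive
frames = frc_frames(target)             # -> [33, 33, 32, 32]
average_as_8bit = sum(f * 4 for f in frames) / len(frames)   # scale 6-bit back up by x4

print(frames)           # the panel only ever displays real 6-bit levels
print(average_as_8bit)  # 130.0, but only as an average over time, never in one frame
```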
These are facts, simple as, simple is.
There's a reason the latest SDK from nVidia doesn't include AA options for anything but UI at 8K 10-bit color: you're expected not to need it, and your model is expected to have a poly count consistent with that presentation philosophy or a tessellation scheme to cover it.
Point of fact: DLSS is the literal opposite of AA and directly supersedes the AA pipeline in games, and it will be nVidia's official visual-approximation quality tool going forward.
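To illustrate the pipeline difference I mean, here's a toy 1-D sketch with deliberately naive stand-ins: a box filter plays the role of "AA" and linear upsampling plays the role of "reconstruction". None of this is NVIDIA's actual DLSS algorithm (that is a learned, temporal reconstruction); it only shows the ordering: classic AA filters a frame already rendered at target resolution, while the DLSS-style flow renders fewer pixels and reconstructs up to the target afterwards.

```python
# Toy 1-D sketch of the pipeline ordering, with deliberately naive stand-ins:
# a box filter stands in for "AA" and linear upsampling stands in for
# "reconstruction". This is NOT NVIDIA's DLSS algorithm; it only contrasts
# filter-after-full-res-render with render-low-then-reconstruct.

def render(resolution, edge=0.55):
    """Toy 'renderer': a hard black/white edge sampled at `resolution` pixels."""
    return [1.0 if (i + 0.5) / resolution >= edge else 0.0 for i in range(resolution)]

def aa_filter(pixels):
    """Classic post-render AA stand-in: blend each pixel with its neighbours."""
    out = []
    for i, p in enumerate(pixels):
        left = pixels[max(i - 1, 0)]
        right = pixels[min(i + 1, len(pixels) - 1)]
        out.append((left + p + right) / 3)
    return out

def reconstruct(pixels, target_len):
    """Reconstruction stand-in: linearly upsample a low-res row to the target width."""
    n = len(pixels)
    out = []
    for i in range(target_len):
        x = max((i + 0.5) / target_len * n - 0.5, 0.0)
        lo = min(int(x), n - 1)
        hi = min(lo + 1, n - 1)
        t = x - lo
        out.append(pixels[lo] * (1 - t) + pixels[hi] * t)
    return out

TARGET = 16

# Classic AA: pay for a full-resolution render, then filter the finished frame.
aa_result = aa_filter(render(TARGET))

# DLSS-style flow: render half the pixels, then reconstruct up to the target.
reconstructed_result = reconstruct(render(TARGET // 2), TARGET)

print(aa_result)
print(reconstructed_result)
```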
There is no catch-all for higher resolution. Period. Whilst there are a ton of tricks, those tricks all come with caveats and exceptions, and AA is no different. Interpolation-based reconstruction is quite literally leaving AA behind, and when 8K becomes the norm there will be almost zero reason to have AA on anything other than your UI, if it has to scale or kern for some reason.
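Rough numbers on why pixel density matters here, using an assumed setup of my own (27-inch 16:9 panel viewed from 60 cm, not anything from the post): roughly 60 pixels per degree is the commonly quoted resolving limit of 20/20 vision, and 8K on that setup lands well past it, so the stair-stepping AA exists to hide is largely invisible anyway.

```python
import math

# Back-of-the-envelope arithmetic with an assumed setup (27" 16:9 panel
# viewed from 60 cm; my assumption). Around 60 pixels per degree is the
# commonly quoted resolving limit of 20/20 vision.

DIAGONAL_IN = 27.0
DISTANCE_CM = 60.0

width_cm = DIAGONAL_IN * 2.54 * 16 / math.hypot(16, 9)                # horizontal size of a 16:9 panel
fov_deg = 2 * math.degrees(math.atan(width_cm / (2 * DISTANCE_CM)))   # horizontal field of view

for name, horizontal_px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: ~{horizontal_px / fov_deg:.0f} pixels per degree")
    # -> roughly 36, 73, and 145 pixels per degree on this setup
```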