> I'm probably gonna have to have two different blend files for upstairs and downstairs. This was 10.5 GB and I only have 12 GB on my 3080 Ti. Gonna have to find some way to shed around 2-3.5 GB of RAM for my character models.

I normally break my environments down into sections for this very reason. I used to have one giant scene and still ran out of VRAM on a 3090. Now rooms and props are all organized into collections that I can quickly enable and disable as needed.
View attachment 1925933
There's no way that is 10 gigabytes unless you're using ultra high-poly furniture meshes with six subsurf levels and 4k textures for everything.
Out of curiosity, what's the over-under on using Blender's procedural textures instead of a traditional image texture? How big does a texture have to be before replacing it with Voronoi or Perlin noise is a good trade? Or, say, replacing a bunch of 2k impostor cloud textures with 256x256 textures mixed with noise to make an (apparently) infinite variety of cloud shapes in the sky? I'm guessing you're basically spending RAM to save VRAM, but how do we know when that's a good trade?
I'm not Ducky3d, I don't know much about procedural texturing. It's slated to be overhauled anyway, so I figure I'll just wait and learn the new system.
So I'm pretty new to Blender, only been using it for 3 months. I figured the issue out: I didn't realize you could increase the amount of RAM it uses from 4 GB, so I upped it to 28 GB. I have 48 GB total, so I figured that was fine. As for your comment about the gig size, the screenshots below show what it actually is.
Blender has out-of-core rendering. It will spill over into system memory, so you're not limited to VRAM, and it's still faster than a CPU render.
You can put a global limit on subsurf levels and texture size (for both the viewport and the render) from the Simplify panel. Capping all textures at 2k or 1k is the nuclear option for when a scene won't render otherwise. Blender does have a function to scale textures down, but I think it's only exposed through Python, not as an operator.
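For the Python route, here's a minimal sketch, assuming Blender's `bpy` API (`bpy.data.images` and `Image.scale()`). The sizing helper is plain Python; the Blender-specific loop is shown in comments since it only runs inside Blender:

```python
def capped_size(width, height, cap=2048):
    # Halve both dimensions until the texture fits under the cap,
    # preserving the aspect ratio and power-of-two sizes.
    while width > cap or height > cap:
        width //= 2
        height //= 2
    return width, height

# Inside Blender (run from the Text Editor or the Python console):
# import bpy
# for img in bpy.data.images:
#     w, h = capped_size(img.size[0], img.size[1])
#     if (w, h) != (img.size[0], img.size[1]):
#         img.scale(w, h)  # in-place downscale of the datablock
```

Note this scales the loaded datablock, not the source file on disk, so your originals are safe unless you save over them.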
Replace the textures on small objects, or objects that won't be the focus of your scene, with 2k or 1k versions. Texture memory is quadratic in the resolution: doubling the dimensions takes four times the memory, so if you drop a 4k texture to 2k, you can fit all four maps (diffuse, roughness, bump, normal) into the space of a single 4k texture.
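Back-of-envelope arithmetic for that claim (uncompressed 8-bit RGBA; actual GPU usage differs once compression and mipmaps are involved):

```python
def texture_mib(width, height, channels=4, bytes_per_channel=1):
    # Uncompressed in-memory footprint of one texture, in MiB.
    return width * height * channels * bytes_per_channel / 2**20

four_k = texture_mib(4096, 4096)  # 64.0 MiB
two_k = texture_mib(2048, 2048)   # 16.0 MiB

# Four 2k maps occupy exactly the footprint of one 4k map.
assert 4 * two_k == four_k
```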
If you have a lot of similar objects, like books, then rather than a dozen textures you can set up a shader network that randomizes the colors per object.
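One common way to wire that up (a sketch, assuming Blender's `bpy` shader-node API; this only runs inside Blender) is to feed the Object Info node's Random output through a Color Ramp into the base color, so every object sharing the material picks its own color:

```python
import bpy

mat = bpy.data.materials.new("RandomizedBooks")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

info = nodes.new("ShaderNodeObjectInfo")   # per-object Random output in [0, 1]
ramp = nodes.new("ShaderNodeValToRGB")     # the Color Ramp node
bsdf = nodes["Principled BSDF"]            # created automatically by use_nodes

links.new(info.outputs["Random"], ramp.inputs["Fac"])
links.new(ramp.outputs["Color"], bsdf.inputs["Base Color"])
```

Set the ramp's color stops to whatever palette you want; every book then samples a color from that palette with no extra textures.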
If you imported high-poly models, use the Decimate modifier to reduce the polycount to a manageable level. If they're your own models, they should have one or two levels of subsurf at most, unless it's an extreme closeup.
Dang. That's some pretty impressive hardware. Remind me, what rendering engine are you using?
And that's with me completing the rest of the house and Simplify turned off. But I'm still gonna take domeik's advice and separate some things, so Blender isn't trying to account for stuff that won't affect the scene.
View attachment 1927653
View attachment 1927657
Cycles.
Okay. I was thinking the lighting looks a little too crisp. If you can soften that a little, it would look more realistic. Most light bulbs don't cast hard shadows, because the light isn't coming from a single point; it's coming from inside a sphere of treated glass that spreads the light out. Here's a quick list of ways you can soften the lighting, in general, in Cycles:
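The biggest single lever is emitter size, since shadow softness scales with the apparent size of the light source. A sketch (Blender-only, assuming `bpy` and the `Light.shadow_soft_size` / `AreaLight.size` properties) that raises every lamp to a minimum physical size:

```python
import bpy

# Bigger emitting surfaces produce softer shadow edges.
for light in bpy.data.lights:
    if light.type in {'POINT', 'SPOT'}:
        # "Radius" in the UI; a real bulb is a few centimeters across.
        light.shadow_soft_size = max(light.shadow_soft_size, 0.03)
    elif light.type == 'AREA':
        light.size = max(light.size, 0.25)
```

The 0.03 m and 0.25 m floors are illustrative starting points, not canonical values; tune them per scene.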
> Anyone have a number for how many samples should be taken in Blender? When I used Iray, 2-4k samples was my go-to, but I know Cycles is a lot different than Iray. I've been doing 2k samples in Cycles, but I don't know if that's overkill or not, since I use OpenImageDenoise.

I render 1920x1080 at 4k samples with OIDN. The more samples, the more data OIDN has to work with. You can get away with fewer outdoors using HDRIs, but indoors needs lots of indirect lighting, so you should go higher.
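For reference, those settings live on the scene's Cycles properties; a sketch (Blender-only, assuming `bpy` with the Cycles add-on's `scene.cycles` settings):

```python
import bpy

scene = bpy.context.scene
scene.cycles.samples = 4096                 # 4k samples per pixel
scene.cycles.use_denoising = True
scene.cycles.denoiser = 'OPENIMAGEDENOISE'  # OIDN on the final render
```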
There isn't really a standard default value, since noise is so heavily scene-dependent.
> I'm not sure if this is the right place to ask, but it's the first Blender-related thread I found. I recently started to learn Blender and I'm wondering if there's any good website where you can find good content for free. So basically, is there an f95zone but for Blender content?

SFMLab/SmutBase has Blender files. There are other video-game rip sources, but they don't convert forward-kinematic models to Blender's advanced rigging, and that's too advanced for a beginner.
I'm surprised nobody's mentioned Polyhaven yet. They have a small but growing collection of free high-quality furniture, plants, and objects, texture sets suitable for archviz, and all the HDRIs you can eat.
I assumed they wanted nude models, which Polyhaven doesn't have.