[Stable Diffusion] Prompt Sharing and Learning Thread

Fuchsschweif

Well-Known Member
Sep 24, 2019
1,169
1,994
I don't have ControlNet, but everywhere online it says that SD Ultimate Upscale and ControlNet are used together. And I just found this:

"
What does ControlNet Tile do?
At its core, ControlNet Tile operates like an intelligent tiling system. Rather than arbitrarily enlarging your image, it carefully assesses and accurately replicates every tiny section in 512x512 chunks. By working on these manageable pieces, it can pay attention to the intricacies of each segment. When these sections are finally pieced together, the end result is a larger image that hasn't lost any detail. In fact, with the additional power of text-driven guidance, it often introduces new, harmonious details. So, instead of a simple upscale, what you get is an image that not only retains but often surpasses the original's clarity and vibrance. I've tried to incorporate a clearer understanding of the process while maintaining a reader-friendly tone.
"

Do I need that too in order to make it work?
 

hkennereth

Member
Mar 3, 2019
239
784
Okay Fuchsschweif... you're going over a ton of different questions that have different unrelated answers... and you have a picture of a workflow that honestly raises more questions than it answers (why are you calculating the upscale factor from the source image there, instead of just inputting something like 2x into the Ultimate Upscale node itself?).

No, you don't NEED ControlNet Tile to upscale, be it a normal upscale or the Ultimate Upscale node... but it can help. It is not a silver-bullet solution to every problem, and it's not guaranteed that you will even get good results from it. It's the kind of thing you should test yourself to see if it works for your needs, because different types of images need different upscaling workflows. But using ControlNet increases processing time and memory usage, so if you're using a limited, older graphics card, that will only make things worse.

None of that probably has anything to do with your crashing problem... which you need to be more specific about anyway. What do you mean by "crashes your computer"? Are you getting a blue screen and having to reboot? Is Comfy crashing and closing? Is Comfy just refusing to render that image and showing a red error screen? These are very different situations with different potential workarounds, and if you mean the first case (which is the only one I would actually describe as "crashing your computer"), it probably has very little to do with your specific workflow; that is more likely a hardware issue.

You need to be specific in your questions if you want to get good answers.

Edit: also, I forgot to mention that: you don't need a Tiled VAE Encoder/Decoder there. The Upscale node already outputs a pixel file, so there is no reason or point to encoding the image back to a latent image, just to decode it back to pixels so you can save it. That has zero to do with your crashing issue.
 
Last edited:

Fuchsschweif

Well-Known Member
Sep 24, 2019
1,169
1,994
Okay @Fuchsschweif... you're going over a ton of different questions that have different unrelated answers... and you have a picture of a workflow that honestly raises more questions than it answers (why are you calculating the upscale factor from the source image there, instead of just inputting something like 2x into the Ultimate Upscale node itself?).
The workflow was shared and explained by one of the developers at OpenAI itself, who works together with the ComfyUI team.

The upscale factor is calculated from the source image itself, so you can throw in whatever picture you want and upscale it by whatever factor you want. You don't have to do the math anymore: the math nodes read the width and height of the original and feed the correct values into the upscaler, so you always get the right output size.
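(For illustration, the math nodes boil down to something like this; the function name and the 2x default are my assumptions, not anything from the actual workflow.)

```python
def upscale_dimensions(width: int, height: int, factor: float = 2.0):
    """What the math nodes compute: absolute target dimensions for an
    upscale node that expects pixel values rather than a factor."""
    return int(width * factor), int(height * factor)

# A 1920x1080 source upscaled 2x -> (3840, 2160), no manual math needed
print(upscale_dimensions(1920, 1080))
```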

But using ControlNet increases processing time and memory usage, so if you're using a limited, older graphics card, that will only make things worse.
"Controlnet tile" sounded like another tiled encoder/decoder to me, so since those are the ones preventing my system from crashes, I thought they might be a solution.

None of that probably has anything to do with your crashing problem... which you need to be more specific about anyway. What do you mean by "crashes your computer"?
The computer shuts off completely and immediately, no bluescreen, nothing. I have to disconnect it from power for 5-10 seconds to turn it on again, which points at a problem with memory needing to unload. I already explained this on the past pages.

That's why the tiled VAE encoder and decoder prevent my system from crashing: they work around what is probably my VRAM going over its limit and shutting everything down.

With the tiled VAE versions I've had 0 crashes, while the regular ones often crash my system when they reach 99% and are about to drop the final picture.

Edit: also, I forgot to mention that: you don't need a Tiled VAE Encoder/Decoder there. The Upscale node already outputs a pixel file, so there is no reason or point to encoding the image back to a latent image, just to decode it back to pixels so you can save it. That has zero to do with your crashing issue.
I know that, and yet it has. The crash is happening because the VRAM is probably coming to its limits. A tiled VAE encoder would help me to prevent that. So I thought about whether there's an option to incorporate a tiled version of an encoder into this workflow. I tried the switch at the bottom of the upscaler but that one didn't work.

I am pretty sure I won't be able to fix these shutdowns. This happened with A1111 and ComfyUI, and it doesn't happen with any other heavy rendering or processing software, so I don't think it's my hardware; otherwise I would expect it to always crash when things get heavy. And like I said on previous pages, it only happens when using upscalers, and only when they reach 99% and go into the VAE decode stage. So it's not a problem of overheating or a faulty PSU, otherwise it would be expected to happen more randomly, not always at the exact same stage.

But if you have any other idea how to troubleshoot the issue, let me know :p
 
Last edited:

hkennereth

Member
Mar 3, 2019
239
784
The upscale factor is calculated from the source image itself, so you can throw in whatever picture you want and upscale it by whatever factor you want. You don't have to do the math anymore: the math nodes read the width and height of the original and feed the correct values into the upscaler, so you always get the right output size.
Yeah, but that's why it doesn't make sense. The original field is already a numerical factor, the default value is 2. That means that it will take whatever image size you put there, and make it two times larger. It doesn't matter what the original image size is. I'm not sure what that thing is trying to accomplish there, but it seems mostly useless to me.

"Controlnet tile" sounded like another tiled encoder/decoder to me, so since those are the ones preventing my system from crashes, I thought they might be a solution.
It's not. ControlNet is a way to take information from a source image and use it to help create a new one by providing information about the general shape of the source. In particular, the Tile processor for ControlNet is meant to be used in conjunction with tiled upscalers, breaking the source image into discrete "slices" that are upscaled one at a time and then merged together to build the larger image. It CAN be used in conjunction with the Ultimate SD Upscaler, but it won't do anything to help with your issue; it will just add more VRAM usage, since ControlNet itself requires a considerable amount of VRAM.

None of that has anything to do with the Tiled VAE Encoder/Decoder, which is a way to optimize the conversion between Stable Diffusion's internal latent space images (basically a type of image encoded in a way the AI can understand) and the pixel formats that can be displayed to the user (i.e. you and me) and saved to JPG/PNG files.
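(To make that encode/decode step concrete, here is roughly what it looks like with the diffusers library; a sketch only, and the model name and file name are placeholder examples.)

```python
import numpy as np
import torch
from diffusers import AutoencoderKL
from PIL import Image

# A standalone SD 1.5 VAE; the model id is just a common example
vae = AutoencoderKL.from_pretrained("stabilityai/sd-vae-ft-mse")

img = Image.open("source.png").convert("RGB")        # hypothetical input
x = torch.from_numpy(np.array(img)).float() / 127.5 - 1.0
x = x.permute(2, 0, 1).unsqueeze(0)                  # HWC -> NCHW, [-1, 1]

with torch.no_grad():
    # Pixels -> latent space: 8x smaller per side, 4 channels
    latents = vae.encode(x).latent_dist.sample() * vae.config.scaling_factor
    # Latent space -> pixels again
    decoded = vae.decode(latents / vae.config.scaling_factor).sample

print(tuple(x.shape), "->", tuple(latents.shape))  # e.g. (1,3,512,512) -> (1,4,64,64)
```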

The computer shuts off completely and immediately, no bluescreen, nothing. I have to disconnect it from power for 5-10 seconds to turn it on again, which points at a problem with memory needing to unload. I already explained this on the past pages.

(...)

With the tiled VAE versions I've had 0 crashes, while the regular ones often crash my system when they reach 99% and are about to drop the final picture.
That really sounds like a hardware issue to me, probably related to overheating. There is nothing about Stable Diffusion or ComfyUI that can cause computer crashes like that, and I can assure you that there isn't much you can do with Comfy to completely solve this for you because computers don't crash when they run out of VRAM, they just fail to execute (like the red error messages I asked if you were seeing). The fact that you haven't had this happen when using Tiled VAE codecs is just a coincidence: for whatever reason they haven't made the computer run as hot, but that isn't the underlying issue, and focusing on this part is just a band-aid. Open your machine and check if your fans are working properly.

The crash is happening because the VRAM is probably coming to its limits. A tiled VAE encoder would help me to prevent that. So I thought about whether there's an option to incorporate a tiled version of an encoder into this workflow. I tried the switch at the bottom of the upscaler but that one didn't work.
The Ultimate SD Upscaler doesn't need a VAE Encoder/Decoder because it does that internally. Images need to be in latent space for the KSamplers to generate an image, and they need to be in pixel format for image manipulation like cropping. That plugin slices the image into a grid of overlapping tiles, then encodes each tile, runs it through an img2img KSampler, decodes it, and does the same for the next tile. Then it takes all those pixel tiles and puts them back together into the grid, slightly overlapping with faded edges, to help keep the seams between tiles from being visible.
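(A rough sketch of that loop in Python; `upscale_model`, `ksample_img2img`, and the `vae` calls are hypothetical stand-ins for the ComfyUI internals, so treat this as illustrative pseudocode with running syntax rather than the plugin's actual code.)

```python
def ultimate_upscale_sketch(image, upscale_model, vae, ksample_img2img,
                            tile=512, padding=32, factor=2):
    """Approximation of what Ultimate SD Upscale does internally.
    `image` is a PIL.Image; the callables are hypothetical."""
    # 1) Plain pixel upscale of the whole image first
    big = upscale_model(image, factor)      # e.g. 1024x1024 -> 2048x2048
    out = big.copy()

    # 2) Walk a grid of overlapping tiles over the upscaled image
    for top in range(0, big.height, tile):
        for left in range(0, big.width, tile):
            box = (max(left - padding, 0), max(top - padding, 0),
                   min(left + tile + padding, big.width),
                   min(top + tile + padding, big.height))
            crop = big.crop(box)

            # 3) Per tile: encode -> img2img at low denoise -> decode
            latent = vae.encode(crop)
            latent = ksample_img2img(latent, denoise=0.35)
            refined = vae.decode(latent)

            # 4) Paste back; the real node feathers the overlap here
            out.paste(refined, box[:2])
    return out
```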

Since all of that happens in pixel format, not latent format, there is no need to decode the final resulting image; it's already decoded, hence why the only output says IMAGE, not LATENT_IMAGE. And because each slice is usually small enough (as defined by the Tile Width and Tile Height properties, both 512px by default), this all happens internally at a size that doesn't require Tiled VAE codecs; those are ONLY necessary if the latent image is too large for decoding with the standard codecs, which is never the case for 512px tiles. You seem to have a misunderstanding of what the Tiled VAE Codecs do; there is no magic to them that prevents crashes, so much so that the standard VAE Encoder/Decoder nodes will automatically switch to Tiled mode if the image is too large to be converted normally, after the first attempt fails.

In summary, you are focusing on the wrong problem. As I said, "VRAM coming to its limits" should NEVER cause computer crashes, it should just stop whatever process from running. The real issue here is your machine probably overheating, or perhaps having other hardware related issues that are happening WHILE using SD, but NOT BECAUSE of SD. Adding unnecessary nodes to that workflow won't solve your issue.
 
  • Like
Reactions: Fuchsschweif

Fuchsschweif

Well-Known Member
Sep 24, 2019
1,169
1,994
Yeah, but that's why it doesn't make sense. The original field is already a numerical factor, the default value is 2. That means that it will take whatever image size you put there, and make it two times larger. It doesn't matter what the original image size is. I'm not sure what that thing is trying to accomplish there, but it seems mostly useless to me.
Whether the original image is 1024x1024 or 1920x1080 matters, because you would otherwise always need to input the correct numbers yourself. This way it gathers the height and width and then upscales by exactly two or four times or as much as you wish, so you don't have to change the values manually for every picture you load in.

Here's the video:

That really sounds like a hardware issue to me, probably related to overheating.
This can't be. With the tiled VAE encoder I can produce 50 pictures in a row and really push my machine to give me all its power without any crash. I can easily upscale to 1920x1080. (I have never had a single shutdown when using the tiled VAE encoder, across what is probably 200-300 generations by now.)

With the normal VAE encoder I already get a shutdown when I only want to upscale a single picture from 512x512 to 1024x1024.

Since creating 50 pictures in a row at a higher resolution would be more likely to overheat my computer, this can't be the issue. Also, my GPU never goes above 60-70°C.


and I can assure you that there isn't much you can do with Comfy to completely solve this for you because computers don't crash when they run out of VRAM
Well, whatever it is, it has to do with something the encoder requires, and as far as I'm concerned that's VRAM. There seems to be an issue when outputting the final picture (at 99%) that causes my computer to shut off, and the tiled version prevents this. Since the tiled version splits the process into parts so that VRAM usage is lower, that's my best guess.

Also, the fact that I have to cut the computer off from power for 5-10 seconds in order to reboot speaks for a VRAM issue, because memory only fully clears when you take a system completely off power.

That being said, the SD Ultimate Upscaler did crash my system too, even with the tiled option activated. But the final output was smaller than my 1920x1080 generations that work fine with the tiled VAE encoder... so I don't know how that fits together. All I know is that the only way I seem to be able to upscale is with the tiled VAE encoder.

You seem to have a misunderstanding of what the Tiled VAE Codecs do; there is no magic to them that prevents crashes, so much so that the standard VAE Encoder/Decoder nodes will automatically switch to Tiled mode if the image is too large to be converted normally, after the first attempt fails.
That doesn't seem to be the case. I have tried both across dozens of generations, and the pattern I described above is reproducible every time.

If you can put together an idea based on the hints available, I'm all ears. But overheating won't be the issue, and other heavily demanding software never makes my system crash or shut down, nor do games. In fact, it really only happens when using SD.
 
Last edited:

hkennereth

Member
Mar 3, 2019
239
784
This can't be. With the tiled VAE encoder I can produce 50 pictures in a row and really push my machine to give me all its power without any crash. I can easily upscale to 1920x1080. (I have never had a single shutdown when using the tiled VAE encoder, across what is probably 200-300 generations by now.)
Look, I don't know what else I could tell you. As I said before, that node doesn't need any VAE Encoder or Decoder nodes added before or after it, because it already includes that functionality internally, as it does that continuously for each tile it processes.

If you don't click the TILED_DECODE option at the bottom of the node, it will first attempt to use the normal codecs, and if those are unable to process the tile size, it will automatically switch to the tiled one. Turning that option on will always use the tiled codecs for every step, which is slower for smaller images than the non-tiled version, but it saves the time wasted testing the non-tiled version when you know the tiled one is necessary. Meaning that if you turn that on, you ARE ALREADY always using the tiled encoder and decoder. If that is not solving your issue, that... is because that is not your issue.
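(Schematically, the fallback described above looks something like this; `decode_tiled` and the tile sizes are placeholders for the real internals, not ComfyUI's literal code.)

```python
import torch

def decode_with_fallback(vae, latent):
    """Sketch of the behaviour described above: try the normal decode
    first, and fall back to tiled decoding only if VRAM runs out."""
    try:
        return vae.decode(latent)
    except torch.cuda.OutOfMemoryError:
        torch.cuda.empty_cache()
        # hypothetical tiled decode, one 512px tile at a time
        return vae.decode_tiled(latent, tile_x=512, tile_y=512)
```

Forcing the tiled path via the toggle just skips that first attempt.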

Forgive my bluntness, but you are committing the classic error of mistaking correlation for causation. I don't know WHAT is causing your issue because I don't have access to your machine, but it's not the presence or absence of a Tiled Encoder/Decoder, that's the only certain thing here. Excessive VRAM (or regular RAM, for that matter) usage DOES NOT cause machines to crash, freeze, or reboot. That's just not how computers work. Anyone using ComfyUI with less than a top-of-the-line graphics card probably sees the "Not enough VRAM" error message all the time when experimenting with new workflows, and that's all it does: it says "can't do it", and you try again. It does not crash one's machine. That's a problem with YOUR machine, not with ComfyUI or with Stable Diffusion.

I don't know why it doesn't happen with other things; there are a million reasons why that could be the case. But I know it's misguided to assume that ONE thing is the absolute cause without any evidence beyond "those two things happen in close proximity", especially after hearing detailed explanations of why that cannot be the case, including the fact that your proposed solution is already in there, just behind the scenes.
 
  • Like
Reactions: theMickey_

Fuchsschweif

Well-Known Member
Sep 24, 2019
1,169
1,994
Look, I don't know what else I could tell you. As I said before, that node doesn't need any VAE Encoder or Decoder nodes added before or after it, because it already includes that functionality internally, as it does that continuously for each tile it processes.
I got that; my reply was referring to the idea that those shutdowns would be overheating-related.

Forgive my bluntness, but you are committing the classic error of mistaking correlation for causation. I don't know WHAT is causing your issue because I don't have access to your machine, but it's not the presence or absence of a Tiled Encoder/Decoder, that's the only certain thing here.
I am not mistaking correlation for causation. I am telling you that with the tiled encoders I never got a single crash in hundreds of generations, while a standard decoder will crash my system instantly, even with far less demanding upscales.

You said my shutdowns would have nothing to do with whether I use the tiled or normal nodes, but that is in fact how it is.

I can't tell you why that is either, but that's how it is.

It does not crash one's machine. That's a problem with YOUR machine, not with ComfyUI or with Stable Diffusion.
I never claimed that it's SD or ComfyUI alone. I described when it happens, so that this information might lead to ideas about what could be happening in the background.

That it only happens with the standard encoders, and that it only happens at 100% of the upscaling process, are two valuable hints for you or someone else who knows more than me about how SD works to maybe come up with a good idea.

And these hints are also enough to figure out that it can't be overheating or a hardware issue, because you wouldn't expect those to occur this precisely. If a system were unstable due to an insufficient PSU or overheating, the shutdowns would occur randomly.
 

theMickey_

Engaged Member
Mar 19, 2020
2,248
2,938
You said my shutdowns would have nothing to do with whether I use the tiled or normal nodes, but that is in fact how it is.
I can't tell you why that is either, but that's how it is.
I'm with hkennereth -- it's most probably an overheating issue. I'm more than 95% sure about that.

If you want to prove it, do the following:
(In fact, I do not recommend doing this, because it can actually damage your hardware):
  • Go into your BIOS/UEFI and turn off every feature that will shut down your computer at a certain temperature level
  • Go into your GPU's driver settings and see if it has a similar setting for shutting down your computer when it gets too hot, and turn it off
  • If you have a 3rd-party system monitoring tool that can also shut down your computer at certain temperatures, turn that off
Then run ComfyUI again and your computer will not shut down. Your computer or your house might be on fire after this test, but at least it will rule out your "This can't be." response to someone suggesting this might be an overheating problem.


To be fair: I've seen posts from you all over this forum in the past couple of days: I've answered some of your questions in the Virt-a-Mate thread, people have been telling you that DAZ3D isn't the right tool for real-time animations in the DAZ thread, and you're here asking why ComfyUI is causing your computer to crash. All these questions tell me that you're new to all this 3D, VR and AI stuff (which is absolutely fine, as we all have been newbies at some point, and this is what this forum is about -- asking things and learning stuff), but if someone gives you a reasonable answer (overheating), please do not just reply "This can't be". At least consider it as a valid point and act accordingly: check your computer, check the fans, do some tests, check your settings, etc.

As hkennereth said: full VRAM will not cause your computer to shut down. It will cause some ComfyUI actions to fail and print error messages, or you might not be able to start additional GPU-heavy tasks while ComfyUI is running. Full RAM will also not shut down your system; instead, your OS might start killing tasks it no longer considers important. That might lead to programs failing/crashing, but not to a full shutdown of your computer. A crash (without a blue screen, just an instant "turn off") is almost always temperature-related. So please start investigating/checking...
 

hkennereth

Member
Mar 3, 2019
239
784
I am not mistaking correlation for causation. I am telling you that with the tiled encoders I never got a single crash in hundreds of generations, while a standard decoder will crash my system instantly, even with far less demanding upscales.

You said my shutdowns would have nothing to do with whether I use the tiled or normal nodes, but that is in fact how it is.

I can't tell you why that is either, but that's how it is.
Sorry, but you are making that mistake. Yes, those things happened, I believe you, but you are jumping to the conclusion that one is the CAUSE of the other, and I'm exhaustively explaining to you why that cannot be the case. And you keep insisting that you should be able to add that node somewhere in your workflow to solve the problem, despite the explanation of why that node has no function in that workflow, because the functionality it provides is already happening somewhere else. The node does not perform any magic, and as I explained in the previous post, you can get the exact same functionality by using that toggle at the bottom, the one you said didn't work for you.

Just so you don't think I'm claiming to be smarter and that you should trust me just because... here is one more attempt to explain exactly what the Ultimate SD Upscale node does. Well, a very crappy version of it that fails to do what that node is best at, but one that contains the same fundamental functionality.

The workflow below does the same basic things that the Ultimate SD Upscale node does... except the node does them much better (the image contains the actual usable workflow if you load it into ComfyUI):

workflow.png

It takes an image as an input. My crappy workflow will only work for a 1024x1024 image, but the node can handle images of any size, of course.

It crops that image into a bunch of separate slices, still in pixel mode. Again, my limited workflow slices them right at the edge of each other, but the node crops them with a little overlap, as defined by the TILE_PADDING property, which helps merge the final image more naturally.

It upscales each slice using a pixel upscale model you select, by the multiplication factor you choose in the UPSCALE_BY property. The upscale model here is a 4x upscaler, so the upscale node is set to 0.5 to get half of that: 2x.

It encodes the image using the VAE into a latent image.

It uses that latent image as the source for img2img in the KSampler, with a low Denoise value to preserve the original image's look.

It decodes the image from latent space into pixels again using the VAE.

It composites each image back together, using a mask to help hide the edges between slices. In my workflow I didn't use any mask and just fit the images side by side without overlap, which will look obvious in the final image. (See the sketch of the masked version right below.)
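(To illustrate the masked compositing step that my crappy workflow skips, here is a small sketch with PIL; the function and the feather width are made up for the example.)

```python
from PIL import Image, ImageDraw, ImageFilter

def paste_feathered(canvas, tile, left, top, feather=32):
    """Paste a tile whose edges fade out, so overlapping tiles blend
    into each other instead of showing hard seams."""
    mask = Image.new("L", tile.size, 0)
    draw = ImageDraw.Draw(mask)
    # Fully opaque core, inset by the feather width...
    draw.rectangle(
        (feather, feather, tile.width - feather, tile.height - feather),
        fill=255,
    )
    # ...blurred so the opacity ramps down toward the tile borders
    mask = mask.filter(ImageFilter.GaussianBlur(feather / 2))
    canvas.paste(tile, (left, top), mask)
```

With overlapping tiles (the TILE_PADDING idea), the faded border of one tile lands on solid pixels from its neighbor, which is what hides the seams.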

------

And that's it. All of that is what happens inside that one node, except it does some additional things to get a decent quality image out of it, unlike this example. You can't add a tiled VAE before or after... because the node needs a decoded pixel image as the source for its initial steps, and it will automatically output a decoded pixel image at the end, since it needs to merge the tiles together in pixel space.

For reference, here is the original 1024x1024 px image, and the crappy composited 2048x2048 px upscaled version (but each tile looks... not bad, actually):

1707101195878.png 1707101247284.png
 
  • Like
Reactions: sharlotte

Fuchsschweif

Well-Known Member
Sep 24, 2019
1,169
1,994
All these questions tell me that you're new to all this 3D, VR and AI stuff (which is absolutely fine, as we all have been newbies at some point, and this is what this forum is about -- asking things and learning stuff), but if someone gives you a reasonable answer (overheating), please do not just reply "This can't be". At least consider it as a valid point and act accordingly: check your computer, check the fans, do some tests, check your settings, etc.
I am not new to "this stuff". I've been building my own computers from scratch for the past 18 years, I have a degree in digital media, and I have plenty of experience with professional video and audio rendering software.

I work with my computer every day, often with demanding professional software. I know when my computer gets hot; I know how much warmer it gets when I play games with maxed-out graphics settings, something you can even feel by hand. We're still in winter, the apartment is cold, and my computer is nowhere near anything that feels like heat.

Now, the reason I said it can't be overheating-related is that I wanted to save us time. We're having the equivalent of someone stating "I updated all my drivers yesterday and the issues still persist" in the OP, while other users still try to walk him through updating his drivers and checking the obvious.

Also, think about it: VAE decoders only run for 2-3 seconds, not longer. How is a system supposed to overheat from 0 to 100 during 2-3 seconds of VAE decoding?

Of course, that's physically impossible.

But since you seem to be in deep disbelief about my ability to understand computers, I did a fact check:

This is at the start of SD:

SD Start.png

These are my temperatures approx. 2-3 seconds before the shutdown:

2.png

As you can see, there's nothing even remotely within dangerous temperature numbers.

Mainboard max temp was 67°C, the GPU peaked at 69.1°C, and the CPU peaked at 67°C.

In order to cause an emergency shutdown due to overheating, the VAE decoder would need to raise these temperatures by another 30-50°C in only 2-3 seconds.

but you are jumping to the conclusion that one is the CAUSE of the other, and I'm exhaustively explaining to you why that cannot be the case.
When I attempt to decode the upscaled version of a picture and, out of 50 attempts, it caused my computer to shut down 50 times, at exactly 100%, at the exact same spot every single time, and only with upscalers, then there's definitely a causal link.

That doesn't mean, and I repeat this, that SD is faulty software that makes my computer shut down due to a bug. It means that my computer has some problem with whatever SD is trying to do in that exact moment. Still, how and when it happens are valuable hints for troubleshooting the issue. It's a problem you can pin down to the exact same short 2-3 second time frame, with 100% accuracy.

A computer with insufficient cooling due to slow fans, bad airflow, old thermal paste or whatever would also shut down when you play Cyberpunk 2077 with maxed-out graphics, and it would also shut down when you render high-res videos in After Effects. It would certainly not shut down only while using SD, since it would suffer from a general lack of proper cooling during any demanding task that creates a significant increase in temperature.

I am trying to explain that the fact it only happens when SD tries to finalize/save an image that has been upscaled with anything other than a tiled decoder node is a hint about what might and might not be the issue, and overheating is definitely not in the pool of possibilities, as it doesn't fit the pattern.

And you keep insisting that you should be able to add that node somewhere in your workflow to solve the problem
I never did that. I asked in the OP if anyone knows whether I can break that workflow up so that I can incorporate a tiled VAE decoder. The idea was to split the process that causes my computer to shut down into an easier one. You said it's not possible, and that it also would not be the reason my computer shuts down.

From then on I only tried to explain to you that my system does in fact do exactly that when I attempt to use anything except a tiled decoder. That has nothing to do with insisting there must be a way to incorporate one into that workflow; I explained to you why I asked that question. Two different things.

And that's it. All of that is what happens inside that one node, except it does some additional things to get a decent quality image out of it, unlike this example. You can't add a tiled VAE before or after... because the node needs a decoded pixel image as the source for its initial steps, and it will automatically output a decoded pixel image at the end, since it needs to merge the tiles together in pixel space.
Well, there are a lot of VAE decoders in your shared workflow. If I replaced them with tiled versions, my system wouldn't shut down (just in theory; I'm aware your workflow is a showcase and not a proper replacement). If the tiled button on the SD Ultimate Upscaler worked the way the tiled VAE decoders work, it also wouldn't crash.

But there seems to be a difference between how SD Ultimate operates, even with the tiled button activated, and KSampler -> tiled VAE decoder -> save image.

Anyways, thanks for the effort of showcasing in your workflow how SD ultimate works, I appreciate it.
 
Last edited:
  • Hey there
Reactions: DD3DD

hkennereth

Member
Mar 3, 2019
239
784
I really want to put a pin in this subject so we can all go back to discussing image generation techniques, but let's just go through it one last time:

So you're telling us that you are 100 percent sure, without question or any doubt in your mind, that the issue is not overheating? Well, fantastic. Now there are only about 50 other things in your hardware you can look into to try and troubleshoot your issue. Our point was not "the problem is overheating"; the point we were making was "the problem is almost without a doubt in the hardware, and overheating is the most likely culprit, so that's what we would look into first".

The comparison with other software that is also very heavy, like games such as Cyberpunk 2077, is flawed, however, because pretty much every software like that is designed to avoid ever utilizing 100% of the system's resources, precisely to avoid crashes. That is not the case with SD, which basically starts a new process from scratch for every operation, so if it crashes you just start it over. It's not a very well designed piece of software from a stability point of view... because it doesn't have to be.

The other point I have made since the beginning is that...

If the tiled button on the SD Ultimate Upscaler worked the way the tiled VAE decoders work, it also wouldn't crash.
... the tiled button DOES work the way the tiled VAE decoder works, because that is exactly what is used internally. I don't mean it does something similar; I mean it actually uses the exact same piece of code. The person who created the SD Ultimate Upscaler didn't write a new tiled VAE decoder, it just calls the one that comes with ComfyUI, so when you click that little button, internally it's exactly the same as going to my little example workflow and replacing all the VAE codecs with their tiled versions.

So if that option isn't working for you and it is still crashing, then the tiled VAE codecs are not the answer you are looking for... which is the point I have made in pretty much every reply. It's WHY I created that example workflow, so you would understand that, but you still think there must be a different reason. There isn't.

I don't know how else I could possibly explain that you are barking up the wrong tree, but let's try one more time, as explicitly as possible: your issue is not caused by VAE encoders and decoders, and it will not be solved by using tiled VAE encoders and decoders. Your issue is most likely caused by a hardware problem, but even if it's a software problem, it's something on YOUR machine, not something you can fix by changing settings/nodes/plugins within ComfyUI, and we can't offer proper suggestions because we don't have access to your machine. The fact that you haven't seen this in other software is just a coincidence: those other apps didn't push the machine in the specific manner SD does to trigger the event, but that doesn't mean that changing something in SD will prevent it. If you never got shot playing Russian roulette, but the gun suddenly fires when you're wearing a glove, it wasn't the glove that caused the shot to fire.
 

FallingDown90

Member
Aug 24, 2018
123
38
Hi everyone
I'm sorry to bother you, but I'm not a native English speaker, so it's really difficult for me to search the threads.
Can you tell me if Stable Diffusion is still among the best, or if it has been overtaken by XL or XL Turbo or something else?

I like the idea of generating images in real time, but I understand that interface still works with the node system and differs from Stable Diffusion... I don't understand anything anymore.

Can I ask what you suggest I use?
 

Fuchsschweif

Well-Known Member
Sep 24, 2019
1,169
1,994
Stable Diffusion is always the underlying technology, no matter whether you use A1111, ComfyUI or anything else. Checkpoints/models like 1.5, SDXL or Turbo are all available on their own; you can use whichever you want.

1.5 is the older generation, mainly trained on 512x512 pictures. SDXL is newer and mainly trained on 1024x1024 pictures. There is more data in it, which means the models have more variety to offer, but they're also more resource-hungry.

Turbo is, as far as I know, currently mainly for research purposes. It lets you generate pictures almost in real time, but only at low quality. It's good for quick experiments, though.

So you can just go with either A1111 or ComfyUI, whichever you prefer, and then decide for yourself which models to use.
 

Fuchsschweif

Well-Known Member
Sep 24, 2019
1,169
1,994
So you're telling us that you are 100 percent sure, without question or any doubt in your mind, that the issue is not overheating? Well, fantastic. Now there are only about 50 other things in your hardware you can look into to try and troubleshoot your issue. Our point was not "the problem is overheating"; the point we were making was "the problem is almost without a doubt in the hardware, and overheating is the most likely culprit, so that's what we would look into first".
And I never denied that it could be my hardware or system; I just said overheating won't be the issue, which was something you guys seemed hung up on. Also, I don't know why you asked me to be more specific about my crashes only to tell me that I should basically go troubleshoot the 50 other possibilities. It sounded like you were onto something and wanted to help out.

I don't know how else I could possibly explain that you are barking up the wrong tree, but let's try one more time, as explicitly as possible: your issue is not caused by VAE encoders and decoders, and it will not be solved by using tiled VAE encoders and decoders. Your issue is most likely caused by a hardware problem, but even if it's a software problem, it's something on YOUR machine, not something you can fix by changing settings/nodes/plugins within ComfyUI, and we can't offer proper suggestions because we don't have access to your machine. The fact that you haven't seen this in other software is just a coincidence: those other apps didn't push the machine in the specific manner SD does to trigger the event, but that doesn't mean that changing something in SD will prevent it. If you never got shot playing Russian roulette, but the gun suddenly fires when you're wearing a glove, it wasn't the glove that caused the shot to fire.
Dude, I got that from the very beginning, but you don't seem to understand my approach to troubleshooting.

It was never about SD being bugged or the VAE decoder being the cause itself. It was simply about figuring out what the tiled VAE decoder does that the standard VAE decoder does not, so that I could at least roughly narrow down the direction I have to look in.

Anyway, I ran another test. Since shutdowns have never yet happened with the tiled decoder, even with larger upscales than the ones causing a shutdown, I tried going even bigger (with the tiled version) to stress my system further and see if I could cause a shutdown.

And it actually happened, with a very large upscale size far beyond the resolutions that only work for me with the tiled version but not with the standard ones.

So now I think the most likely reason for the shutdowns is the PSU. This theory didn't make sense before, because the shutdowns never happened during the heavy, demanding upscaling but only in that short 3-second conversion window; but since I could now trigger a shutdown during the upscaling process for the first time, it points at running out of power or some instability in the PSU.

If it were only the GPU at its limit, I would expect it to just throttle down or the display to shut off. Overheating I ruled out with measurements yesterday. Faulty communication between components on the mainboard would probably lead to a bluescreen crash, since conflicting protocols would cause a software error.

Unless one of you comes up with an objection, I'll just go with a new PSU, as it seems to be the most likely culprit.
 
Last edited:

Fuchsschweif

Well-Known Member
Sep 24, 2019
1,169
1,994
I asked this a few pages ago, but we never figured out where my workflow fails at this:

Let's say I liked the 2nd generation in the middle, but not some of the details that were added in the final upscale (right side).

If I only change settings in the final KSampler, say decrease the denoise from 0.55 to 0.50 and the CFG from 8 to 6, and hit Queue Prompt, Comfy should start at the final KSampler, since everything before it has the same settings and I changed the seed to fixed after I liked the output, right?

Well, that doesn't work for me... when I do that and hit Queue Prompt, I get an entirely new generation from the very beginning.

What am I doing wrong?

1707158411509.png

ComfyUI_00311_.png
 

hkennereth

Member
Mar 3, 2019
239
784
No idea for sure, but maybe it's because you are sharing the seed value between KSamplers using that primitive. I have never done that and never had that issue, so it's my only guess.

There is a plugin called Inspire Pack that contains a number of different utility nodes, among them one called Global Seed. It overrides and controls the seed values of every single KSampler in your workflow without needing to be connected to the rest of the flow at all. I use it often when I want to fine-tune a single image without having to mess with the settings of my KSamplers.

1707160753207.png
 
  • Like
Reactions: Fuchsschweif

Fuchsschweif

Well-Known Member
Sep 24, 2019
1,169
1,994
That's what the primitive node does too :D It takes the seed from the first sampler and then feeds it into KSamplers 2 and 3 to keep it consistent across all samplers.
 

hkennereth

Member
Mar 3, 2019
239
784
That's what the primitive node does too :D It takes the seed from the first sampler and then feeds it into ksampler 2 and 3 to keep it consistent between all samplers.
Yes, I am aware; it's why I mentioned it. My point was that I don't know if the fact that it is connected to all samplers is what causes all of them to re-render when you change the settings of one of them -- I would suggest disconnecting it, setting each sampler manually one at a time, and seeing if the problem persists. And I'm letting you know that the node I use does not cause the issue (each sampler gets cached correctly when using it), while offering the same level of control.
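(For intuition, here is a toy version of the input-based caching a node graph like Comfy's relies on; this illustrates the general idea only, with made-up attributes, not ComfyUI's actual code.)

```python
def needs_rerun(node, cache):
    """Toy graph caching: a node re-executes when its own settings
    changed or when anything upstream had to re-execute.
    `node.inputs`, `node.settings`, `node.id` are hypothetical."""
    # Evaluate every upstream node (list, not generator, so all get visited)
    upstream_dirty = any([needs_rerun(up, cache) for up in node.inputs])
    dirty = upstream_dirty or cache.get(node.id) != node.settings
    if dirty:
        cache[node.id] = node.settings  # remember the state we rendered with
    return dirty
```

If a shared seed widget counts as part of the first sampler's settings, flipping it from "random" to "fixed" changes an input at the top of the chain, and everything downstream re-renders, which would be consistent with what you're seeing.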
 
  • Like
Reactions: Fuchsschweif

Fuchsschweif

Well-Known Member
Sep 24, 2019
1,169
1,994
I tried it now; the problem still persists. It only works when I start with a fixed seed on all samplers right away.

But when a new random seed gives me outputs that I like, there is no way to re-run only the 2nd or 3rd sampler; it always starts from scratch. Probably because I change the first sampler from "random" to "fixed".

But if I don't do that, I get new seeds all the time. And if I have to put in the seed first and set it to fixed, I can't keep working from a first randomized generation that I like and now want to "lock in".
 
Last edited: