[Stable Diffusion] Prompt Sharing and Learning Thread

Fuchsschweif

Active Member
Sep 24, 2019
986
1,563
All these questions you've been asking tell me that you're new to all this 3D, VR and AI stuff (which is absolutely fine as we all have been newbies at some point and this is what this forum is about -- to ask things and learn stuff), but if someone then gives you a reasonable answer (overheating), please do not just reply "This can't be". At least consider it a valid point and act accordingly: check your computer system, check the fans, do some tests, check your settings, etc.
I am not new to "this stuff". I've been building my own computers from scratch for the past 18 years, I have a degree in digital media, and I have plenty of experience with professional video and audio rendering software.

I work with my computer every day, often with demanding professional software. I know when my computer gets hot, and I know how much warmer it gets when I play games with maxed-out graphics settings; that's something you can even feel by hand. It's still winter, the apartment is cold, and my computer is nowhere near anything that feels like it's even heating up.

Now, the reason I said it can't be overheating-related is that I wanted to save us time. We're having the equivalent of someone stating "I updated all my drivers yesterday and the issues still persist" in the OP, and other users still trying to walk him through updating his drivers and checking the obvious.

Also, think about it: the VAE decoder only runs for 2-3 seconds, not longer. How is a system supposed to go from fine to overheating within 2-3 seconds of VAE decoding?

Of course, that's physically impossible.

But since you seem to be in deep disbelief about my ability to understand computers, I did a factual check:

This is at the start of SD:

SD Start.png

These are my temperatures approx. 2-3 seconds before the shutdown:

2.png

As you can see, nothing is even remotely close to dangerous temperatures.

The mainboard's max temp was 67°C, the GPU peaked at 69.1°C and the CPU peaked at 67°C.

In order to cause an emergency shutdown due to overheating, the VAE decoder would need to bring these temperatures up by another 30-50°C in only 2-3 seconds.

but you are jumping to the conclusion that one is then the CAUSE of the other, and I'm exhaustively explaining to you why that cannot be the case.
When I attempt to decode the upscaled version of a picture and it shuts my computer down 50 times out of 50, at exactly 100%, at the exact same spot every single time, and only with upscalers, then there's definitely causality.

That doesn't mean, and I repeat that, that SD is faulty software that makes my computer shut down due to a bug. It means that my computer has some problem with whatever it is trying to do in that exact moment. Still, how and when it happens are valuable hints for troubleshooting the issue. It's a problem that you can pin down to the exact same short time frame of 2-3 seconds, with 100% accuracy.

A computer with insufficient cooling due to slow fans, bad airflow, old thermal paste or whatever would also shut down when you play Cyberpunk 2077 with maxed-out graphics, and it would also shut down when you render high-res videos in After Effects. It would certainly not only shut down while using SD, since it would suffer from a general lack of proper cooling during any demanding task that creates a significant increase in temperature.

I am trying to explain that the fact that it only happens when SD tries to finalize/save an image that has been upscaled with anything other than a tiled decoder node gives a hint about what might and might not be the issue, and overheating is definitely not in the pool of possibilities, as it doesn't fit the pattern.

And you keep insisting that you should be able to add that node somewhere in your workflow to solve the problem
I never did that. I asked in the OP whether anyone knows if I can break that workflow up so that I can incorporate a tiled VAE decoder. The idea was to split the process that causes my computer to shut down into a lighter one. You said it's not possible, and that it also would not be the reason my computer shuts down.

From then on I only tried to explain to you that my system does in fact exactly that when I attempt to use anything except a tiled decoder. That has nothing to do with insisting that there must be a way to incorporate one into that workflow; I explained to you why I asked that question. Two different things.

And that's it. All of that is what happens inside that one node, except it does some additional things to get a decent-quality image out of it, unlike this example. You can't add a tiled VAE before or after... because the node needs a decoded pixel image as the source for its initial steps, and it will automatically output a decoded pixel image at the end, since it needs to merge the images together in pixel space.
Well, there are a lot of VAE decoders in your shared workflow. If I replaced them with tiled versions, my system wouldn't shut down (in theory; I'm aware your workflow is just a showcase and not a proper replacement). If the tiled button on the SD Ultimate Upscaler worked the way the tiled VAE decoders work, it also wouldn't crash.

But there seems to be a difference between how SD Ultimate operates, even with the tiled button activated, and KSampler -> tiled VAE decoder -> Save Image.

Anyway, thanks for the effort of showcasing how SD Ultimate works in your workflow; I appreciate it.
 
Last edited:
  • Hey there
Reactions: DD3DD

hkennereth

Member
Mar 3, 2019
232
746
I am not new to "this stuff". I've been building my own computers from scratch for the past 18 years, I have a degree in digital media, and I have plenty of experience with professional video and audio rendering software.

I work with my computer every day, often with demanding professional software. I know when my computer gets hot, and I know how much warmer it gets when I play games with maxed-out graphics settings; that's something you can even feel by hand. It's still winter, the apartment is cold, and my computer is nowhere near anything that feels like it's even heating up.

Now, the reason I said it can't be overheating-related is that I wanted to save us time. We're having the equivalent of someone stating "I updated all my drivers yesterday and the issues still persist" in the OP, and other users still trying to walk him through updating his drivers and checking the obvious.

Also, think about it: the VAE decoder only runs for 2-3 seconds, not longer. How is a system supposed to go from fine to overheating within 2-3 seconds of VAE decoding?

Of course, that's physically impossible.

But since you seem to be in deep disbelief about my ability to understand computers, I did a factual check:

This is at the start of SD:

View attachment 3327118

These are my temperatures approx. 2-3 seconds before the shutdown:

View attachment 3327119

As you can see, nothing is even remotely close to dangerous temperatures.

The mainboard's max temp was 67°C, the GPU peaked at 69.1°C and the CPU peaked at 67°C.

In order to cause an emergency shutdown due to overheating, the VAE decoder would need to bring these temperatures up by another 30-50°C in only 2-3 seconds.



When I attempt to decode the upscaled version of a picture and it shuts my computer down 50 times out of 50, at exactly 100%, at the exact same spot every single time, and only with upscalers, then there's definitely causality.

That doesn't mean, and I repeat that, that SD is faulty software that makes my computer shut down due to a bug. It means that my computer has some problem with whatever it is trying to do in that exact moment. Still, how and when it happens are valuable hints for troubleshooting the issue. It's a problem that you can pin down to the exact same short time frame of 2-3 seconds, with 100% accuracy.

A computer with insufficient cooling due to slow fans, bad airflow, old thermal paste or whatever would also shut down when you play Cyberpunk 2077 with maxed-out graphics, and it would also shut down when you render high-res videos in After Effects. It would certainly not only shut down while using SD, since it would suffer from a general lack of proper cooling during any demanding task that creates a significant increase in temperature.

I am trying to explain that the fact that it only happens when SD tries to finalize/save an image that has been upscaled with anything other than a tiled decoder node gives a hint about what might and might not be the issue, and overheating is definitely not in the pool of possibilities, as it doesn't fit the pattern.



I never did that. I asked in the OP whether anyone knows if I can break that workflow up so that I can incorporate a tiled VAE decoder. The idea was to split the process that causes my computer to shut down into a lighter one. You said it's not possible, and that it also would not be the reason my computer shuts down.

From then on I only tried to explain to you that my system does in fact exactly that when I attempt to use anything except a tiled decoder. That has nothing to do with insisting that there must be a way to incorporate one into that workflow; I explained to you why I asked that question. Two different things.



Well, there are a lot of VAE decoders in your shared workflow. If I replaced them with tiled versions, my system wouldn't shut down (in theory; I'm aware your workflow is just a showcase and not a proper replacement). If the tiled button on the SD Ultimate Upscaler worked the way the tiled VAE decoders work, it also wouldn't crash.

But there seems to be a difference between how SD Ultimate operates, even with the tiled button activated, and KSampler -> tiled VAE decoder -> Save Image.

Anyway, thanks for the effort of showcasing how SD Ultimate works in your workflow; I appreciate it.
I really want to put a pin in this subject so we can all go back to discussing image generation techniques, but let's just go through it one last time:

So you're telling us that you are 100 percent certain, without question or any doubt in your mind, that the issue is not overheating? Well, fantastic. Now there are only about 50 other things in your hardware that you can look into to try to troubleshoot your issue. Our point was not "the problem is overheating"; the point we were making was "the problem is almost without a doubt in the hardware, and overheating is the most likely culprit, so that's what we would look into first".

The argument that other very heavy software you use, like games such as Cyberpunk 2077, runs fine is flawed, however, because pretty much every application like that is designed to avoid ever utilizing 100% of the system's resources, precisely to avoid crashes. That is not the case with SD, which basically starts a new process from scratch for every operation, so if it crashes you just start it over. It's not a very well-designed piece of software from a stability point of view... because it doesn't have to be.

The other point I have made since the beginning is that...

If the tiled button on the SD Ultimate Upscaler worked the way the tiled VAE decoders work, it also wouldn't crash.
... the tiled button DOES work the way the tiled VAE decoder works, because that is exactly what is being used internally. I don't mean it's doing something similar; I mean it's actually using the exact same piece of code. The person who created the SD Ultimate Upscaler didn't write a new tiled VAE decoder, it just calls the one that comes with ComfyUI, so when you click that little button, internally it's exactly the same as going to my little example workflow and replacing all the VAE decoders with their tiled versions.
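To make "tiled" concrete, here's a rough Python sketch of the idea -- my own illustration, not ComfyUI's or the upscaler's actual code, and `vae_decode` is just a stand-in for the real decoder:

```python
# Illustration only: a generic tiled VAE decode. The non-tiled path is the same
# math with a single tile covering the whole latent, which needs far more peak
# VRAM at once; the per-tile work is identical either way.
import torch

def tiled_vae_decode(latent, vae_decode, tile=64, scale=8):
    """latent: (1, C, H, W) latent tensor; vae_decode: stand-in callable that
    maps a latent tile to a pixel tile; scale: latent-to-pixel factor (8 for SD)."""
    _, _, h, w = latent.shape
    out = torch.zeros(1, 3, h * scale, w * scale)
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            patch = latent[:, :, y:y + tile, x:x + tile]
            ph, pw = patch.shape[2], patch.shape[3]
            out[:, :, y * scale:(y + ph) * scale,
                      x * scale:(x + pw) * scale] = vae_decode(patch)
    return out
```

(A real implementation typically also overlaps and blends the tiles to hide seams, but the memory story is the same.)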

So if that option isn't working for you and it is still crashing, then the tiled VAE decoders are not the answer you are looking for... which is the point I have made in pretty much every reply I gave you. It's WHY I created that example workflow, so you would understand that, but you are still thinking there must be a different reason. There isn't.

I don't know how else I could possibly explain to you that you are barking up the wrong tree, but let's try one more time, as explicitly as possible: your issue is not caused by VAE encoders and decoders, and it will not be solved by using tiled VAE encoders and decoders. Your issue is most likely caused by a hardware problem, but even if it's a software problem, it's something on YOUR machine, not something you can fix by changing settings/nodes/plugins within ComfyUI, and we can't offer proper suggestions because we don't have access to your machine. The fact that you haven't seen this in other software is just a coincidence, because those other apps didn't push the machine in the specific manner SD does to trigger the event, but that doesn't mean that changing something in SD will prevent it. If you never got shot playing Russian roulette, but the gun suddenly fires when you're wearing a glove, it wasn't the glove that caused the shot to fire.
 

FallingDown90

Member
Aug 24, 2018
115
38
Hi everyone
I'm sorry to bother you, but I'm not a native English speaker, so it's really difficult for me to search the threads.
Can you tell me if Stable Diffusion is still among the best, or if it has been overtaken by XL or XL Turbo or something else?

I like the idea of generating images in real time, but I understand that the interface still works with the node system and differs from Stable Diffusion... I don't understand anything anymore.

Can I please ask you what you suggest I use?
 

Fuchsschweif

Active Member
Sep 24, 2019
986
1,563
Hi everyone
I'm sorry to bother you, but I'm not a native English speaker, so it's really difficult for me to search the threads.
Can you tell me if Stable Diffusion is still among the best, or if it has been overtaken by XL or XL Turbo or something else?

I like the idea of generating images in real time, but I understand that the interface still works with the node system and differs from Stable Diffusion... I don't understand anything anymore.

Can I please ask you what you suggest I use?
Stable Diffusion is always the underlying application, no matter whether you use A1111, ComfyUI or anything else. Checkpoints/models like 1.5, SDXL or Turbo are all available on their own; you can use whichever you want.

1.5 is the older generation, mainly trained on 512x512 pictures. SDXL is newer and mainly trained on 1024x1024 pictures. There is more data in it, which means the models have more variety to offer, but they're also more resource-hungry.

Turbo is, as far as I know, right now mainly for research purposes. It allows you to generate pictures almost in real time, but only at low quality. It's good for quick experiments, though.

So you can just roll with either A1111 or ComfyUI, whichever you prefer, and then decide for yourself which models you use.
 

Fuchsschweif

Active Member
Sep 24, 2019
986
1,563
So you're telling us that you are 100 percent certain, without question or any doubt in your mind, that the issue is not overheating? Well, fantastic. Now there are only about 50 other things in your hardware that you can look into to try to troubleshoot your issue. Our point was not "the problem is overheating"; the point we were making was "the problem is almost without a doubt in the hardware, and overheating is the most likely culprit, so that's what we would look into first".
And I never denied that it could be my hardware or system; I just said overheating won't be the issue, which was something you guys seemed to be hung up on. Also, I don't know why you asked me to be more specific about my crashes, only to tell me that I should basically go troubleshoot the other 50 possibilities. It sounded like you were onto something and wanted to help out.

I don't know how else I could possibly explain to you that you are barking up the wrong tree, but let's try one more time, as explicitly as possible: your issue is not caused by VAE encoders and decoders, and it will not be solved by using tiled VAE encoders and decoders. Your issue is most likely caused by a hardware problem, but even if it's a software problem, it's something on YOUR machine, not something you can fix by changing settings/nodes/plugins within ComfyUI, and we can't offer proper suggestions because we don't have access to your machine. The fact that you haven't seen this in other software is just a coincidence, because those other apps didn't push the machine in the specific manner SD does to trigger the event, but that doesn't mean that changing something in SD will prevent it. If you never got shot playing Russian roulette, but the gun suddenly fires when you're wearing a glove, it wasn't the glove that caused the shot to fire.
Dude, I got that from the very beginning, but you don't seem to understand my approach to troubleshooting.

It was never about SD being bugged or the VAE decoder itself being the cause. It was simply about figuring out what the tiled VAE decoder does that the regular VAE decoder does not, so that I could at least roughly narrow down the direction I have to look in.

Anyway, I ran another test. Since shutdowns have never happened with the tiled decoder, even with larger upscales than those causing a shutdown, I tried to go even bigger (with the tiled version) to stress my system further and see if I could cause a shutdown.

And it actually happened, with a very big upscale size that goes far beyond the sizes that only work with the tiled version for me but not with the standard one.

So now I think the most likely reason it shuts down has to do with the PSU. This theory didn't make sense before, because shutdowns never happened during the heavy, demanding upscaling but only in that short three-second time frame of conversion; but since I could now trigger a shutdown during the upscaling process itself for the first time, it points to running out of power or some instability in the PSU.

If it were only the GPU at its limit, I would expect it to just throttle down or the display to shut off. Overheating I ruled out with measurements yesterday. Faulty communication between components on the mainboard would probably lead to a bluescreen crash, since conflicting protocols would cause a software error.

Unless one of you comes up with an objection, I tend to just go with a new PSU, as it seems to be the most likely cause.
 
Last edited:

Fuchsschweif

Active Member
Sep 24, 2019
986
1,563
I asked this a few pages ago, but we didn't figure out where my workflow fails at this:

Let's say I liked the 2nd generation in the middle, but not some of the details that have been added in the final upscale (right side).

If I only change settings in the final KSampler, say decrease the denoise from 0.55 to 0.50 and the CFG from 8 to 6, and hit Queue Prompt, Comfy should start at the final KSampler, since everything before it has the same settings and I changed the seed to fixed after I liked the output, right?

Well, that doesn't work for me... when I do that and hit Queue Prompt, I get an entirely new generation from the very beginning.

What am I doing wrong?

1707158411509.png

ComfyUI_00311_.png
 

hkennereth

Member
Mar 3, 2019
232
746
I asked this a few pages ago, but we didn't figure out where my workflow fails at this:

Let's say I liked the 2nd generation in the middle, but not some of the details that have been added in the final upscale (right side).

If I only change settings in the final KSampler, say decrease the denoise from 0.55 to 0.50 and the CFG from 8 to 6, and hit Queue Prompt, Comfy should start at the final KSampler, since everything before it has the same settings and I changed the seed to fixed after I liked the output, right?

Well, that doesn't work for me... when I do that and hit Queue Prompt, I get an entirely new generation from the very beginning.

What am I doing wrong?

View attachment 3328781

View attachment 3328787
I don't know for sure, but maybe it's because you are sharing the seed value between KSamplers using that primitive. I have never done that and never had that issue, so it's my only guess.

There is a plugin called Inspire Pack that contains a number of different utility nodes, among them one called Global Seed. What it does is override and control the seed values of every single KSampler in your workflow without needing to be connected to the rest of the flow at all. I use it often when I want to fine-tune a single image without having to mess with the settings of my KSamplers.

1707160753207.png
 
  • Like
Reactions: Fuchsschweif

Fuchsschweif

Active Member
Sep 24, 2019
986
1,563
I don't know for sure, but maybe it's because you are sharing the seed value between KSamplers using that primitive. I have never done that and never had that issue, so it's my only guess.

There is a plugin called Inspire Pack that contains a number of different utility nodes, among them one called Global Seed. What it does is override and control the seed values of every single KSampler in your workflow without needing to be connected to the rest of the flow at all. I use it often when I want to fine-tune a single image without having to mess with the settings of my KSamplers.

View attachment 3328914
That's what the primitive node does too :D It takes the seed from the first sampler and then feeds it into KSamplers 2 and 3 to keep it consistent across all samplers.
 

hkennereth

Member
Mar 3, 2019
232
746
That's what the primitive node does too :D It takes the seed from the first sampler and then feeds it into KSamplers 2 and 3 to keep it consistent across all samplers.
Yes, I am aware; that's why I mentioned it. My point was that I don't know whether the fact that it is connected to all samplers is what causes all of them to re-render when you change the settings of one of them -- I would suggest disconnecting it, setting each sampler manually one at a time, and seeing if the problem persists. And I'm letting you know that the node I use does not cause this issue (each sampler gets cached correctly when using it) while offering the same level of control.
 
  • Like
Reactions: Fuchsschweif

Fuchsschweif

Active Member
Sep 24, 2019
986
1,563
Yes, I am aware; that's why I mentioned it. My point was that I don't know whether the fact that it is connected to all samplers is what causes all of them to re-render when you change the settings of one of them -- I would suggest disconnecting it, setting each sampler manually one at a time, and seeing if the problem persists. And I'm letting you know that the node I use does not cause this issue (each sampler gets cached correctly when using it) while offering the same level of control.
I tried it now; the problem still persists. It only works when I start with a fixed seed on all samplers right away.

But when I generate a new random seed and I like the outputs, there is no way to re-run only the 2nd or 3rd sampler; it always starts from scratch. Probably because I change the first sampler from "random" to "fixed".

But if I don't do that, I get new seeds all the time. And if I have to enter a seed first and set it to fixed, I can't keep working from a first randomized generation that I like and now want to "lock in".
 
Last edited:

Thalies

New Member
Sep 24, 2017
13
50
So, I’m pretty new to all this, but I just had to share. I made my first image using SDXL Turbo in Fooocus. Probably not a big deal for most here, but for a noob like me, it’s pretty awesome how fast it is and how good the quality is.

 
Last edited:
  • Love
  • Like
Reactions: Sepheyer and Mr-Fox

Mr-Fox

Well-Known Member
Jan 24, 2020
1,401
3,794
SDXL Turbo is not just experimental. You can find many checkpoints by simply using the "SDXL Turbo" filter on Civitai, or just search for SDXL Turbo. For SD1.5 there is LCM (Latent Consistency Model), which is essentially the same thing: high-quality results with far fewer steps and a lower CFG scale.
SDXL Turbo.png
SD15 lcm.png
 
Last edited:

hkennereth

Member
Mar 3, 2019
232
746
I tried it now; the problem still persists. It only works when I start with a fixed seed on all samplers right away.

But when I generate a new random seed and I like the outputs, there is no way to re-run only the 2nd or 3rd sampler; it always starts from scratch. Probably because I change the first sampler from "random" to "fixed".

But if I don't do that, I get new seeds all the time. And if I have to enter a seed first and set it to fixed, I can't keep working from a first randomized generation that I like and now want to "lock in".
Okay, so if I understand it right, you are leaving the first sampler at Randomized so you can find a new image you like, and then when you find it, you are changing the seed to Fixed, is that correct? And that's why it's not working for you?

If that's the case, the reason is that the randomization of seeds happens AFTER you add the image to the queue (you can see that on your primitive node, where it says "CONTROL_AFTER_GENERATE"). Meaning: you are looking at a seed value of "5", for example, you click Queue Prompt, that image with seed "5" is added to the list, and then a new seed value "42" is displayed. So if you switch to Fixed NOW, you are using the value for the NEXT image, not the one you just created. A quick workaround is that once you find an image you like, you can drag it back into the ComfyUI window, which will load all the values as they were for the creation of that image, including the seed, and THEN switch all samplers to Fixed, which will keep them at that specific value. Now you can fine-tune all sampler values, prompt tags, etc., and they will all remain at the original seeds.

The Global Seed node I mentioned, on the other hand, has an option that doesn't exist in Comfy by default: the option to change the seed randomization mode to "CONTROL_BEFORE_GENERATE", which randomizes the seeds at the moment you click Queue Prompt and then adds the image to the queue, keeping all seed values as they were. This is how other UIs like A1111 and EasyDiffusion tend to work.
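If it helps, here is a toy sketch of the difference (my own illustration, not actual ComfyUI or Inspire Pack code):

```python
# control_after_generate vs. control_before_generate, reduced to the essentials.
import random

def queue_control_after_generate(widget_seed):
    # The image is rendered with the seed currently shown in the widget...
    used_seed = widget_seed
    # ...and only AFTERWARDS does the widget roll a new value, which is what
    # you are looking at when you decide to switch to "fixed".
    new_widget_seed = random.randint(0, 2**32 - 1)
    return used_seed, new_widget_seed

def queue_control_before_generate(widget_seed):
    # A new seed is rolled BEFORE rendering, used for the image, and kept in
    # the widget, so fixing it afterwards locks in the image you just made.
    used_seed = random.randint(0, 2**32 - 1)
    return used_seed, used_seed

used, shown = queue_control_after_generate(5)
# used == 5, but the widget now shows a different number -- fixing it at this
# point would lock the NEXT image's seed, not the one you liked.
```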
 
  • Like
Reactions: Fuchsschweif

Fuchsschweif

Active Member
Sep 24, 2019
986
1,563
A quick workaround is that once you find an image you like, you can drag it back into the ComfyUI window, which will load all the values as they were for the creation of that image, including the seed, and THEN switch all samplers to Fixed, which will keep them at that specific value. Now you can fine-tune all sampler values, prompt tags, etc., and they will all remain at the original seeds.

The Global Seed node I mentioned, on the other hand, has an option that doesn't exist in Comfy by default: the option to change the seed randomization mode to "CONTROL_BEFORE_GENERATE", which randomizes the seeds at the moment you click Queue Prompt and then adds the image to the queue, keeping all seed values as they were. This is how other UIs like A1111 and EasyDiffusion tend to work.
Thanks, those are two great tips, and the background information about how the seeds work was very helpful! I will try them both out. I already have the Inspire Pack installed too.

Edit: Works very well with the Global Seed node. :)
 
Last edited:

me3

Member
Dec 31, 2016
316
708
And I never denied that it could be my hardware or system; I just said overheating won't be the issue, which was something you guys seemed to be hung up on. Also, I don't know why you asked me to be more specific about my crashes, only to tell me that I should basically go troubleshoot the other 50 possibilities. It sounded like you were onto something and wanted to help out.



Dude, I got that from the very beginning, but you don't seem to understand my approach to troubleshooting.

It was never about SD being bugged or the VAE decoder itself being the cause. It was simply about figuring out what the tiled VAE decoder does that the regular VAE decoder does not, so that I could at least roughly narrow down the direction I have to look in.

Anyway, I ran another test. Since shutdowns have never happened with the tiled decoder, even with larger upscales than those causing a shutdown, I tried to go even bigger (with the tiled version) to stress my system further and see if I could cause a shutdown.

And it actually happened, with a very big upscale size that goes far beyond the sizes that only work with the tiled version for me but not with the standard one.

So now I think the most likely reason it shuts down has to do with the PSU. This theory didn't make sense before, because shutdowns never happened during the heavy, demanding upscaling but only in that short three-second time frame of conversion; but since I could now trigger a shutdown during the upscaling process itself for the first time, it points to running out of power or some instability in the PSU.

If it were only the GPU at its limit, I would expect it to just throttle down or the display to shut off. Overheating I ruled out with measurements yesterday. Faulty communication between components on the mainboard would probably lead to a bluescreen crash, since conflicting protocols would cause a software error.

Unless one of you comes up with an objection, I tend to just go with a new PSU, as it seems to be the most likely cause.
It might not be a "problem" with the PSU failing as such, more that there are spikes and rapidly shifting load fluctuations causing voltage "drops". There are reported issues of things like TensorFlow and PyTorch causing reboots (without blue screens etc.), and the fixes are mainly about limiting the power usage of the GPU. Some refer to issues with BIOS versions, but that obviously becomes much more specific to particular cards than a general power usage issue; at least one place mentioned the ASUS X299 Sage having this BIOS issue. Not sure if that applies in your case or if similar issues exist with other cards.
You could try limiting the power usage of the GPU using nvidia-smi; one post I saw had to set the power limit to 120 W or less for their 1070, so that might be a place to start. It's probably a good idea to do a quick search and see what safe values are for your card.
Some had luck disabling turbo mode on Intel CPUs, but it seems you are using AMD, which might have something similar.
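If you want to try the power-limit route without hunting for the exact commands, something like this rough sketch should do it (the 120 W value is just the number from that post, not a recommendation for your card, and setting the limit needs admin/root rights):

```python
# Small helper around nvidia-smi to check and optionally lower the GPU power limit.
import subprocess

def show_power_info():
    # Prints the current draw plus the configured and maximum allowed limits.
    subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw,power.limit,power.max_limit",
         "--format=csv"],
        check=True,
    )

def set_power_limit(watts: int):
    # Same as running `nvidia-smi -pl <watts>`; requires elevated privileges
    # and usually resets after a reboot unless you reapply it.
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

if __name__ == "__main__":
    show_power_info()
    # set_power_limit(120)  # uncomment deliberately, e.g. the 120 W from that post
```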
 

Mr-Fox

Well-Known Member
Jan 24, 2020
1,401
3,794
Hi everyone
I'm sorry to bother you, but I'm not a native English speaker, so it's really difficult for me to search the threads.
Can you tell me if Stable Diffusion is still among the best, or if it has been overtaken by XL or XL Turbo or something else?

I like the idea of generating images in real time, but I understand that the interface still works with the node system and differs from Stable Diffusion... I don't understand anything anymore.

Can I please ask you what you suggest I use?
The main software is Stable Diffusion. There are two user interfaces that are the most established, Automatic1111 and ComfyUI, but there are also others such as Fooocus (a simplified UI).
Regardless of which user interface you choose, the software is still Stable Diffusion.
There are different model generations, just like there are Windows 7, 10 and 11, etc. The ones used the most are SD1.5 and SDXL. I would recommend going with either Automatic1111 or Fooocus to get started; when you have become proficient with it, then try a less beginner-friendly user interface such as ComfyUI (a node-based UI) if you wish.

How to install Stable Diffusion Automatic1111:
How to get started Using SD A1111:

Stable Diffusion Fooocus tutorial:

*Edit
Something to be aware of.
Fooocus primarily uses SDXL.
It is supposedly lighter on the VRAM requirement compared to A1111, though.
I don't know if you can download SD1.5 models and use them with it.
 
Last edited:

Delambo

Newbie
Jan 10, 2018
99
86
Somewhere in here, a page was linked that showed examples of all kinds of art styles by artist. I can't for the life of me find it anymore, either here or in my history. Does anyone have suggestions for what they think might be the one I'm thinking of, or a substitute for it?

The page had a blue background and broke out all kinds of styles by artist, then gave four example pictures generated by AI in that style with four standard prompts. One of the prompts was Henry Cavill.
 

FallingDown90

Member
Aug 24, 2018
115
38
The main software is Stable Diffusion. There are two user interfaces that are the most established, Automatic1111 and ComfyUI, but there are also others such as Fooocus (a simplified UI).
Regardless of which user interface you choose, the software is still Stable Diffusion.
There are different model generations, just like there are Windows 7, 10 and 11, etc. The ones used the most are SD1.5 and SDXL. I would recommend going with either Automatic1111 or Fooocus to get started; when you have become proficient with it, then try a less beginner-friendly user interface such as ComfyUI (a node-based UI) if you wish.

How to install Stable Diffusion Automatic1111:
How to get started Using SD A1111:

Stable Diffusion Fooocus tutorial:

*Edit
Something to be aware of.
Fooocus primarily uses SDXL.
It is supposedly lighter on the VRAM requirement compared to A1111, though.
I don't know if you can download SD1.5 models and use them with it.
Thank you for such clear and helpful answers.
I have already used Stable Diffusion and got used to it, but I was hoping to get real-time results similar to those I saw on leonardo.ai (I know they're different anyway, but it would be nice if Stable Diffusion could use img2img to draw > generate > edit > generate quickly, but I think that's too much to ask).
I'm afraid the node interface is too complex for me; I could be wrong, because I've avoided it in every software so far...
 
  • Like
Reactions: Mr-Fox

hkennereth

Member
Mar 3, 2019
232
746
Somewhere in here, a page was linked that showed examples of all kinds of art styles by artist. I can't for the life of me find it anymore, either here or in my history. Does anyone have suggestions for what they think might be the one I'm thinking of, or a substitute for it?

The page had a blue background and broke out all kinds of styles by artist, then gave four example pictures generated by AI in that style with four standard prompts. One of the prompts was Henry Cavill.
No Henry Cavill here, but maybe this one?