The AI debate may become moot sooner than we think

tanstaafl

Active Member
Oct 29, 2018
967
1,400
As the title says, the AI debate may become pointless sooner than we think. Not because people will become more accepting, but because soon enough we won't be able to tell what's AI and what's not in adult games that use 3D models. You can already tell GPT 3.5 to write a Blender script that generates any model you want, and it even makes some (very bad) attempts at animating. I tried to generate a human model and it failed spectacularly, producing what looked like a broom standing bristles-up. But then I told it to write a script to create a "cute little green alien" and it gave me a script to drop into Blender that created this:
[Attached image: 1717825467749.png]

It should also be noted that I have never used Blender, or any other 3D modeling tool, beyond trying a few intro tutorials years ago. Creating that very simple model was as easy as: 1. install Blender, 2. open the Scripting tab, 3. paste in the script GPT generated, 4. click Run Script.
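For anyone curious what such a script even looks like: it boils down to a handful of calls to Blender's bpy Python API. Here's a rough sketch of the kind of thing GPT spat out (not the exact script it gave me, and the shapes and names are just illustrative) that you can paste into the Scripting tab and run:

Python:
import bpy

# Rough sketch of a GPT-style "cute little green alien" generator.
# Assumes a default Blender scene; run it from the Scripting tab.

# Clear whatever is in the scene
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()

# One shared green material for the alien
green = bpy.data.materials.new(name="AlienGreen")
green.diffuse_color = (0.1, 0.8, 0.2, 1.0)  # RGBA

def add_sphere(name, radius, location, material):
    """Add a UV sphere, name it, and assign a material."""
    bpy.ops.mesh.primitive_uv_sphere_add(radius=radius, location=location)
    obj = bpy.context.active_object
    obj.name = name
    obj.data.materials.append(material)
    return obj

# Body, head, and two stubby arms
add_sphere("Body", 1.0, (0, 0, 1.0), green)
add_sphere("Head", 0.6, (0, 0, 2.2), green)
add_sphere("ArmL", 0.25, (-1.0, 0, 1.2), green)
add_sphere("ArmR", 0.25, (1.0, 0, 1.2), green)

# Two big cartoon eyes in white
white = bpy.data.materials.new(name="EyeWhite")
white.diffuse_color = (1.0, 1.0, 1.0, 1.0)
for x in (-0.2, 0.2):
    add_sphere("Eye", 0.12, (x, -0.55, 2.3), white)

Nothing fancy, but that's roughly the level of what's in the screenshot.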

I watched a YouTube video on this topic where the presenter had made basic progress creating an environment (if three square blocks called a "house" count as an environment) and a wiggly animation. It's fairly interesting.

Am I saying that it will be making adult games for people next week? No. Or even next year? Probably not. But in a few years? It's a real possibility, especially with the amount of people working on this kind of thing.
 

Insomnimaniac Games

Degenerate Handholder
Game Developer
May 25, 2017
2,871
5,084
reality though is saying no:
That was fascinating to listen to, thanks for the link! What that video doesn't go into is cost. AI is extremely expensive. ChatGPT alone costs $700,000 a day to run. Now, they probably make enough to cover that. But will they continue to do so? I'm unsure. Especially if it gets more complex.
 
Last edited:
  • Like
Reactions: woody554

woody554

Well-Known Member
Jan 20, 2018
1,396
1,748
That was fascinating to listen to, thanks for the link! What that video doesn't go into is cost. AI is extremely expensive. ChatGPT alone costs $700,000 a day to run. Now, they probably make enough to cover that. But will they continue to do so? I'm unsure. Especially if it gets more complex.
that's kinda the whole point. due to diminishing returns per buck, the cost of reaching that higher level of generalization likely becomes astronomical, which means it simply won't be doable in practice even if it were in theory. throwing a billion a day at it might only give you 1% more, and so on. it's just not good enough; we need a new strategy.

the good news is such strategies are possible, since our brains already do it effortlessly, but the problem is we have no idea how.
 
  • Like
Reactions: Insomnimaniac Games

tanstaafl

Active Member
Oct 29, 2018
967
1,400
reality though is saying no:
lol, no. AI has not peaked. One British guy drawing with a Sharpie in a notepad about cost effectiveness isn't proof of the opposite. That's just wishful thinking by people who hate it for some reason.

Edit: To expand on that, AI as discussed in that video costs barely a fraction of what it costs Google to maintain just their search servers, and yet they pay it happily. Why? Because it's profitable. If you don't think AI will continue to be profitable as it develops, you are seriously deluding yourself. The company I work for pays AWS ~$50k a month (last month's bill was $38k, but it averages out) to use their services, including Kendra (AWS's intelligent search service). And Kendra doesn't come close to GPT 3.5's capabilities. Companies are willing to pay through the nose for it. The vast majority of the horrible AI companion sites are paying OpenAI buckets of money. I know Candy and Nectar AI are using OpenAI anyway, and given the sameyness of all those websites I wouldn't doubt the rest of them are too.

Edit 2: Also, using GPT 4 and above costs $20 a month. They're easily beating their current running costs with just that. Keep that in mind.

AI is going to continue to develop.
 
Last edited:
  • Haha
Reactions: Exocrine

tanstaafl

Active Member
Oct 29, 2018
967
1,400
This discussion of AI progression aside, the idea in my original post doesn't need drastic AI development to improve. It could progress dramatically with current GPT-4o-level technology, given enough people developing templates and methodologies.
 

QQP_Purple

Well-Known Member
Dec 11, 2020
1,249
1,466
AI is vastly overhyped. And this is my professional opinion. It is however also a far greater danger and trap than people give it credit for.

Put simply, every profession has two tiers. You have the technicians who do the mundane, boring things, and then you have the engineers who design it, put it all together, and tweak it to turn it from meh to actually good. These are your senior developers who do code review, your highly paid consultants, or the old master artist who slaps you on the back of the head when you mistakenly draw an extra finger or two.

Without these people NOTHING ever gets done and quality is always terrible. The sort of AI being developed now is going to be able to replace the entry level of technicians, but that's it. It simply does not have the capacity to do anything more than that by its very nature. So it is never going to be a serious issue as far as employment goes. At least not for actual high-skill human labor.

Therein lies the trap I mentioned, however. Put simply, the way you become an "engineer" is by starting off as a grunt and working your way up, learning from experience as you go. And if AI can undercut those entry-level positions, we might end up in a situation where the engineer level is slowly aging out of the workforce and there are no new people rising through the ranks to replace them.

And that is a scary future indeed.
 

tanstaafl

Active Member
Oct 29, 2018
967
1,400
AI is vastly overhyped. And this is my professional opinion. It is however also a far greater danger and trap than people give it credit for.

Put simply, every profession has two tiers. You have the technicians who do the mundane, boring things, and then you have the engineers who design it, put it all together, and tweak it to turn it from meh to actually good. These are your senior developers who do code review, your highly paid consultants, or the old master artist who slaps you on the back of the head when you mistakenly draw an extra finger or two.

Without these people NOTHING ever gets done and quality is always terrible. The sort of AI being developed now is going to be able to replace the entry level of technicians, but that's it. It simply does not have the capacity to do anything more than that by its very nature. So it is never going to be a serious issue as far as employment goes. At least not for actual high-skill human labor.

Therein lies the trap I mentioned, however. Put simply, the way you become an "engineer" is by starting off as a grunt and working your way up, learning from experience as you go. And if AI can undercut those entry-level positions, we might end up in a situation where the engineer level is slowly aging out of the workforce and there are no new people rising through the ranks to replace them.

And that is a scary future indeed.
See, this is the alarmist bullshit that people need to just ignore. It's not even close to true. At most, AI is currently capable of assisting (keyword: assisting) at any level of software engineering, and only then with rigorous checks in place. Source: I am a senior engineer currently implementing Kendra AI in an LMS, lol.

The possibilities with AI are fairly open, but almost no progress has been made toward accurately or competently doing an engineer's job. One AI company claimed a few months ago that it could examine large projects and fix or expand on them, but that turned out to be an overhyped scam within a week.
 

QQP_Purple

Well-Known Member
Dec 11, 2020
1,249
1,466
See, this is the alarmist bullshit that people need to just ignore. It's not even close to true. At most, AI is currently capable of assisting (keyword: assisting) at any level of software engineering, and only then with rigorous checks in place. Source: I am a senior engineer currently implementing Kendra AI in an LMS, lol.
I am also a senior engineer who has been doing this for 20 years now.

Which is why I think you misunderstood what I meant to say there. I was not talking about the AI we have now but about the absolute maximum possible end development of the technology, assuming infinite money and time is poured into it.

Right now, AI is just a glorified autocomplete++ that gets things wrong half the time. But in time, when the technology reaches its actual zenith, it might just be able to replace the sort of copy-paste-from-Stack-Overflow-into-the-editor code monkeys that make up entry-level positions in a lot of the industry. The sort you give simple tasks like "add a save button to the main menu" to.

But that is also the limit of what it will ever be able to do. You will agree with me when I say that modern approaches to "AI" simply do not mechanically have the capacity to reason in any meaningful way, and thus will never be able to process the sort of metaknowledge we engineers actually work with on a daily basis. It's just not possible no matter how much work is put into advancing them.

And therein lies the danger that I see. AI will never replace us. But we all had to start somewhere. And if it replaces the code monkeys at the bottom, even if it does so worse than they do, it still gives management an incentive to greatly reduce the number of viable entry-level positions. It's the same story as with outsourcing back in the day: worse but cheaper > better but more expensive, just as long as total output stays above the minimum.

And with those gone or greatly reduced, what jobs will the next generation of developers have as a means of entry? Where is the next crop of us going to rise from?

The possibilities with AI are fairly open, but almost no progress has been made toward accurately or competently doing an engineer's job. One AI company claimed a few months ago that it could examine large projects and fix or expand on them, but that turned out to be an overhyped scam within a week.
Did you even read what I wrote? Or are we just in agreement on this part?
 
Last edited:
  • Like
Reactions: PsychicStress

hakarlman

Engaged Member
Jul 30, 2017
2,093
3,267
Hear me out. In the movie Avatar, humanity had HUGE energy requirements that led to Earth finally dying. Guess what else has a huge energy requirement? AI.

I believe AI will keep getting better and better like the OP said, but it will require more and more energy, ENORMOUS amounts of energy.

I truly believe humanity will end up in a situation where a person can put on a device, see a blank 3D virtual world, and create their own planet and society from scratch, doing whatever they want without worrying about the physical limitations of the human body (unless they take off the device to come back to reality). But think of the energy requirements to do that.

Once you have that perfect AI algorithm that mimics the human brain, from there, all you'll need is just enormous amounts of energy to do whatever the user wants.

Open to corrections and discussion.
 

tanstaafl

Active Member
Oct 29, 2018
967
1,400
I am also a senior engineer who has been doing this for 20 years now.

Which is why I think you misunderstood what I meant to say there. I was not talking about the AI we have now but about the absolute maximum possible end development of the technology, assuming infinite money and time is poured into it.

Right now, AI is just a glorified autocomplete++ that gets things wrong half the time. But in time, when the technology reaches its actual zenith, it might just be able to replace the sort of copy-paste-from-Stack-Overflow-into-the-editor code monkeys that make up entry-level positions in a lot of the industry. The sort you give simple tasks like "add a save button to the main menu" to.

The problem is therefore not that it will ever replace us. God no. But we all had to start somewhere. And if it replaces the code monkeys, where is the next crop of us going to rise from?
I absolutely agree about the glorified autocomplete. And it will progress past that, but even given a decade it still won't have the capabilities people seem to be afraid of. It will not be able to account for a client randomly asking for an out-of-left-field change without human guidance and oversight. Instead of your scenario of AI replacing low-level engineers, it will be handed to them to enhance their work and make it faster and more adherent to standards. Seniors will be pretty much exactly as they are now, the cranky bastards screaming "WHY!?!?!" repeatedly.

Did you even read what I wrote? Or are we just in agreement on this part?
I kind of agree that it is overhyped, but not in the same way you do. I'm excited about what it will eventually be able to do, and I'm annoyed at the companies causing the overhype by making ridiculous claims for a short-term buck.
 

QQP_Purple

Well-Known Member
Dec 11, 2020
1,249
1,466
I absolutely agree about the glorified autocomplete. And it will progress past that, but even given a decade it still won't have the capabilities people seem to be afraid of. It will not be able to account for a client randomly asking for an out-of-left-field change. Instead of your scenario of AI replacing low-level engineers, it will be handed to them to enhance their work and make it faster and more adherent to standards. Seniors will be pretty much exactly as they are now, the cranky bastards screaming "WHY!?!?!" repeatedly.
Honestly I am concerned with two things.

The first thing is reduced headcount due to increased "productivity". As in, why hire 10 juniors when you can hire 3 and give them each a magic AI that solves all their problems anyway? Only it can't and won't, because they are juniors and you just gave them 3x the workload they are supposed to have and a magic box they have to babysit.

The second is reduced quality due to imperfect tools being handed to people who simply do not have the experience to know how to correct them when they go wrong, and who will be convinced they cannot go wrong because of the hype. Combine that with the extra workload I mentioned, and they will just end up blindly following its instructions no matter where it takes them, all the while the errors that need correcting have become too complex for juniors to fix anyway. Joy.

Or in other words, more screaming for us and fewer people rising through the ranks to help share the pain.

I kind of agree that it is overhyped, but not in the same way you do. I'm excited about what it will eventually be able to do, and I'm annoyed at the companies causing the overhype by making ridiculous claims for a short-term buck.
It is definitely going to be interesting to see what comes out of it once the dust settles. But for that, the current hype bubble has to inflate to its maximum and pop. And the damage that can potentially happen in that process kind of kills all the excitement for me.

Also, fun anecdote: last week I sped up a loop by 75% by removing a piece of code that used thrown exceptions as flow control, with both the try and the catch inside the loop and the catch actually doing processing. Oh, and the "exceptional" route was the thing that happens in 9 out of 10 use cases. Not even making this up. Because an if statement was apparently not clever enough.

I am pretty sure that was just plain human stupidity though. So AI has tiny boots to fill.
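For the curious, the pattern looked roughly like this (a reconstructed Python sketch with made-up helper names, not the actual code from work):

Python:
# Before: thrown exceptions used as flow control inside the loop.
# The "exceptional" path fired in ~9 out of 10 iterations, so nearly every
# pass paid the cost of raising and unwinding an exception.
def process_all_slow(items):
    results = []
    for item in items:
        try:
            if item.needs_special_handling:       # true for most items
                raise ValueError("special case")  # used as a goto, not an error
            results.append(handle_normal(item))
        except ValueError:
            results.append(handle_special(item))  # the catch does real processing
    return results

# After: a plain if/else, same behaviour, no exception machinery.
def process_all_fast(items):
    results = []
    for item in items:
        if item.needs_special_handling:
            results.append(handle_special(item))
        else:
            results.append(handle_normal(item))
    return results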
 
Last edited:

tanstaafl

Active Member
Oct 29, 2018
967
1,400
Honestly I am concerned with two things.

The first thing is reduced headcount due to increased "productivity". As in, why hire 10 juniors when you can hire 3 and give them each a magic AI that solves all their problems anyway? Only it can't and won't, because they are juniors and you just gave them 3x the workload they are supposed to have and a magic box they have to babysit.
This already started a few decades ago (it really picked up steam in the 90s when I was still in college) when the scam of the "full stack developer" became all the rage. The tag line is "You're so impressive you can do everything!" The reality is "Why should we pay for four people when we can get this schmuck to kill himself doing it all?"

The second is reduced quality due to imperfect tools being handed to people who simply do not have the experience to know how to correct them when they go wrong, and who will be convinced they cannot go wrong because of the hype. Combine that with the extra workload I mentioned, and they will just end up blindly following its instructions no matter where it takes them. Joy.

Or in other words, more screaming for us and less people rising through the ranks to help share the pain.
I can see this happening already. Our product team talked the higher ups into springing for Copilot integration in our IDEs/text editors (VSCode) and, while it does help at times, its suggestions are... interesting, to say the least, sometimes.

It is definitely going to be interesting to see what comes out of it once the dust settles. But for that, the current hype bubble has to inflate to its maximum and pop. And the damage that can potentially happen in that process kind of kills all the excitement for me.

Also, fun anecdote: last week I sped up a loop by 75% by removing a piece of code that used thrown exceptions as flow control, with both the try and the catch inside the loop and the catch actually doing processing, because an if statement was apparently not clever enough.

I am pretty sure that was just plain human stupidity though.
Yeah, once people get used to it and companies stop trying to push it like car salesmen selling a Pinto, it should settle down into acceptable hype, lol. And yeah, just last week I had to reject a PR from a dev who was trying to copy logic from a fairly hefty SQL proc over to DynamoDB (a NoSQL database) during a production change. Even after AI fully hits its stride I'm pretty sure I'll still be preaching "read the docs" until I retire.
 
  • Like
Reactions: DuniX

tanstaafl

Active Member
Oct 29, 2018
967
1,400
I believe AI will keep getting better and better like the OP said, but it will require more and more energy, ENORMOUS amounts of energy.
I once read an article that said the biggest limitation on our technological progress is battery technology. And I don't disbelieve it.
 
  • Like
Reactions: hakarlman

QQP_Purple

Well-Known Member
Dec 11, 2020
1,249
1,466
This already started a few decades ago (it really picked up steam in the 90s when I was still in college) when the scam of the "full stack developer" became all the rage. The tag line is "You're so impressive you can do everything!" The reality is "Why should we pay for four people when we can get this schmuck to kill himself doing it all?"
And then you end up hiring two full stack developers anyway, because it turns out one man can only do one man's worth of work whether it's split between two jobs or not. Especially since there is no such thing as an actual full stack developer, only a backend dev who knows some frontend or vice versa. Oh, and they cost more, and they know it too. So yay for (not) savings?

That being said, full stack is the direction I hope the whole AI trend goes as well. Objectively speaking, you get more done from two full stack guys than from two pure specialists, simply because each knows a bit about the other's work and they can therefore work together better. And AI has the potential to bridge that gap too and just make people more productive.

If only it does not bury us beforehand.

I can see this happening already. Our product team talked the higher ups into springing for Copilot integration in our IDEs/text editors (VSCode) and, while it does help at times, its suggestions are... interesting, to say the least, sometimes.
Yea, I have one guy like that where I work. Great guy. No complaints. But every time he answers his own question with "I'll just ask the AI" I die a little bit on the inside.

Yeah, once people get used to it and companies stop trying to push it like car salesmen selling a Pinto it should settle down into acceptable hype, lol. And yeah, just last week I had to reject a PR from a dev that was trying to copy logic from a fairly hefty SQL proc over to a dynamo database (SQLless database) during a production change. Even after AI fully hits its stride I'm pretty sure I'll still be preaching to read the docs until I retire.
Truly mankind at its finest.
 

QQP_Purple

Well-Known Member
Dec 11, 2020
1,249
1,466
For those in the audience who have no idea what we are talking about: we are basically discussing how bad coworkers create code NTR. And not the "you steal someone else's wife" kind, either.
 
  • Haha
Reactions: DuniX and tanstaafl

tanstaafl

Active Member
Oct 29, 2018
967
1,400
For those in the audience who have no idea what we are talking about: we are basically discussing how bad coworkers create code NTR. And not the "you steal someone else's wife" kind, either.
Cucked by newbies using AI. The new Adult VN trend.