One danger of taking a break on a project

anne O'nymous

I'm not grumpy, I'm just coded that way.
Modder
Donor
Respected User
Jun 10, 2017
10,384
15,294
I think that those with enough experience know how to value the different aspects of the craft. It's just that different people place different accents on certain aspects.
But the aspect they accent can also render a theory wrong, if not totally obsolete. TDD will always be a valid approach, but don't OO algorithms, and so all your views about OO itself, depend on whether you center on the code or on the data? And the same can be said regarding "good practices". Duck typing, for example, goes totally against the "always know what object you're using" principle.
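A minimal Python illustration of that clash (hypothetical classes): the function below works with any object that quacks, so you deliberately never know what object you're using.

```python
# Duck typing: the function never checks what object it received.
# It only cares that the object responds to quack().
class Duck:
    def quack(self):
        return "Quack!"

class Robot:
    def quack(self):
        return "Beep! (pretending to be a duck)"

def make_it_quack(thing):
    # No isinstance() check, no "always know what object you're using":
    # if it quacks, it's good enough.
    return thing.quack()

print(make_it_quack(Duck()))   # Quack!
print(make_it_quack(Robot()))  # Beep! (pretending to be a duck)
```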
Not all theorizing became false, but only theories about the pure theory of programming avoid conflicting with modern computing; or with past computing, if it's a modern theory. The instant your theory starts to talk about the code or the data, it becomes totally dependent on the context. Not only because you don't code a game the way you code a database, but also because in a game most of the time is spent processing the data, while in a database it's finding/storing/sending the data that takes most of the time. Therefore, for a game it's the optimization of the code that matters most, while for a database it's the optimization of the data flow that comes first. However optimized your code is, sending the data through filters corresponding to the SQL request will always be faster than calling functions according to the SQL request and telling them where to find the previously processed data.
In the first approach, you build a tunnel out of the filters needed by the request and let the data pass through it. In the second, you have your data in a basket and go door to door with it, following a pattern that corresponds to the request. The time you lose building the tunnel will never be as high as the time you lose traveling between the doors.
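A rough sketch of the two shapes in Python (the row data and filter names are made up): the "tunnel" chains generators once and streams the rows through, while "door to door" carries each row to a series of function calls.

```python
# Hypothetical rows standing in for table data.
rows = [
    {"name": "Ann", "age": 34, "city": "Paris"},
    {"name": "Bob", "age": 17, "city": "Lyon"},
    {"name": "Eve", "age": 51, "city": "Paris"},
]

# "Tunnel" approach: build the pipeline once, then stream the data through it.
def where_city(stream, city):
    return (r for r in stream if r["city"] == city)

def where_adult(stream):
    return (r for r in stream if r["age"] >= 18)

tunnel = where_adult(where_city(iter(rows), "Paris"))
print([r["name"] for r in tunnel])  # ['Ann', 'Eve']

# "Door to door" approach: carry each row to a function per condition
# and hand it the data every time; same result, more calls per row.
def check_city(row, city):
    return row["city"] == city

def check_adult(row):
    return row["age"] >= 18

print([r["name"] for r in rows if check_city(r, "Paris") and check_adult(r)])
```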
And this process is facilitated by a data-centered approach that too few among the old generation manage to grasp as a necessity. Once again, not because we are stupid, but because, between school and then our years of experience, our brains aren't formatted to think that way. And we have now spent too many decades working with this format for it to be easily changed. Updated, yes, but not changed.


I must say, you are a better programmer than me. That's really hardcore and I admire that. ASM is like seeing the green letters of the Matrix. I can view it, but I'm no Neo.
Before building me a throne, remember that this was in the '80s. I can surely retrieve the logical approach needed to use ASM, but between multi-core builds and the 8-, 16-, 32- and 64-bit registers and instructions, I would drown really fast. This said, it taught me a certain coding style that can probably still be seen from time to time in my code, like for example the Perl hack I talked about previously, which is a pure ASM approach: don't lose time using intermediaries when you can avoid it. Probably the most common ASM hack is using XOR to reset a value; XORing a register with itself (xor eax, eax) is probably still faster than assigning a literal zero to a register (mov eax, 0).


And now the switch is being made from scripting to visual languages. That something like Blueprints for Unreal can function, enabling designers to program, takes it to a whole new level. Although I hear that the designers who do that need to know a thing or two about programming.
It follows the logic I touched on previously: now people go into programming not out of passion but because it's a job like any other. The simpler the "code to human" interface, the more people will be able to learn how to use it correctly. And something like Blueprints is just the next step. We had compilers to free us from the complexity of ASM coding, and now we're starting to have a visual approach that tends to free us from the complexity of coding itself.
And it will probably follow the same pattern compilers did. At first the result will just be an approximation of what you had in mind; a working one, but still just an approximation. Then it will become more and more accurate, before becoming more and more efficient. And in the end it will be more than "good enough".
But as with compilers, anytime you want something totally accurate, you'll fall back to the previous step; ASM for compilers, a compiled language for a Blueprints-like approach. This until too many decades pass and too few people remember what "coding" meant in the good old days.
Well, that last step will probably need a century to happen; it's a bigger one than just passing from ASM to compiled languages. But it will happen, and the people who by then use #V++Extended will look at those who can write a few lines of C++ as heroes.

I won't say that I fear that time; it's progress and it will surely lead to some good things, and anyway I doubt I'll see my 150th birthday. But as I implied, having to use ASM taught me a lot of things that I wouldn't have known, or at least not fully and truly understood, otherwise. It's like driving a car, for example: you don't need to know how it works to do it. But if you do know, you'll drive it more efficiently, an efficiency that generally implies a natural optimization.


[...] but I'm hitting 45 and am actually unemployable, while there is a shortage of 50K out of 150K IT people in NL. Bizarrely, every headhunter thinks I'm a unicorn with a pot of gold attached to it, while every hiring manager has very different ideas.
Reading that alongside what you said above, starting with your "poor choice", gives it a really bad feeling, while throwing me back into my own past; especially my early years after graduation, when I was nothing more than a code monkey: don't think, simply turn this algorithm into code, thanks. Personally I turned to a freelance career, until I had the opportunity to work for a friend of mine; not that I was unhappy as a freelancer, but after my wife's death, being at home all the time was, well, let's call it "less fun".
I hope you find a place like that one day, a place where, "as long as the job is done, and done well, when it needs to be done, do whatever you want". Those places exist, but you need to look at small outfits that don't necessarily dream too big; which doesn't mean you won't have a decent salary: happy workers do better work, and customers are ready to pay more for good work. But it's a long road before finding them, alas.
This said, there are many ways to reach that goal. Join an open source project, start a blog, help people on a forum; in short, let yourself be known while showing what you're worth. We are in an age of communication: if you don't communicate, you don't exist, and while communicating, you're showing your capabilities. It won't be enough by itself, but with time you'll be noticed by people who may need more than another code monkey.


That really takes me back. CGI makes me think of Cyberspace and Cyberpunk, of the limitless possibilities the internet could have. The weirdos, hippies and nut-jobs that pioneered the net. When a burned-out stock trader from NY wanted to retire with a bookstore in San Fran, being good to people, and did the internet thing because it was cool. Bezos and Amazon are a long way from where they started, and so are all the rest.
Hey, the net also gave us Open Source. It obviously existed before the internet, but Linux, among so many other things, would never have become what it is now if we were still limited to the people around us and BBS communications. Centralized repositories available quickly and easily to anyone who participates, mailing lists to centralize the discussions, and obviously the web to promote and distribute. It's not just a giant step, it's also a culture change.
Look at the adult gaming scene. 99% of the guys and girls currently working on a game available here would never have thought about it otherwise. Don't let the bad parts blind you to the good ones.


Checking first is faster than throwing exceptions, because the exception stack is not optimized for speed, at least not in .NET. Python doesn't care much about speed, so I get the approach.
It goes further than that for Python. It's not just that speed doesn't matter: the whole language is designed around exceptions. An attribute (whether a variable or a method) of an object doesn't exist? It's an exception that tells you. You've reached the end of an iteration? Again, it's an exception that tells you. And so on. I'm not totally sure, but Python probably has one of the most efficient exception implementations to be found.
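A quick illustration of how deep that goes, using only built-in behavior:

```python
# Missing attribute: the language reports it with an exception.
class Empty:
    pass

try:
    Empty().missing
except AttributeError as e:
    print("attribute lookup failed:", e)

# Iteration: every for-loop is secretly driven by StopIteration.
it = iter([1, 2])
print(next(it))  # 1
print(next(it))  # 2
try:
    next(it)
except StopIteration:
    print("end of iteration is signaled by an exception, not a flag")
```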
But what bothers me most about this approach is that it feels like a big fuck-you to the users. Take Ren'Py, for example: since it doesn't validate the data, you'll trigger a problem at the start of your game, and the error you get is that something deep inside the core went wrong. It would be so much more useful and interesting to be told something like, "what the fuck are you trying to do".
I totally understand that it's impossible to validate every possibility, but that's what exceptions exist for: to catch what you haven't thought about. After all, it's called an "exception", not "the norm".
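A minimal sketch of the alternative, with a hypothetical load_character() helper: validate up front and speak the user's language, and keep the catch-all for what you genuinely didn't foresee.

```python
def load_character(data):
    # Validate up front, with an error message aimed at the user,
    # instead of letting a bad value blow up deep inside the core later.
    if "name" not in data:
        raise ValueError("character definition is missing a 'name' field")
    if not isinstance(data.get("age"), int):
        raise ValueError(f"'age' must be an integer, got {data.get('age')!r}")
    return data["name"], data["age"]

try:
    load_character({"name": "Eve", "age": "old"})
except ValueError as e:
    print("configuration error:", e)       # tells the user what *they* did wrong
except Exception as e:
    print("unexpected internal error:", e)  # the true "exception" case
```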



There is hope that things get adopted more when they get simpler. I do sometimes feel that amateur game makers would rather have problems than avoid them.
That's exactly what they're doing. It would be useless to name them, but I know many games that were abandoned because it wasn't possible, or no longer possible, to ignore the problem. And the fact that Ren'Py doesn't validate beforehand is part of the reason. It's already difficult to solve a problem when your knowledge is limited (which isn't a problem in itself) and you don't understand why the problem happens; when you aren't even told where you made the error, you have next to no way to solve it.


There is something heroic about people chasing a bug for days, while people who carefully program to avoid mistakes are seen as boring.
A few days ago, someone tweeted something really relevant to what you said: "Weeks of coding can save you hours of planning." Unknown
I think that, more globally, as the years pass, we understand that there will always be bugs, while discovering that, well, chasing them tends to give us better code in the end. So we plan, we try to avoid mistakes, but we don't overdo it. We'll have to debug anyway and, worse because it's boring, at some point we'll have to rewrite the code, so...
But, more importantly, we fucking love to code, and planning or avoiding mistakes isn't coding, it's thinking. Whereas chasing bugs is coding: we change tons of lines and so on, and obviously we follow Kernighan's advice and write tons of prints; "the most effective debugging tool is still careful thought, coupled with judiciously placed print statements."
Oop's, he said "judiciously placed" ;)
 

Diconica

Well-Known Member
Apr 25, 2020
1,100
1,150
I think those with experience value all parts of the craft, but place different accents on parts of it. It depends on the circumstances. I admire your focus on performance, because many hobbyist programmers go into professional programming with the same mentality, but they are made to see things differently. You managed to maintain it, and that shows me it's possible.
For me, the accent is not on performance, but on architecture, design and ergonomics. I hold true the saying "Man is the measure of all things". You are probably a much better programmer than me.

What is funny is that you mentioned ECS. The wiki has been updated; it used to state that it was introduced at GDC in 2006, but that there were ideas floating around before that. Now it states that it hails back to much older software architecture principles. You may think it's hubris, but I came up with something very similar to ECS in 1998, when I applied every design pattern I could find to game engine design. In 2000 I did my final internship for my bachelor's degree at a game startup, trying to implement an editor with my version of ECS. The startup failed (although they finally went another way) and I failed personally, because of performance. I wanted to implement it with maximum speed and tried C/C++, but using the same macro system the STL implementation is known for. I managed to get compiler errors on empty lines, among other things. A lot of other things went wrong, also on a personal level.
After months I pulled myself together and managed to write my bachelor's thesis in 2001, containing ECS. It never reached the trade press, and I doubt my alma mater has managed to save a copy of it.
It even contained a paragraph on the implementation of the separate per-system arrays/lists, although somebody on gamedev.net had suggested that he was working on that. I think that person was the one who did the presentation at GDC. He deserves the credit, as I was disillusioned with game development and went a different way. I may even have held him back for years.
The thing is, ECS didn't make any inroads in mainstream game development for more than a decade. The game industry's focus on performance is now easing a little, so a thing like Unity ECS has a chance. Although I have already heard developers criticizing it, mainly because they don't want to change the way they do things.
As a side note, the E of Entity was inspired by the entity of ERD. I was also influenced by work invented at the same time at the university the startup was associated with. Later, Eric Evans created Domain-Driven Design, which in my mind takes these concerns even further and is now back in fashion. The .NET framework fixed a lot of what was wrong by not defining a virtual machine like Java did, but by taking the intermediate language many compilers create and adding management of the code to it. When MS introduced XNA, the deal was done for me. While XNA was not the component style I envisioned, but more a managed version of DirectX, it made things so much simpler that I felt it was enough. Stupid MS killed it in 2013, leaving it to MonoGame and FNA to carry on. These days Unity and Unreal have taken lots of development out of the equation. Visual programming languages like Blueprints are taking over from scripting languages. This relegates the movement in game scripting from Lua and Python to C#/.NET to the same level as Java bytecode and .NET IL, now surpassed by JavaScript frameworks and NodeJS. A curious fact of history, but no longer relevant.

The gist of my story is this: I'm a Software Architect. I combine theory and practice into some new structure. You may think this is all a big fat lie, or that I'm one of those people who draw clouds and make life difficult for real programmers. That's up to you. I recently tried to create a language, started a movement to have a layer of indirection between the assets and the code via an open datafile, so broken assets can be fixed in amateur games, and, as some here know, I wrote the article "Game Assets folder structuring approaches and dimension. Introducing the Theatrical approach." Probably nothing will come of those things. Maybe I'm just crazy. I just happen to like structuring things.

Unfortunately the world does not need Software Architects anymore. I try to sell myself as a Software Architecture coach these days. Most of the flexibility Design Patterns offered first moved into frameworks and tools, and has now moved into the languages used and the processes around software development.

The thing that inspires me about your story with performance is that there is a niche where it matters. There may be a niche for architecture too. I lack the success of somebody like Jim Keller, so maybe not for me. As they say in Hollywood, "When you're hot, you're hot; when not, then not." But I can try.
I wouldn't say I'm better. If the program you create does the job you want to the level you need, then the task is complete. The job is getting to those results; that's what counts. Not every program needs to be done the same way.

My methods came out of necessity. When I started programming we had a lot less memory to work with. Add to that working in industrial environments at different levels, where you have to get things right or people lose their lives. Working across different systems, from a database one day to a robotic arm the next to automating a warehouse, you get a different perspective on the way things need to be handled. A database done wrong can lose millions of dollars. A robotic arm can crush or kill a man, and a warehouse full of automation, well, that's a lot of people put at risk. Work on other things like chemical plants and power systems is an even bigger issue. Working on graphics, AI, games and other stuff is a lot less stressful. But you can incorporate so much of what you picked up in other places.

As for ECS, I would say it probably goes back further than the wiki has it. Think about early databases, the identity numbers they used, and the small amount of memory. Think about trying to keep track of data processing on a multi-core system of the '70s and '80s; you know, the systems that had a dozen 80186 and 80286 CPUs with a dozen terminals connected. You didn't really have the memory, on the CPUs or in RAM, to handle large structures. So keeping track of an identifier plus the data was about it when you had a dozen things going on at one time. Can't really blame someone for not thinking about it, because most of the people who write these articles are limited to the knowledge they have and how far back it goes.
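For anyone who hasn't met the pattern, a toy sketch of that core idea in Python (all names made up): entities are bare identifiers, components live in separate stores, and systems walk the stores instead of the objects.

```python
# Entities are nothing but integer IDs; all state lives in component stores.
positions = {}   # entity id -> (x, y)
velocities = {}  # entity id -> (dx, dy)

next_id = 0
def spawn(x, y, dx=0, dy=0):
    global next_id
    eid = next_id
    next_id += 1
    positions[eid] = (x, y)
    if dx or dy:
        velocities[eid] = (dx, dy)
    return eid

def movement_system():
    # A system only touches the stores it needs: here, every entity
    # that has both a position and a velocity.
    for eid, (dx, dy) in velocities.items():
        x, y = positions[eid]
        positions[eid] = (x + dx, y + dy)

player = spawn(0, 0, dx=1, dy=2)
scenery = spawn(5, 5)            # no velocity: the system skips it
movement_system()
print(positions[player])   # (1, 2)
print(positions[scenery])  # (5, 5)
```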
 

KingmakerVN

Active Member
Game Developer
Nov 23, 2019
708
310
I know stuff like this will bite me in the ass eventually, even though what I do is pretty straightforward.