- Jun 10, 2017
"I think that those with enough experience know how to value the different aspects of the craft. It's just that different people place different accents on certain aspects."

But the aspect on which they put the accent can also render a theory wrong, when not totally obsolete. TDD will always be a valid approach, but aren't OO algorithms, and thus all your views about OO itself, dependent on whether you center on the code or on the data? And the same can be said regarding "good practices".
The linked example goes totally against the "always know what object you're using" principle.

Not all theorizing became false, but it's only the parts regarding the pure theory of programming that don't effectively enter into conflict with modern computing; or with past computing, if it's a modern theory. The instant your theory starts to talk about the code or the data, it becomes totally dependent on the context. Not only because you don't code a game like you code a database, but also because in a game most of the time is spent processing the data, while in a database it's finding/storing/sending the data that takes most of the time. Therefore, for a game it's the optimization of the code that matters the most, while for a database it's the optimization of the data flow that comes first. However optimized your code is, sending the data through filters corresponding to the SQL request will always be faster than calling functions according to the SQL request and telling them where they can find the previously processed data.
In the first approach, you build a tunnel with the filters needed by the request and let the data pass through it, while in the second you have your data in a basket and go door to door with it, following a pattern that corresponds to the request. The time you lose building the tunnel will never be as high as the time you lose traveling between the doors.
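The tunnel-versus-basket idea can be sketched like this (illustrative names, plain Python rather than a real SQL engine): compose the request's filters once, then stream every row through the resulting pipeline.

```python
# Hypothetical sketch of the "tunnel" approach: build a pipeline of filter
# functions once per request, then let every row flow through it, instead
# of carrying each row door-to-door through separate function calls.

rows = [
    {"name": "Alice", "age": 34, "city": "Paris"},
    {"name": "Bob", "age": 17, "city": "Lyon"},
    {"name": "Carol", "age": 52, "city": "Paris"},
]

def build_tunnel(filters):
    """Compose the filters once; the result is the 'tunnel'."""
    def tunnel(stream):
        for f in filters:
            stream = filter(f, stream)   # lazily chain each filter stage
        return stream
    return tunnel

# Filters derived from a request like: WHERE age >= 18 AND city = 'Paris'
tunnel = build_tunnel([
    lambda r: r["age"] >= 18,
    lambda r: r["city"] == "Paris",
])

adults_in_paris = list(tunnel(iter(rows)))
print([r["name"] for r in adults_in_paris])  # ['Alice', 'Carol']
```

The pipeline is built once, so the per-row cost is just the chained predicate calls; nothing re-interprets the request for each row.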
And this process is facilitated by a data-centered approach that too few among the old generation manage to apprehend as a necessity. Once again, not because we are stupid, but because, between school and then our years of experience, our brains weren't formatted to think that way. And we have now spent too many decades working with this format for it to be easily changed. Updated, yes, but not changed.
"I must say, you are a better programmer than me. That's really hardcore and I admire that. ASM is like seeing the green letters of the Matrix. I can view it, but I'm no Neo."

Before building me a throne, remember that it was in the 80's. I can surely retrieve the logical approach needed to use ASM, but between the multi-core architectures and the 8, 16, 32 and 64-bit registers and instructions, I would drown really fast. This said, it taught me a certain coding style that can probably still be seen from time to time in my code, like for example the Perl hack I talked about previously, which is a pure ASM approach: do not lose time using intermediaries when you can avoid it. Probably the most common ASM hack is using XOR to reset a value, a register XORed with itself being probably still faster than assigning a literal value to a register.

"And now the switch is being made from scripting to visual languages. That something like Blueprints for Unreal can function, enabling designers to program, takes it to a whole new level. Although I hear that the designers who do that need to know a little about programming."

It follows the logic I approached previously: now people go into programming not by passion but because it's a job like any other. The simpler the "code to human" interface, the more people will be able to learn how to use it correctly. And something like Blueprints is just the next step. We had compilers to free us from the complexity of ASM coding, and now we start to have a visual approach that tends to free us from the complexity of coding itself.
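For the curious, the identity behind that XOR trick, sketched in Python rather than ASM: any integer XORed with itself yields zero, which is why `xor reg, reg` is the classic way to clear a register without loading a literal.

```python
# x ^ x == 0 for every integer: the identity behind the classic
# "xor reg, reg" idiom used in ASM to zero a register.
for x in (0, 1, 0xDEADBEEF, -42):
    assert x ^ x == 0
print("x ^ x == 0 holds for all tested values")
```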
And it will probably follow the same pattern that compilers followed. At first the result will just be an approximation of what you had in mind; a working one, but still just an approximation. Then it will become more and more accurate, before becoming more and more efficient. And in the end it will be more than "good enough".
But as with compilers, anytime you want something totally accurate, you'll fall back to the previous step: ASM for compilers, a compiled language for a Blueprints-like approach. This until too many decades pass and too few people remember what "coding" meant in the good old times.
Well, the last step will probably need a century to happen; it's a bigger one than just passing from ASM to compiled languages. But it will happen, and the people who then use #V++Extended will look at those who can write a few lines of C++ as heroes.
I'll not say that I fear this time; it's progress and it will surely lead to some good things, and anyway I doubt that I'll see my 150th birthday. But as I implied, having to use ASM taught me a lot of things that I wouldn't have known, or at least not fully and truly understood, otherwise. It's like driving a car, for example: you don't need to know how it works to do it. But if you know, then you'll drive it in a more efficient way, an efficiency that generally implies a natural optimization.
"[...] but I'm hitting 45 and am actually unemployable, while there is a shortage of 50K out of 150K IT people in NL. Bizarrely, every headhunter thinks I'm a unicorn with a pot of gold attached to it, while every hiring manager has very different ideas."

Reading that in perspective with what you said above, starting with your "poor choice", gives it a really bad feeling, while throwing me back into my own past; especially my early years after graduation, when I was nothing more than a code monkey (not the one from the link, or whatever it is); don't think, simply turn this algorithm into code, thanks. Personally, I turned to a freelance career, until I had the opportunity to work for a friend of mine; not that I was unhappy as a freelancer, but after my wife's death, being at home all the time was, well, let's call it "less funny". I hope you find a place like that one day, a place where, "as long as the job is done, and done well, when it needs to be done, do whatever you want". Those places exist, but you need to look at small structures that don't necessarily dream too big; which doesn't mean you won't have a decent salary: happy workers do better work, and customers are ready to pay more for good work. But it's a long road before finding them, alas.
This said, there are many ways to reach this goal. Join an open source project, start a blog, help people on a forum; in short, let yourself be known while showing what you're worth. We are in an age of communication: if you don't communicate, you don't exist, and while communicating, you're showing your capabilities. It won't be enough by itself, but with time you'll be noticed by people who may need more than just another code monkey.
"That really takes me back. CGI makes me think of Cyberspace and Cyberpunk, of the limitless possibilities the internet could have. The weird, the hippies and nut-jobs that pioneered the net. When a burned-out stock trader from NY wanted to retire with a bookstore in San Fran, being good to people, and did the internet thing because it was cool. Bezos and Amazon are a long way from where they started, and so are all the rest."

Hey, the net also gave us Open Source. It obviously existed before the internet, but Linux, among so many other things, would never have become what it is now if we were still limited to the people around us and BBS communications. Centralized repositories available fast and easily to anyone who participates, mailing lists to centralize the discussions, and obviously the web to promote and distribute. And it's not just a giant step, it's also a culture change.
Look at the adult gaming scene. 99% of the guys and girls actually working on a game available here would never have thought about it otherwise. Don't let the bad parts blind you to the good ones.
"Checking first is faster than throwing exceptions, because the exception stack is not optimized for speed, at least not in .Net. Python doesn't care much about speed, so I get the approach."

It goes further than that for Python. It's not just that speed doesn't matter; the whole language is designed around exceptions. The attribute (whether a variable or a method) of an object doesn't exist? It's an exception that will tell you. You've reached the end of an iteration? Again, it's an exception that will tell you. And so on. I'm not totally sure, but Python probably has one of the most efficient implementations of exceptions to be found.
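A minimal sketch of this exception-driven design, with illustrative names: even ordinary iteration and attribute access are reported through exceptions, which is why the "try it and catch" (EAFP) idiom is native to Python.

```python
# Python's core protocols are built on exceptions: an exhausted iterator
# raises StopIteration, and a missing attribute raises AttributeError.
it = iter([1, 2])
next(it), next(it)               # consume both elements
try:
    next(it)                     # the iterator is exhausted...
except StopIteration:
    print("end of iteration")    # ...and an exception says so

class Thing:
    pass

obj = Thing()
try:
    obj.missing                  # no such attribute...
except AttributeError as e:
    print("no such attribute:", e)

# Hence the EAFP idiom ("easier to ask forgiveness than permission"):
# try the operation and catch the failure, rather than checking first.
config = {"a": 1}
try:
    value = config["b"]
except KeyError:
    value = "fallback"
print(value)                     # prints "fallback"
```

Every plain `for` loop runs on exactly this StopIteration mechanism under the hood, so the interpreter has every reason to keep raising and catching cheap.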
But what bothers me the most about this approach is that it feels like a big fuck-you given to the users. Take Ren'py, for example: since it doesn't validate the data, you'll trigger a problem at the start of your game, and the error you'll get is that something deep inside the core went wrong. It would be so much more useful and interesting to be told something like, "what the fuck are you trying to do?"
I totally understand that it's impossible to validate every possibility, but that's why exceptions exist: to catch what you haven't thought about. After all, it's called an "exception", not "the norm".
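As a sketch of that division of labor (hypothetical names, not Ren'py's actual API): validate what you can up front, so the user gets a message naming their mistake, and keep a broad handler only for the genuinely unforeseen.

```python
# Hypothetical sketch, not Ren'py's real API: validate user data up front
# so the error names the actual mistake, and keep a broad except only as a
# safety net for the cases nobody anticipated.

def load_character(data):
    # Explicit validation: tell the user exactly what is wrong, and where.
    if "name" not in data:
        raise ValueError("character definition is missing the 'name' field")
    if not isinstance(data.get("age"), int):
        raise ValueError(f"'age' must be an integer, got {data.get('age')!r}")
    return {"name": data["name"], "age": data["age"]}

try:
    load_character({"age": "old"})
except ValueError as e:
    print("user error:", e)       # a message the user can act on
except Exception as e:
    # The true "exceptions": problems nobody thought about.
    print("internal error, please report:", e)
```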
"There is hope that things get more adopted when they get simpler. I do sometimes feel that amateur game makers would rather have problems than avoid them."

It's what they are doing. It would be useless to name them, but I know many games that were abandoned because it wasn't possible, or not possible anymore, to ignore the problem. And the fact that Ren'py does not validate beforehand is part of the reason. It's already difficult to solve a problem when your knowledge is limited (which isn't a problem by itself) and so you don't understand why the problem happens; so when you aren't even told where you made the error, you have next to no way to solve it.
"There is something heroic when people chase a bug for days, while people who carefully program to avoid mistakes are seen as boring."

A few days ago, someone tweeted something really relevant to what you said: "Weeks of coding can save you hours of planning." (Unknown)

I think that, more globally, as the years pass, we understand that there will always be bugs, while discovering that, well, chasing them tends to make us end up with better code. So we plan, we try to avoid mistakes, but we don't overdo it. We will have to debug anyway and, what's worse because it's boring, at some point we will have to rewrite this code, so...
But, more importantly, we fucking love to code, and planning or avoiding mistakes isn't coding, it's thinking. Whereas chasing bugs is coding: we change tons of lines and all that, and obviously we follow Kernighan's advice, writing tons of print statements; "the most effective debugging tool is still careful thought, coupled with judiciously placed print statements." Oops, he said "judiciously placed"...
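For what it's worth, a tiny Python illustration of "judiciously placed" (the function and values are made up): f-strings with a trailing `=` print both the expression and its value, which keeps the print close to the careful thought.

```python
# A judiciously placed print: f"{expr=}" (Python 3.8+) shows the
# expression alongside its value, then gets removed once the bug is found.
def average(values):
    total = sum(values)
    print(f"{values=} {total=}")   # temporary debugging aid
    return total / len(values)

print(average([2, 4, 6]))          # 4.0
```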