Computer programming for fun and profit

Yo mama’s so BASIC… etc


In a lot of ways, I am the programmer that has the best argument for Impostor Syndrome: started out in graphics, took up HTML & Friends shortly before the turn of the century, then learned VB courtesy of Access and eventually became a “full stack” developer in the .Net world.

Yet even I, with my non-CS-educated brain “ruined by Basic”, am horrified over and over (and over again!) by the inherent messiness that is dynamic typing.

Yes, I understand it, know how (and certainly why!!!) to check for data types and various flavors of null in twelve different ways, and even occasionally do some clever things with it, but seriously, what the ringtailed fuck? Why, why, why?

What fraction of the code that humanity generates serves solely to avoid the various problems caused by dynamic typing? I keep thinking that as I gain more experience and deeper insight that I will eventually understand the underlying wisdom of it all, but so far, nope, not yet. Maybe next year?
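The "twelve different ways to check for data types and flavors of null" complaint above can be made concrete. A minimal sketch in Python (chosen as a stand-in, since the thread spans several dynamic languages; the function and its messages are invented for illustration):

```python
# A sketch of the defensive boilerplate dynamic typing invites: every caller
# must re-check at runtime what a static type system would have guaranteed
# at compile time.

def describe_length(value):
    """Return a message about value's length, guarding each dynamic pitfall."""
    if value is None:                      # flavor of null #1
        return "got nothing"
    if isinstance(value, bool):            # bool is an int subclass: check it first
        return "got a boolean, not a sequence"
    if isinstance(value, (str, list, tuple, dict, set)):
        return f"length {len(value)}"
    return f"can't take the length of a {type(value).__name__}"

print(describe_length(None))        # got nothing
print(describe_length("hello"))     # length 5
print(describe_length(True))        # got a boolean, not a sequence
print(describe_length(3.14))        # can't take the length of a float
```

None of those branches do any real work; they exist only to survive the inputs a type checker would have rejected before the program ever ran.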


(raises hand)



(also raises hand)

Although I’m an EE who took a few CS classes

(slowly lowers hand)


brain “ruined by Basic”


My path was even more roundabout than that… I majored in Radio-TV-Film (making content for 'em, not fixing 'em), and along the way one professor taught us about Ohm’s law and that got me into electronics in general.


C# has quite a bit of dynamic typing that lazy programmers love to use but it’s not required in most cases. To me dynamic typing is just fucking evil unless there’s an application where it’s appropriate (such as literally defining a dynamic type on the fly for JSON serialization or something).

var foo = SomeMethod();

If I’m in Visual Studio and use IntelliSense I can easily see that ‘foo’ is of type “Qux”. Wonderful. What if I’m looking at code in a web browser or notepad? Why shouldn’t I be able to just glance at the code and know what something is?


For all its flaws, and they are legion, that was one thing I thought VB got right: passing args as ByVal or ByRef is pretty bleedin’ unambiguous. (Of course, I never revisited my early efforts to learn C++ twenty years ago; is there an aspect I’m missing there?)

Oh that’s easy, just append ‘str’ or ‘int’ to the nam—— OWW, WHO THREW THAT??!??!?


Yeah, don’t get me started on that – and the evils of Hungarian notation.


While I will (reluctantly) admit to having done shoddy things like that in my misspent youth, I do advocate for the liberal usage of the proper version of HN (referred to as Apps Hungarian in the Wikipedia article). In other words, if a string is unsanitized, name it as such, and you (or whoever comes after you) become a whole lot less likely to pass it to the DB or the browser without cleaning it first. (Though I would never say that every variable needs a prefix, only the few where it truly makes sense.)
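The Apps Hungarian idea above can be shown in a few lines. The point is that the prefix encodes application-level meaning (safe vs. unsafe), not the machine type; this sketch uses Python with invented names ("us" for unsanitized, "s" for sanitized) and the standard-library `html.escape` as the cleaning step:

```python
# Apps Hungarian sketch: the prefix carries meaning a type system doesn't,
# so passing us_comment where s_comment belongs looks wrong at a glance.
import html

def sanitize(us_text: str) -> str:
    """Turn an unsanitized string into one safe to hand to the browser."""
    return html.escape(us_text)

us_comment = '<script>alert("hi")</script>'   # straight from the user: unsafe
s_comment = sanitize(us_comment)              # prefix now says it's been cleaned

print(s_comment)   # &lt;script&gt;alert(&quot;hi&quot;)&lt;/script&gt;
```

Contrast with Systems Hungarian (`strName`, `intCount`), which only restates what the compiler already knows.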


You won’t, because the people who do that were all born in, like, the 80s or something /s



I hate how nowadays the vast majority of people who write code think that everything is either a web front end or a back end that comes with a web front end.

Care to elaborate? I went from C++03 to C++11 to C++14 to C++17, and I have no clue what you are referring to.

That’ll be a very long course. I’ve learned some abstractions that way, by first getting the hang of them in a lower-level language and then using the higher-level language as a convenient shorthand for them. But I also know of too many people who got stuck writing Go and never progressed to higher-level languages.


I’m an EE who can program well enough to be dangerous. I’ve done plenty of work on systems that are in use in the real world, but in precisely none of them was the software a customer-facing component. Also, in very few of them (if any) were the customers Joe and Jane Consumer, but more likely government and commercial entities, research labs, etc. So when a recruiter asks me if I am a front end or back end developer I have to reply that I’m not a “developer” at all. If they’re looking for someone to develop commercial software, they’re barking up the wrong tree, but they’d know that if they bothered to read my resume.

The only question I hate more than that is “do you develop web apps or desktop apps” >___<

I was referring to the auto keyword, and other associated fuckery relating to runtime typing.


The fact that Rust doesn’t pop up in this thread makes me sad. I’m learning Rust right now, and it’s fun to actually learn something that isn’t stuck in the DotNet stack. :smiling_imp:


Python is great for data analysis/processing and scientific computation because of the libraries. It’s also good for quick, small automation scripts. I’ve found that, once a project gets beyond a certain size, dependence on external libraries and the lack of type safety really start to bite you.



… but the auto keyword has nothing to do with runtime typing. It’s pure static typing. It just relieves you from writing out the same type over and over again, allowing you to use more static typing, not less; before, people reached for less meaningful types just to avoid having to type too much… (pun not avoidable).
In fact, I can’t think of any “fuckery related to runtime typing” recently introduced into C++ (exceptions and RTTI are the only run-time-type related features I can think of, and they’re both from the second millennium).

I understand, but at least almost everything that doesn’t require a web browser to run can somehow be made to run on a desktop computer (if only for testing) and can thus be described as a “desktop app” :slight_smile:


Python, the language that compiles down to … something … it’s sure not native code, and it really shows it in performance. (Yes, there are native code compilers out there, but I doubt many people use them.)

I can’t believe people use it for data crunching and analysis, or hahaha cryptocurrency mining. (Probably with C libraries or something.)
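The "C libraries or something" hunch is right even inside Python’s standard library: built-ins like `sum()` run their loops in C, which is the same reason numeric libraries can be fast despite the interpreter. A small self-contained comparison (element count chosen arbitrarily):

```python
# Interpreter-level loop vs. the C-implemented built-in doing the same job.
import timeit

data = list(range(100_000))

def python_loop():
    total = 0
    for x in data:       # every element passes through the interpreter
        total += x
    return total

assert python_loop() == sum(data)   # same answer either way

loop_time = timeit.timeit(python_loop, number=20)
builtin_time = timeit.timeit(lambda: sum(data), number=20)
print(builtin_time < loop_time)   # True: the C loop wins comfortably
```

Libraries like NumPy take the same idea further, pushing whole array operations into compiled code.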

Virtual environments that take a big snapshot of all those libraries so that your project won’t bit-rot in a week when one of them breaks its interfaces. Of course, later fixes and security patches won’t happen unless you take the time to update all the snapshots, but who has the time for that?
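The "snapshot" described above usually boils down to a pinned requirements file checked in next to the project (package names and versions here are purely illustrative):

```
# requirements.txt — exact pins so the environment is reproducible,
# typically produced with `pip freeze` inside the virtual environment.
requests==2.28.1
numpy==1.24.2
```

Which is exactly why later security patches don’t arrive on their own: each pin has to be bumped and retested deliberately.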


I hadn’t thought about (or, didn’t remember) there being a difference. As I’ve previously mentioned, I don’t really write any code. But in my case, if someone asked me whether I was a computer programmer, a software developer, or a sysadmin, my answer would be: “Yes.”

My company pays me to be a “systems analyst” but I don’t really think that’s what I do. According to Webster, “software engineering” might describe what I’ve done, at one time or another, but I believe that the term “engineer” should be reserved for those who do one or more of the following:

  1. Possess an engineering degree
  2. Maintain an engineering license
  3. Operate a choo-choo