Computer programming for fun and profit

If I were going to teach ANYONE how to write software, I’d first teach them an assembly language. Nothing like working near the bare metal to grasp the fundamental concepts of computation. Nothing too difficult: maybe add a list of numbers, then print “Hello, World!”. Then we’d implement the same in C, etc.

4 Likes

Taking a course on how to write assembly on a z80 processor was what convinced me that I do not have the aptitude or interest to become a programmer. The work it took just to illuminate the training board’s segmented LED display seemed insane to me at the time.

3 Likes

Thank goodness it wasn’t the 8088! All that work normalizing pointers causes permanent damage.

2 Likes

Assembly is NOT the best for learning programming, not even close.

8 Likes

Agree. My path went JS, to Java, to C++, to assembly. Also throw in MATLAB & Fortran and some Unix on the side. If I had started with assembly I’d have been screwed.

Say I have a value I need to multiply by two. Is it easier to say a = 2*a? Or to load my variable into a register, do a shift left, then store the value back into my variable? The former, please.
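Something like this, roughly (the assembly in the comments is illustrative x86-ish pseudocode, not real compiler output):

```cpp
#include <iostream>

int main() {
    int a = 21;

    // High level: one line, and the intent is obvious.
    a = 2 * a;

    // Roughly what a compiler might emit for that line (illustrative only):
    //   mov eax, [a]   ; load the variable into a register
    //   shl eax, 1     ; shift left one bit = multiply by 2
    //   mov [a], eax   ; store the result back into the variable

    std::cout << a << '\n'; // prints 42
}
```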

5 Likes

You don’t teach programming languages; you teach programming techniques and algorithms. I’ve lost track of all the different languages I learned in CSci, but I never had one class that was a language class.

That background (even though I’m not a software engineer) has been incredibly helpful. I recently had to pick up Go because the Python BLE library had an unfortunate habit of crashing in a spectacular and un-catchable fashion. It was NBD because my education taught me how to learn any language, not because I was taught a specific language.

6 Likes

That’s how they did it when I took classes (i.e. courses for various languages) at a two-year college, where maybe things are different? I figure the idea there was that someone could apply the coursework directly to their job, or to finding a job – hence the Unix class that I took. OTOH, the 2nd semester of C Programming could’ve been named “Data Structures” that happened to be taught using C; the aforementioned Pascal class could’ve been “Algorithms” (that was, in fact, the first assignment he gave us – no code involved); the C++ course could have been “Object-Oriented Programming.” (In hindsight, the 1st semester of C might as well have been a Unix class, since we had to write in vi, etc., and it felt like a baptism by fire.)

(They also had an Ada class, but I never took it)

When my wife went through the same program, just a few years after I did, IIRC Pascal was gone and Java might have supplanted C/C++.

3 Likes

APL was cool because many operations got their own nifty symbol. I mean, mathematics has a bunch of weird symbols and gets new ones all the time, so there’s a certain logic to using more than just the Latin alphabet and the handful of symbols present in ASCII. On the other hand, having a normal keyboard where I can easily touch type makes COBOL rather nicer than APL.

4 Likes

I will agree that, judging by the plethora of programming languages in existence, there are a great many opinions on the topic.

My thinking is that once you understand intuitively how von Neumann computers actually work, there are very few computing topics you won’t be able to understand.

Sure, but they don’t teach CS without teaching you a language first.

But ANYHOW, I guess everyone is right. Even though my early assembly experience taught me so much about how computers work, a handful of years before I even ventured into CS, it’s probably not crucial for people to learn anymore. Although we’re now using WebAssembly in our JS front end, so some things come full circle.

2 Likes

Yep. Always fundamentals first; the flavor of the month can be picked up later.

5 Likes

a *= 2

(*a) *= 2

No need to push a variable into a register and pop it back. Do it all in place.

However, I suspect that these programming bundles don’t clearly teach the differences between variables and pointers. Throw pass-by-reference into the mix and it gets even more confusing.
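A minimal C++ sketch of the three cases, with made-up function names:

```cpp
#include <iostream>

void double_by_value(int a)      { a *= 2; }    // gets a copy; the caller's variable is untouched
void double_by_pointer(int* a)   { (*a) *= 2; } // gets an address; dereferencing modifies the caller's variable
void double_by_reference(int& a) { a *= 2; }    // gets an alias; value syntax, but modifies the caller's variable

int main() {
    int x = 21;
    double_by_value(x);     // x is still 21
    double_by_pointer(&x);  // x is now 42
    double_by_reference(x); // x is now 84
    std::cout << x << '\n'; // prints 84
}
```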

3 Likes

Clearly I wasn’t a great student.

2 Likes

Currently chasing down busted object references (and/or the lack thereof) in a .NET Core app I started last year. Due to a persistent bug in my IDE, I updated it and the main libraries, which have changed substantially in that timeframe. The underlying paradigms and syntax are still the same, of course, but the handling of MVC endpoint routing is now different enough that training material produced in 2019 would be telling me that I’m doing it wrong.

3 Likes

When I was learning C++ back in the 1990s, pointers were what messed everybody up. It was where these kids who grew up programming their Commodore 64s and never wanted to do anything else suddenly switched majors from CS to Management. Not only is the difference between a and *a important, but *a is “the value pointed to by a”, not “the value of a”. Everyone who didn’t understand that (and didn’t understand why it was important) got weeded out.
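For anyone following along, the whole weed-out lesson fits in a few lines (illustrative):

```cpp
#include <iostream>

int main() {
    int a = 5;    // a holds a value
    int* p = &a;  // p holds the address of a
    *p = 10;      // *p is "the value pointed to by p": this writes to a
    std::cout << a << '\n'; // prints 10; p itself never changed
}
```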

7 Likes

I would argue those C64 kids should’ve waited until JavaScript went from a half-baked idea for passing state between HTML elements to literally the Only Language that Really Matters to Capitalism™.

No pointers. A deeply gray area between reference and value types. No type safety. Really shitty object-orientedness. You can whip up a decent-looking front end in 5 minutes and then spend the next 3 years trying to make it work right while switching from React to Polymer to Lit to Sweet Jesus every other week.

And you don’t need to understand what a von Neumann architecture is, you don’t need to know “complex” computer math (omg 2’s complement and 3’s a crowd, shift … left and right? No way man I don’t do politics). You don’t even need any of that fancy I/O stuff because there’s a package for that (and if you’re lucky that package won’t be yet another rat hole that will suck your productivity down the web-toilet… the “weblet” as I call it).
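For the record, here’s the sum total of that scary math, sketched in C++:

```cpp
#include <bitset>
#include <cstdint>
#include <iostream>

int main() {
    // Two's complement: negate by flipping every bit and adding 1.
    int8_t six = 6;
    std::cout << std::bitset<8>(static_cast<uint8_t>(six))  << '\n'; // 00000110
    std::cout << std::bitset<8>(static_cast<uint8_t>(-six)) << '\n'; // 11111010

    // Shifts: left by n multiplies by 2^n, right by n divides by 2^n.
    std::cout << (6 << 1) << ' ' << (12 >> 2) << '\n'; // prints "12 3"
}
```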

Don’t ask me what the state of the web will be when the people who understand how to actually code stuff all retire. Who’s going to fix, debug, and rewrite the packages that actually do more than make pretty webpages?

(okay that was a bit of a rant, and I’ll gladly show you where on this doll javascript hurt me…)

11 Likes

:heart:++

5 Likes

Don’t hide behind innuendo and hyperbole, tell us how you really feel.

7 Likes

Python isn’t much better, except it’s for the back end and not the front end.* Shitty object-orientedness, a complete lack of type safety passed off as a feature and not a bug, a big ole shit pile of static and dynamic types all mixed together. But hey, you can import an entire project from God knows where and expect it to basically do all your work for you instead of fucking up your code.

And all that nonsense is creeping into C++ and calling itself C++14.

I don’t know to what extent Java and C# use auto typing, but their dynamic memory management is a hot mess regardless.

Obviously, my answer is: you don’t learn any language first. You forget all of them, then go back in time to 1990 and learn C++ from Stroustrup himself.

Problem is, the time machine is written in Python with a JS front end, so you find yourself stuck in the year -2^32 when wild NoneTypes roamed the earth.

*See how I could have made a “Python for my back end” joke but didn’t? Congratulate me.

9 Likes

Yo mama’s so BASIC… etc

6 Likes

In a lot of ways, I am the programmer with the best argument for Impostor Syndrome: started out in graphics, took up HTML & Friends shortly before the turn of the century, then learned VB courtesy of Access and eventually became a “full stack” developer in the .NET world.

Yet even I, with my non-CS-educated brain “ruined by BASIC”, am horrified over and over (and over again!) by the inherent messiness that is dynamic typing.

Yes, I understand it, know how (and certainly why!!!) to check for data types and various flavors of null in twelve different ways, and even occasionally do some clever things with it, but seriously, what the ringtailed fuck? Why, why, why?

What fraction of the code that humanity generates serves solely to avoid the various problems caused by dynamic typing? I keep thinking that as I gain more experience and deeper insight, I will eventually understand the underlying wisdom of it all, but so far: nope, not yet. Maybe next year?

8 Likes