I generally like git.
It has a huge learning curve and even after using it daily for 5 or so years I can’t say that I understand many of its intricacies. I still need to pull up a tutorial or man page at least once a week when things shit the bed. Its inconsistent and often confounding naming conventions don’t help matters any. You can tell it was written by engineers. That being said it’s a pretty good RCS, warts and all.
I like how I can do a lot of work with minimal or no network connectivity, and it’s so easy to do branching and collaborative development. These are things that other RCSes I’ve used in the past really suck at.
Can you break it into smaller merges? It seems like it would be really difficult to review 96 changes across dozens of files while keeping track of the plot.
Perhaps, but that’s not really the process at this company. You complete your feature, the MR gets reviewed, and the code gets merged into the development branch and deployed to staging for testing. Merging half or a third of a feature would just make things more complicated downstream.
Why not?
Well, I learnt some new stuff here. Thanks, Mark.
To be clear I don’t think it is a disaster to do so, but I don’t think it is great either.
So the upside of a learner language is that it is indeed designed to be easy to learn, and sometimes that succeeds (Pascal, for example, I do not think is successful; it isn’t notably easier to learn than a real language). Learner languages are also more limited in what they can do.
So if you learn a learner language, but your dream is to ship the very best iOS calculator app, well, once you are done learning you can throw away all the language-specific things you learned and start again on a language you can actually deploy. Granted, learning a second language is a lot easier than learning a first, but in my opinion not by enough to justify the waste of learning a whole throwaway syntax and type system.
Alternately, if you learn a learner language and your goal is to ship an Android game, well, once again, when you are done learning you get to throw away all that language-specific stuff.
On the other hand, if you learn, say, Java, it is harder because Java has a weight of libraries and grafted-on features, and you may wonder why you learned about Array when most of the time you see ImmutableList or some other Java 8 streams thing; why isn’t it all just in the stuff you started with? However, you do get to use most of what you learned, at least if you were planning on writing an Android game. If you wanted to write that iOS calculator, you picked the wrong first language.
In my opinion the best language to learn first is something that can solve a real problem you have, and if possible has an interpreter (or a fast enough compiler that it pretends to be an interpreter like Swift’s playground). It is best if it is still popular enough that you have a choice of tutorials and a community that can help. It is best if it can help you solve other problems later.
That doesn’t really rule out a learning language. For example, if the “real problem” you have is that you want to write the kind of simple game you can actually write in Scratch (a visual language designed for learning), then yep, Scratch ticks most of the boxes I set out. In fact, in that case learning something like Python first might be worse, because I think Python is farther from your initial goal (or I may be wrong, if it has some easy-to-use sprite libraries for the platform you are using; see the sketch below).
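For what it’s worth, Python does have at least one easy sprite library: pygame (just my pick for the example, and you’d install it separately with pip). This is only a rough sketch of the simplest Scratch-style starter project I can think of, a lone square you steer with the arrow keys:

```python
# Minimal pygame sketch: one keyboard-movable sprite in a window.
# Assumes pygame is installed (pip install pygame).
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()

class Player(pygame.sprite.Sprite):
    def __init__(self):
        super().__init__()
        self.image = pygame.Surface((32, 32))
        self.image.fill((200, 60, 60))            # plain red square stands in for artwork
        self.rect = self.image.get_rect(center=(320, 240))

    def update(self, keys):
        # Arrow keys nudge the sprite 4 pixels per frame.
        if keys[pygame.K_LEFT]:
            self.rect.x -= 4
        if keys[pygame.K_RIGHT]:
            self.rect.x += 4
        if keys[pygame.K_UP]:
            self.rect.y -= 4
        if keys[pygame.K_DOWN]:
            self.rect.y += 4

sprites = pygame.sprite.Group(Player())

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    sprites.update(pygame.key.get_pressed())
    screen.fill((30, 30, 30))
    sprites.draw(screen)
    pygame.display.flip()
    clock.tick(60)                                # cap the loop at 60 frames per second

pygame.quit()
```

Whether that is more or less approachable than dragging blocks around in Scratch is exactly the judgment call I mean.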
People always talk up the distributed thing and then proceed to use it exactly like cvs.
This cannot be overstated. You personally have to be invested in solving some problem with code; otherwise you run the risk of being overwhelmed by information and giving up. Being driven to arrive at a solution will keep you on task regardless of the choice of language you made.
But how to choose that first language when you know nothing about languages? Look at the dominant language of the environment you’ll be working in. Is it an iOS project? Plan to learn Swift. Will you be doing a lot of statistics modeling and chart generating? Consider learning R. Is it a Microsoft Windows web app? Check out C#. Are you going to script existing things together? Python runs everywhere but is native to nowhere, shell scripts are the mother tongue of the Unix/Linux world, while PowerShell is the most capable Windows scripting language.
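To make the “script existing things together” case concrete, here is the kind of throwaway glue script I mean. It is only a sketch; the photo folder and the ImageMagick convert command are stand-ins for whatever files and tools you actually have lying around.

```python
# Glue-script sketch: shrink every JPEG in a folder by handing it to an
# existing command-line tool. The folder name and the convert call are
# placeholders; swap in whatever you actually need to automate.
import subprocess
from pathlib import Path

for photo in Path("holiday-photos").glob("*.jpg"):
    thumb = photo.with_name(f"thumb_{photo.name}")
    # ImageMagick's convert does the real work; Python just wires it up.
    subprocess.run(
        ["convert", str(photo), "-resize", "320x240", str(thumb)],
        check=True,
    )
    print(f"wrote {thumb}")
```

Python isn’t doing anything clever there; it’s just the duct tape between things that already exist, which is exactly the job it’s good at.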
Also, consider what resources you have access to and who you might be working with. Do you have a friend who is a web developer and knows Spring Boot, and you’re trying to write a web app? Learn Java and Spring, and lean on your friend for jump-starting advice; see if you can ask them to do some pair programming with you at the keyboard. Do you have a local Community Ed teacher you could take classes from? It might be worth investing time in more formal training to learn some of the core concepts that are central to all programming languages, and branching out from there.
Yeah, I’ve seen too much of that. “Checking out” source code is what my dad had to do sixty years ago (because it was a box of cards kept in a library!)
It’s always surprising to me when I have to point out to seasoned developers that computers have changed dramatically over the last sixty years, so why would they think software engineering practices haven’t also changed dramatically?
History teaches us lessons. We should be better than our history.
Thanks for the detailed answer but I don’t understand why you need to have a “real problem” at all in order to learn to program. Why can’t you do it for the fun of it, or just to gain a basic understanding of how programming works? I think this attitude of being all super-serious about programming just turns off people who might otherwise be interested in trying out a course on beginning Python on Coursera or whatever.
I also believe society as a whole would benefit if we could get more non-IT people into programming by getting them started with “learner languages.” If nothing else, it would encourage analytical thinking and reinforce the importance of programming skills for the jobs of the future, including those not in the IT industry.
Sorry, but… you don’t. I started learning Python earlier this year and I just do it for fun, as a hobby. I used to do logic puzzles in magazines but this is more enjoyable. I mean, I am invested in solving whatever problem the instructor gives me for a given lesson’s topic, but beyond that I am not invested in solving any larger problem with the knowledge I’m learning. Not everything has to have a utilitarian purpose. We assume that people can make art for its own sake, but programming has to have some defined end goal? I wish it weren’t this way.
But you are invested; invested in solving the puzzle, invested in acing the software class, invested in making a web site for your social club, or invested in getting a task done at work. It’s the commitment to finish that’s going to take you further and further down the rabbit hole, where you will gain more and more experience.
My favourite git “resource” is this delightful man page generator.
Ok, that’s absolutely hilarious. Did they just feed all of the git man pages through a Markov chain generator?
Like this:
https://git-man-page-generator.lokaltog.net/#85774d539ced0a5231f5099618eeb2a0
And this:
https://git-man-page-generator.lokaltog.net/#b2d27fbe3557e9b6daedcfbb22529a83
Are just so damn convincing.
Right, at some level that is true! But my point is that not everybody is interested in going “further down the rabbit hole” or gaining more experience, nor do they need to. I wish we would emphasize that point more, as I think it would encourage more people to start doing it. Thanks for responding.
I got taught C++ then Pascal, in that order (I can’t code for shit).
In the UK, back in the 1980s, the BBC led a project to teach programming skills, and the BBC Micro was extensively adopted in schools for just that purpose. Then, in the 1990s, they trashed the teaching of programming skills in favor of teaching Word and Excel.
Here in the US, we never had a nationwide effort of that nature, but my own early-1980s high school experience involved a DEC PDP-11, with BASIC in my junior year and COBOL in the senior class. I can only wonder whether my high school later switched to Word and Excel rather than programming; it wouldn’t surprise me if it did.
I’m glad to see the Raspberry Pi take off - making a capable, versatile computer that can be had for the price of a few restaurant meals (or, for the Zero, the price of a fast-food meal!) goes a long way towards rectifying the “computer literacy only means knowing how to use Office” debacle.