Here's a working paperclip computer from the 1967 book "How to Build a Working Digital Computer".

Can anybody here recommend a resource for learning about how to build a Turing-complete computer starting with basic NOT/AND or NOT/OR logic gates?

I have found this and it seems pretty good, but something even a little more summarized and less practically-oriented would suit my purposes well:

Web-based resources would be great as well as books. :slight_smile:


That looks like a good general overview of computer architecture. There are plenty of CS textbooks, but from what I’ve seen they’re (like a lot of textbooks) poorly written and edited. They go into more detail and drill into individual topics rather than offering a general survey as that book appears to, but without a classroom to guide you through lectures and discussion sections, I wouldn’t recommend starting with them. I’d start with the one you linked to instead.

I will recommend this book as an overview of programming.

But before you begin either book, to get the most out of them, I’d learn the basics of an object-oriented programming language such as Python, Java or C++ if you’re not already familiar with one, as it will make both subjects easier to grasp.


Appreciate the tip! Goal is to learn more about how basic Boolean logic gates can be assembled into a computationally complete system, vs learning a skill like programming.
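Not a resource, but as a quick illustration of what "computationally complete" means at the gate level: here's a minimal Python sketch (names and structure are my own, just for illustration) showing that NAND alone, itself built from NOT/AND, can generate every other gate, all the way up to a half adder:

```python
# Functional completeness: every Boolean function can be built from NAND alone.
def NAND(a, b): return not (a and b)

def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))
def XOR(a, b): return AND(OR(a, b), NAND(a, b))

# A half adder: the first step from bare gates toward arithmetic.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)  # (sum bit, carry bit)
```

Chain half adders into full adders and you have an ALU's core; add storage and sequencing and you're on the road to Turing completeness.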


I know what you need: Turing Tumble!

It’s a ‘board game’, where you can assemble logic gates on a peg board and demonstrate how to string them together to compute something. Very hands on!


Oh I love it, thanks! :pray:

You might want to look into Redstone computers in Minecraft. They are built from basic logic gates and have been scaled up to quite large contraptions. It’s a bit goofy looking but the logic is exactly the same.



For me as a kid it was transistors → gates → flip-flops → counters → comparators (gates again), and we’re off.

It was the flip-flop’s ability to retain state and then have that state compared, and the counter’s ability to sequence operations, that were my :bulb: moment.

Clearly, there are other approaches, and this one is way over-wired, however…

It was that sequence of devices, which I could build at home from transistors and cheap 74xx-series chips, that was the germ of my understanding.
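That state-retention light-bulb moment can even be sketched in software. Here's a toy Python model (my own simplification, not from any of the books mentioned) of a cross-coupled NOR SR latch: because each NOR's output feeds the other's input, the stored bit "sticks" after the set signal is released:

```python
def nor(a, b):
    return not (a or b)

def sr_latch(s, r, q):
    """One settling pass of a cross-coupled NOR SR latch.

    s: set input, r: reset input, q: current stored state.
    The loop iterates the feedback path until the outputs stabilize.
    """
    for _ in range(4):
        q_bar = nor(s, q)   # each gate's output...
        q = nor(r, q_bar)   # ...feeds the other gate's input
    return q

q = False
q = sr_latch(True, False, q)   # pulse Set: q becomes True
q = sr_latch(False, False, q)  # inputs released: q is *retained* as True
q = sr_latch(False, True, q)   # pulse Reset: q becomes False
```

Two transistors and a pair of resistors on a breadboard behave the same way, which is presumably why that stage of the ladder is so memorable.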

Having a Dad who had programmed the CDC-6600 at AECL in Chalk River, and who managed a “straight-8” transistor-based PDP-8 as well, was a big help too…

(Edit:) And having, in Grade 6 :thinking: :face_with_raised_eyebrow: :astonished: a school library with The Scientific American Book of Projects for the Amateur Scientist (oooh… PDF here…) which had some computer projects to get me thinking (as well as how-to’s for a home made atom-smasher and X-ray machine).


I used Digital Circuit Design by Niklaus Wirth to learn how to do this. I found it very simple and approachable, so it might be what you are looking for.



This is a really cool project; I’m sure you’ll enjoy it a lot. I built one in a class in college 20 years ago. It’s worthwhile because you really learn how a CPU works when you have to build it from scratch! :slight_smile:

We started with smaller projects like counters and adding devices, then went on to a full 4-bit computer. Honestly, if you have the time and the interest, doing this may help you understand it better too! :slight_smile:

I couldn’t find any good books on Amazon, which surprised me. I did find a blog article about it on the EE website; I’ve put the link below.

Good Luck! This is going to be so much fun for you! :slight_smile:


Really appreciate all the great tips, folks! It will help me tremendously in further developing my theory that the multiverse is a self-programming computer. :laughing:


This topic was automatically closed after 5 days. New replies are no longer allowed.