Reddit reviews: Introduction to Computing Systems: From Bits and Gates to C and Beyond

We found 22 Reddit comments about Introduction to Computing Systems: From Bits and Gates to C and Beyond. Here are the top ones, ranked by their Reddit score.


22 Reddit comments about Introduction to Computing Systems: From Bits and Gates to C and Beyond:

u/lyinch · 11 pointsr/EmuDev

Hey,

It looks like you haven't actually started with your emulator development. Begin by checking out the opcodes of the Chip-8, have a look at the hardware, and try to reason about why you won't have an x86 assembly file.
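To make that concrete, here is a minimal sketch in C of the fetch/decode loop a Chip-8 emulator is built around. The `Chip8` struct and function names are my own, and only a few opcodes are handled:

```c
#include <stdint.h>
#include <stdio.h>

/* Minimal Chip-8 state: 4 KB memory, 16 registers, index register,
   and a program counter that conventionally starts at 0x200. */
typedef struct {
    uint8_t  mem[4096];
    uint8_t  V[16];     /* registers V0..VF */
    uint16_t I;         /* index register   */
    uint16_t pc;        /* program counter  */
} Chip8;

/* Fetch one 16-bit big-endian opcode and advance the program counter. */
uint16_t fetch(Chip8 *c) {
    uint16_t op = (uint16_t)((c->mem[c->pc] << 8) | c->mem[c->pc + 1]);
    c->pc += 2;
    return op;
}

/* Dispatch on the top nibble; only a handful of opcodes shown. */
void execute(Chip8 *c, uint16_t op) {
    switch (op >> 12) {
    case 0x1: c->pc = op & 0x0FFF; break;                /* 1NNN: jump     */
    case 0x6: c->V[(op >> 8) & 0xF] = op & 0xFF; break;  /* 6XNN: VX = NN  */
    case 0x7: c->V[(op >> 8) & 0xF] += op & 0xFF; break; /* 7XNN: VX += NN */
    default:  printf("unhandled opcode %04X\n", op); break;
    }
}
```

Notice there is no x86 anywhere: the opcodes are Chip-8's own encoding, which is exactly why you won't end up with an x86 assembly file.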

In addition, have a look at the difference between those numbers and those numbers.

It looks like your understanding of computer architecture is quite limited - nothing to be ashamed of - you might be better off reading Digital Design and Computer Architecture by Harris and Harris or Introduction to Computing Systems: From Bits and Gates to C and Beyond by Patt and Patel than writing an emulator right now.

Both books give you an absolutely wonderful overview of computer architecture, starting with boolean logic, the transistors, and the gates you can form with them, going over combinational and sequential logic, and ending with the design of your own CPU written in an HDL for an FPGA (which few programmers encounter in their careers). The latter even gives you a detailed overview of multicycle processors, and goes beyond that to analyse GPUs and some more modern techniques for improving multicycle processors.
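For a taste of the gate-level material those books start from, here is a sketch in C of a one-bit full adder built only from AND/OR/XOR operations, chained into a 4-bit ripple-carry adder (the names are my own, not from either book):

```c
#include <stdint.h>

/* One-bit full adder expressed purely with gate-level operations. */
typedef struct { uint8_t sum, carry; } AdderOut;

AdderOut full_adder(uint8_t a, uint8_t b, uint8_t cin) {
    uint8_t s1 = a ^ b;                    /* first XOR gate          */
    AdderOut out;
    out.sum   = s1 ^ cin;                  /* second XOR gate         */
    out.carry = (a & b) | (s1 & cin);      /* two AND gates, one OR   */
    return out;
}

/* Ripple-carry: chain four full adders to add two 4-bit numbers. */
uint8_t add4(uint8_t a, uint8_t b) {
    uint8_t carry = 0, result = 0;
    for (int i = 0; i < 4; i++) {
        AdderOut o = full_adder((a >> i) & 1, (b >> i) & 1, carry);
        result |= (uint8_t)(o.sum << i);
        carry = o.carry;
    }
    return result & 0xF;   /* overflow wraps, as in real 4-bit hardware */
}
```

The books build adders like this out of actual gate schematics; the C version just mirrors the same logic in software.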

As I'm giving out advice, you might want to have a look at OpenGL, which is used to render modern games, or SDL, which is more common in the emudev community. (Be aware that SDL sits a few abstraction layers above OpenGL and can use it under the hood.)

u/trixandhax · 7 pointsr/gatech

Here

Ubuntu is recommended, but you can use some other distro. However, all of the support I'm doing is only for Ubuntu-based distros. Some others have done things to get it working on other distros, like Arch.

And here is the book

And as a bonus [here](https://drive.google.com/open?id=0B6g7zcZaFwPTVVI1eDBYcG1GbE0) is a presentation I did last semester which includes an overview and some sample programs.

Edit: fixed the 3rd link.

u/Drcool54 · 5 pointsr/UIUC

Okay, I came into school like you, with very little programming experience. Probably even less than you, since I only messed around on my TI. I am going to assume you're only taking ECE 110 first semester. If not, I recommend getting in as soon as you can. They may give you some crap about it depending on last names, but it doesn't really matter; after a certain point it's open to everyone.

Either way, programming in ECE doesn't really start until you take ECE 190, which is all C programming plus a very simplified assembly language used for educational purposes. Like I said, I went into the class with practically zero programming experience and still did very well, so don't let anyone scare you on that. If you set aside the time to read the book (really helpful in 190), do your MPs, and ask the TAs questions, you will do fine.

I wouldn't fret too much over the summer about learning stuff, but I would definitely recommend C over Python. Python is pretty easy to pick up, but it's also very high level. If you need an introductory language to get familiar, you can try Python for a bit, but I'd go with C after that. It is worth noting that the other two required programming classes you have to take (CS 225 and ECE 391) are C++ and C/x86 respectively, so learning C should definitely be your focus.

I recommend the book written by the creators of the language. The book the school requires is pretty good too, actually, and would give you a better idea of what to expect. They're kind of pricey, so it's your call how you want to get them. As a heads up, Codecademy does have Python, but not C as far as I recall. I've never used Lynda, so I can't comment on it. C Book | ECE 190 Book

I honestly wouldn't fret too much about it all. Enjoy your summer; depending on how busy your schedule is next semester, you can probably set aside some time now and then to study some languages. If you have any more questions, I'd be happy to answer.

u/schreiberbj · 3 pointsr/compsci

This question goes beyond the scope of a reddit post. Read a book like Code by Charles Petzold, or a textbook like Computer Organization and Design or Introduction to Computing Systems.

In the meantime you can look at things like datapaths which are controlled by microcode.

This question is usually answered over the course of a semester long class called "Computer Architecture" or "Computing Systems" or something like that, so don't expect to understand everything right away.

u/epakai · 2 pointsr/C_Programming

I liked Introduction to Computing Systems: From Bits & Gates to C & Beyond. It starts out at the low level and uses a made-up instruction set architecture, the LC-3 (or LC-2 in older editions). Simulators are available, so you can actually run code for this machine.
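For a flavor of the LC-3's 16-bit instruction format, here is a sketch in C that decodes and executes its ADD instruction (opcode 0001), following the field layout the book documents: bits 15-12 opcode, 11-9 DR, 8-6 SR1, bit 5 selects immediate mode, bits 4-0 a sign-extended imm5 (or bits 2-0 SR2). The helper names are mine:

```c
#include <stdint.h>

/* Sign-extend the low `bits` bits of x to 16 bits. */
int16_t sign_extend(uint16_t x, int bits) {
    if ((x >> (bits - 1)) & 1)
        x |= (uint16_t)(0xFFFF << bits);
    return (int16_t)x;
}

/* Execute an LC-3 ADD instruction against a register file reg[0..7]. */
void lc3_add(int16_t reg[8], uint16_t instr) {
    uint16_t dr  = (instr >> 9) & 0x7;
    uint16_t sr1 = (instr >> 6) & 0x7;
    if (instr & 0x20)                            /* bit 5 set: immediate mode */
        reg[dr] = reg[sr1] + sign_extend(instr & 0x1F, 5);
    else                                         /* register mode: SR2 in bits 2-0 */
        reg[dr] = reg[sr1] + reg[instr & 0x7];
}
```

A full LC-3 simulator is mostly a loop of this kind of field extraction, one case per opcode, which is why the machine is so approachable.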

u/6553321 · 2 pointsr/learnprogramming

Along the same lines: http://courses.ece.illinois.edu/ece190/info/syllabus.html. There are videos there from the course that I learned programming in. It tells you what a computer is, from the transistor and gate level through finite state machines, finally working its way up to C. It is based on the book From Bits and Gates to C and Beyond, and the lecturer is one of the co-authors.

u/prylosec · 2 pointsr/learnprogramming

As someone who has gone through the rigors of learning assembly programming for various architectures, I believe I can offer a bit of insight to the situation.

The Art of Assembly Programming (AoA) is great for going right into the x86 architecture; however, it is very complicated and there may be a bit of a learning curve going straight into it from the get-go.

Patt & Patel have a great book that starts from how gates are organized to form basic components, and moves into assembly programming for an artificial processor called the LC-3. It then goes into programming in C for this processor.

After going through the previous book, I started AoA and got into x86 programming, which, even with a solid understanding of the basic fundamentals of computer organization, still had quite a learning curve.

If you are ok with your education taking a bit longer, you might look into working up through a few different processors. It will give you a very robust knowledge base, and will give you insight to assembly programming on other processors, as well as give you a few more "tools" for playing around with.

Here are some fun things to learn:

-LC-3: Incredibly basic, and has no "real-life" applications, but exists solely to teach basic computer organization. Great for starting out.

-PIC 16-series: Very simple and cheap microcontrollers, but with a lot of "higher-level" functionality. These are great for playing around with basic robotics and making fun toy projects. Plus, they use the Harvard architecture, which is very interesting and quite different from the "standard" processors we find in modern computers.

-6502: This was mentioned in this thread and it's great. The 6502 was very popular back in the day (Apple II, Atari 2600), and learning it can lead to some fun projects. I once wrote an Atari 2600 emulator in Java, and it was a lot of fun.

Structured Computer Organization by Andrew Tanenbaum (creator of Minix) is really good for learning some more complex computer organization topics such as pipelining, and it goes into detail about how the JVM works.

Once I had a solid base understanding of x86 programming, I started getting into Malware Analysis & Reverse engineering. This is a hot topic right now and there are a ton of resources out there from beginning to advanced.

u/PartyProduct · 2 pointsr/computerscience

If you are looking specifically for the basic hardware stuff, this is the book that I used. It is a textbook, but it is written well. https://www.amazon.com/Introduction-Computing-Systems-Gates-Beyond/dp/0072467509

u/somethingtosay2333 · 2 pointsr/hardware

Depending on depth, if you just want hardware (since you're asking in r/hardware) without regard to the computational side of computing, then perhaps a free A+ certification course on YouTube or a test-prep book would be helpful.

Later, if you wish to learn the theoretical side, then I recommend a CS 101 course. A good introductory computer science textbook will introduce things like abstraction and logic using gates, then move into how programs are encoded into circuits. Good ones also give an overview of other things like complexity and networks without bogging down the reader in abstract mathematics. Although I haven't read it, I think this looks good - https://smile.amazon.com/Introduction-Computing-Systems-Gates-Beyond/dp/0072467509?sa-no-redirect=1

Another resource might be NAND to Tetris where you build a working computer.

u/goodolbluey · 1 pointr/explainlikeimfive

Hey, I just started a class about this subject! Let me share what's in the first chapter of my book.

One of the main principles behind the architecture of a computer is called abstraction, which means, more or less, that each layer is separated from the next. When someone drives a car, they know what the dashboard lights mean and what the pedals do, but they might not necessarily know how the internal combustion engine works; as long as it's working, they don't need to. This lets different people work on different "layers" of the computer in different ways.

Think of a computer program as solving a problem. You need to do math? There's a calculator to solve that problem. You need to read an email? There's an email client to solve that problem. So, at the top layer, you have something you want to accomplish, and you need to tell the computer how to do it.

The next layer is one of algorithms -- giving the computer step-by-step instructions to accomplish the task. We design ways for computers to sort lots of data quickly, or to ask another computer for specific data.

To implement those algorithms, you need to use a programming language. Sometimes they're called mechanical languages because, where a natural language like English or Spanish can be ambiguous, mechanical languages never are: they always mean exactly what they say. Some examples of programming languages are C++, Java, Fortran, COBOL, and Pascal. Here, you can tell the computer, for example, IF a button is clicked, THEN a window should open.

But how does a computer understand a programming language? Programming languages are often translated by a compiler, which turns the language into instructions understood by the computer's ISA, or "instruction set architecture." The most common ISA is x86, from Intel. PowerPC is another ISA, which Mac computers used to use. The ISA defines the instructions, data types, and addressing modes that a computer can understand.
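As a tiny illustration of that translation step, here is a one-line C function with, in the comment, the kind of LC-3-style instruction sequence a compiler might lower it to. The "assembly" is illustrative only, not a real compiler's output:

```c
/* A high-level C function, and a hypothetical lowering to an
   LC-3-flavored ISA, showing how one source line becomes
   machine instructions. */
int add_three(int x) {
    /* Hypothetical lowering:
         ADD R0, R0, #3   ; x arrives in R0, add the immediate 3
         RET              ; return; result stays in R0           */
    return x + 3;
}
```

The same C line would compile to entirely different instructions on x86 or PowerPC; that independence between layers is the abstraction at work.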

The ISA is implemented in the next layer by a detailed organization called a microarchitecture. If an ISA is a type of chip, a microarchitecture is a specific chip, like a Pentium IV or a PowerPC 750FX. It communicates directly with the chip hardware.

The next layer is the logic circuit - I think (but I'm not 100% sure) that this has to do with the path operations take on the chip itself. Different operations will be performed in different ways, depending on the speed or power requirements of the operation.

The last layer is the device itself - my book gives examples of CMOS circuits, NMOS circuits, and gallium arsenide circuits, but it doesn't go into more detail than that. This is the physical layer, where transistors and diodes live.

I hope this helps give you a little more perspective! I'm still learning about this stuff too, and I think it's pretty neat.

u/captkckass · 1 pointr/askscience

OP, if you are really interested in learning all this: I read this book for a class at Colorado State University. It was a really good book.

http://www.amazon.com/Introduction-Computing-Systems-gates-beyond/dp/0072467509

u/HidingFromMyWife1 · 1 pointr/ECE

If you're specifically into computer architecture, I'd suggest buying a college textbook and working through some of it. The University of Illinois used this book when I was there. I found it very useful as an intro to computer architecture. It uses the LC-3, a 16-bit fixed-length instruction set, to teach you the basics of computer architecture. A next step would be implementing this processor in either a simulation environment or possibly even on an FPGA.

u/miguel_sucks · 1 pointr/compsci

My favorite CS course at Penn was Intro to Computer Architecture. It went through each abstraction layer from transistors to C. It was very digestible and a good prerequisite to a thorough OS or computer system design course. It looks like the lecture slides are still available here, and the textbook is here.

u/Pillagerguy · 1 pointr/UIUC

https://www.amazon.com/Introduction-Computing-Systems-Gates-Beyond/dp/0072467509

Look for a PDF of this book somewhere. I don't have one.

u/kickopotomus · 1 pointr/computerscience

If you want to better understand how computers work and take a bottom-up approach into programming, check out Patt's Introduction to Computing Systems.

u/minagawargious · 1 pointr/UTAustin

Get into Yale Patt's Introduction to Computing class (EE 306), get his textbook for the class, and start reading it over the summer.

u/ClimbingWolfBear · 1 pointr/AskNetsec

https://www.amazon.com/Introduction-Computing-Systems-Gates-Beyond/dp/0072467509

This was the vast majority of my undergrad computer engineering degree. You can end up with most of the functional knowledge without a lot of the mathematical background.

u/croyd · 1 pointr/books

Yale Patt's book.

I took the course based on/around this book with the author, Yale Patt, my freshman year. I basically went from knowing nothing concrete about computers to having a pretty good idea of how I could make one if I had, say, a tub full of transistors.

(The book went hand-in-hand with the class; most of the lectures and examples were drawn almost entirely from the book.)

Also, the book basically takes you from practically the lowest level, transistors, all the way up to a higher level programming language, C.

u/[deleted] · 1 pointr/ComputerEngineering

It might not be exactly what you're looking for, but my school has a required freshman computer architecture class based around the LC-3 architecture. It's simpler than x86 or ARM, but it can run the C language, and in my experience it is better for an intro to architecture.

The slides are all online for free: https://wiki.illinois.edu/wiki/pages/viewpage.action?pageId=697870760

The first half or so of the slides, to my memory, aren't architecture-specific, and they have lots of info on gate-level architecture, which seems like what you are looking for.

There is also a companion book for the course that wasn't required but was pretty helpful/well written: https://www.amazon.com/Introduction-Computing-Systems-Gates-Beyond/dp/0072467509/ref=sr_1_1?keywords=patt+and+patel&qid=1566327880&s=gateway&sr=8-1

And if you want to look into LC-3 assembly coding, there are LC-3 simulators/compilers available for download online as well.

u/Merad · 1 pointr/askscience

> I wanted to write a program to emulate a CPU so I could fully understand how its operation actually worked, but I have no idea what the "start up" process is, or how we determine when to get a new instruction

The CPU begins loading instructions at a fixed address known as its reset vector. On AVR microcontrollers the reset vector is always at address 0, and it is always a jump instruction with the address where startup code actually begins. On x86 processors the reset vector is at address 0xFFFFFFF0. For a CPU emulator, which presumably doesn't need to deal with interrupt vectors or anything like that, I would just start loading instructions from address 0.
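A minimal sketch of that startup process for a toy emulator in C: the instruction set below is invented purely for illustration, and the only point is that execution starts at a fixed reset address (0 here) and then repeats fetch/decode/execute until a halt:

```c
#include <stdint.h>

/* A made-up three-instruction ISA for demonstration. */
enum { OP_HALT = 0, OP_LOADI = 1, OP_ADD = 2 };

typedef struct {
    uint8_t mem[256];
    uint8_t acc;        /* accumulator            */
    uint8_t pc;         /* program counter        */
    int     halted;
} Cpu;

void run(Cpu *c) {
    c->pc = 0;          /* reset vector: the fixed address where fetching starts */
    while (!c->halted) {
        uint8_t op = c->mem[c->pc++];            /* fetch               */
        switch (op) {                            /* decode + execute    */
        case OP_LOADI: c->acc  = c->mem[c->pc++]; break;
        case OP_ADD:   c->acc += c->mem[c->pc++]; break;
        case OP_HALT:  c->halted = 1; break;
        default:       c->halted = 1; break;     /* unknown opcode: stop */
        }
    }
}
```

Knowing "when to get a new instruction" falls out of the loop structure: every trip through the loop fetches exactly one instruction from wherever the program counter now points.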

Also, you should look at some of the simplified CPU designs that have been made for teaching. In my classes we used the LC-3 (a very simple and straightforward design), then moved to the y86 (a simplified x86 design mainly used to teach pipelining). It will be much more realistic to make an emulator for one of them rather than for an extremely complex real-world design. I've linked below to textbooks that are based around each of those designs.

http://highered.mheducation.com/sites/0072467509/index.html

http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040

u/Loupiot · 1 pointr/france

I'll give you an opinionated take.

First, in programming you should distinguish the "algorithmic" approach from a more, let's say, "architectural" approach to things.

To start out, I can't recommend the "algorithmic" approach enough: learning to solve simple problems. Start from an input, learn to cleanly compose functions that work well together, and produce an output. Understand recursion, iteration, mapping, and so on. For that you want a very high-level language, close to "pseudocode". Python is a good idea; for my part, I'll suggest Scheme.

There are excellent resources for learning to code in Scheme, notably How to Design Programs, which is impressive. That course uses a language close to Scheme called Racket. If you dive into it, you'll be busy for a long while, and you'll develop a very well-constructed approach to programming. Don't judge it by the Prologue, which is there precisely to show you the approach you should avoid. There is a free MOOC to go with it. The next step, if you really want to push things (very) far, would be Structure and Interpretation of Computer Programs. Personally, I'd advise coming back to that much later.

Once you've developed "algorithmic" skills, it would be good to look at a more "architectural" approach. For that I'd recommend Java, which will force you to think in terms of objects and to develop skills in organizing your code and in the strategy pattern (as someone said below, object-oriented programming is really just refactoring). There are good MOOCs on OpenClassRooms or Coursera, for example.

Starting with Java seems like a bad idea to me despite its strong typing, in the sense that the language will get in the way of "algorithmic" thinking (Java's syntax, the fact that everything is a class, etc.). Many universities are reconsidering Java as a first language.

Starting with a language like C doesn't seem like a good idea to me either, since it's a fairly low-level language that will force you to manage aspects of programming (memory and the like) that are very close to the machine. In my opinion it would make an excellent third language, after Java. Other people will tell you exactly the opposite; it's really a matter of taste. There are excellent books offering a bottom-up approach that starts with the machine (bits, logic gates, etc.) and works up to C: I'm thinking of Introduction to Computing Systems: From Bits and Gates to C and Beyond. I don't believe it's available for free. It all depends on your goals, but also on your tastes.

The worst approach, in my opinion: diving straight into projects without having developed a clear vision of how to structure your code. For that reason I would avoid, for example, resources like Automate the Boring Stuff with Python. It's great once you've already built mental "boxes" to put new knowledge into, but at the start it will leave you completely lost (though sure, why not do it in parallel for fun, to make your everyday digital life easier :)).

Good luck; learning to program takes several years.