Best machine theory books according to redditors

We found 787 Reddit comments discussing the best machine theory books. We ranked the 61 resulting products by number of redditors who mentioned them. Here are the top 20.

Top Reddit comments about Machine Theory:

u/ilknish · 482 points · r/learnprogramming

Code: The Hidden Language of Computer Hardware and Software.
It may be a bit lower level than you're looking for, but it'd be a great foundation to build off of.

u/samort7 · 257 points · r/learnprogramming

Here's my list of the classics:

General Computing

u/HeterosexualMail · 187 points · r/programming

We did something similar as well. The labs were tons of fun. I remember having to run a couple dozen lines of code through the CPU cache on a test once, including some sneakery of using code as data at one point. I do appreciate having done it, but I'm not sure how much practical lasting value that really contributed.

That said, for those who are interested in this there is The Elements of Computing Systems: Building a Modern Computer from First Principles, more commonly known as "NAND to Tetris".

Petzold's Code is excellent as well.

Edit: Actually, while I've suggested those two, let me throw Computer Systems: A Programmer's Perspective into the mix. It's a book we used across two courses and I really enjoyed it. We used the 2nd edition (and I have no issue recommending people get a cheaper, used copy of that), but there is a 3rd edition now. Being a proper textbook it's stupidly priced (you can get Knuth's 4 book box set for $30 more), but it's a good book.

Anyone have suggestions similar to that Computer Systems text? I've always wanted to revisit/re-read it, but could always use a different perspective.

u/falcojr · 103 points · r/programming

If you're really serious about learning, I HIGHLY recommend the book Code: The Hidden Language of Computer Hardware and Software. It's basically a book that 'builds' a computer from the ground up, starting with simple Morse code type stuff through the wire, and each chapter just keeps building until you get to assembly and some higher level language stuff at the end. You do have to think through (or glaze over) some stuff in a few chapters, but it's a very eye opening book.

Edit: Grammar

u/PM_ME_YOUR_MAKEFILE · 75 points · r/learnprogramming

CODE by Charles Petzold is the book to read to understand computers at a base level. It literally starts at a single bit and moves all the way up the stack. I cannot recommend this book enough for someone starting out.

u/Lhopital_rules · 64 points · r/AskScienceDiscussion

Here's my rough list of textbook recommendations. There are a ton of Dover paperbacks that I didn't put on here, since they're not as widely used, but they are really great and really cheap.

Amazon search for Dover Books on mathematics

There's also this great list of undergraduate books in math that has become sort of famous: https://www.ocf.berkeley.edu/~abhishek/chicmath.htm

Pre-Calculus / Problem-Solving

u/old_dog_new_trick · 58 points · r/learnprogramming

But How Do It Know? is a great introduction to how computers work.

u/srnull · 54 points · r/programming

Sorry to see this getting downvoted. Read the about page to get an idea of why /u/r00nk made the page.

I have to agree with one of the other comments that it is way too terse at the moment. I remember when we learnt about e.g. D-latches in school, and it was a lot more magical and hard to wrap your head around at first than the page gives credit for. That AND, OR, and XOR gates can be built up from just NAND gates (the only logic gate properly explained) is also glossed over. Either go over it, or don't show the interiors of the other logic gates.
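
To make that concrete, here's a quick Python sketch (an illustration added here, not something from the page under discussion) of how NOT, AND, OR, and XOR all fall out of NAND alone:

```python
def nand(a: int, b: int) -> int:
    """NAND outputs 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def not_(a):         # NOT x == NAND(x, x)
    return nand(a, a)

def and_(a, b):      # AND == NOT(NAND)
    return not_(nand(a, b))

def or_(a, b):       # OR, via De Morgan: NAND(NOT a, NOT b)
    return nand(not_(a), not_(b))

def xor(a, b):       # XOR built from four NANDs
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))

# Exhaustive truth-table check against Python's bitwise operators
for a in (0, 1):
    for b in (0, 1):
        assert not_(a) == 1 - a
        assert and_(a, b) == a & b
        assert or_(a, b) == a | b
        assert xor(a, b) == a ^ b
```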

The interactive stuff is really neat. Good work on that.

Edit: If anyone reading wants to learn this stuff in more detail, two good books are

u/reddilada · 51 points · r/learnprogramming

CODE: The Hidden Language of Computer Hardware and Software is a great book written for the non-tech crowd. It gives a good basis for what computers are all about.

If he works in an office, I'd point him to Automate the Boring Stuff with Python as it will deal with things he is probably already familiar with.

u/indrora · 42 points · r/programming

I think one of the most profound articles I read about just what makes a computer "work" is Brian Kernighan's article, "What an educated person should know about computers". It's a decade old now and was developed into a book of similar name. (Amazon link)

Another was sitting down and reading Code: The Hidden Language of Computer Hardware and Software (Amazon link) and actually walking through it. The book is coming up on 20 years old, but Petzold (who has taught many a developer how to do fancy tricks with their silicon) really sat down and wrote a book that anyone could understand and come away from feeling better off and more knowledgeable about the way our world works. This is the book I refer new programmers and old knitting-circle nannies to when they ask how a computer works.

u/greentide008 · 42 points · r/compsci
u/myfavoriteanimal · 40 points · r/compsci

Code, by Charles Petzold

Here it is on Amazon.

u/Pally321 · 33 points · r/mildlyinteresting

If you're serious about getting into software development, I'd recommend you start looking into data structures and algorithms as well. It's something I think a lot of people who were self-taught tend to miss because it's not required knowledge to program, but it will give you a huge competitive advantage.

While I haven't read it, this book seems like a good introduction to the concept: https://smile.amazon.com/dp/1617292230/?coliid=I34MEOIX2VL8U8&colid=MEZKMZI215ZL&psc=0

From there I'd recommend looking at MIT's Intro to Algorithms, 3rd Edition. A bit more advanced, but the topics in there will play a huge role in getting a job in software.

u/wall_time · 32 points · r/programming

Charles Petzold also wrote Code: The Hidden Language of Computer Hardware and Software. It's a great book. I'm sure most of the people browsing this subreddit will already understand most of what is in the book (or have read it already), but it's a fantastic read nonetheless.

u/KobayashiDragonSlave · 28 points · r/learnprogramming

Not OP but I discovered this book 'Grokking Algorithms' from a fantastic YouTube channel 'The Coding Train'. The book explains a lot of the algorithms and data structures that I am learning in my first sem of CS at school. This thing even has the stuff that I am going to learn in the next semester. I found this book much more fun than my monotonous textbooks.
If anyone wants to get a good grasp of the fundamentals of A&DS this is a great starting point, and then move on to MOOCs by famous universities. MIT's A&DS was the one that I used. Dunno if it's still available on YouTube because I remember that OCW courses were removed or something?

Link

u/hooj · 28 points · r/explainlikeimfive

The whole subject is a bit too complicated and a bit too deep for a short ELI5, but I'll give a stab at the gist of it.

The reason why computers work (at least in the vein of your question) is very similar to the reason why we have language -- written, spoken, etc.

What you're reading right at this very moment is a complex system (language) simplified to symbols on the screen. The very fact that you can read these words and attain meaning from them means that each sentence, each word, and each letter represent a sort of code that you can understand.

If we take an apple for example, there are many other ways to say that in different languages. Manzana. Pomme. Apfel. And so on. Codes -- some symbol maps to some concept.

In the context of computers, well, they can only "understand" binary. Ones and zeros. On and off. Well, that's okay, because we can map those ones and zeros to codes that we (humans) care about. Like 101010111 could represent "apple" if we wanted it to.

So we build these physical circuits that either have power or don't (on and off) and we can abstract that to 1's (power flowing through that circuit) and 0's (no power flowing through it). This way, we can build physical chips that give us basic building blocks (basic instructions it can do) that we can leverage in order to ultimately make programs, display stuff, play sounds, etc. And the way we communicate that to the computer is via the language it can understand, binary.

In other words, in a basic sense, we can pass the processor binary, and it should be able to interpret that as a command. The length of the binary, and what it should contain, can vary from chip to chip. But let's say our basic chip can do basic math. We might pass it a binary number: 0001001000110100, but it might be able to slice it up as 0001 | 0010 | 0011 | 0100 -- so the first four, 0001, might map to an "add" command. The next four, 0010, might map to a memory location that holds a number. The third group of four might be the number to add it to. The last group might be where to put it. Using variables, it might look like:

c = a + b, where "c" is 0100, "a" is 0010, "b" is 0011, and the "+" (addition operator) is 0001.
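
To make that slicing concrete, here's a toy Python sketch (purely illustrative; the 4-bit fields, the "add" opcode, and the memory locations are the made-up ones from the paragraph above, treating both operands as memory locations):

```python
def decode(word: str):
    """Slice a 16-bit instruction word into four 4-bit fields."""
    assert len(word) == 16
    return [word[i:i + 4] for i in range(0, 16, 4)]

memory = {0b0010: 5, 0b0011: 7}   # toy memory: location -> value

opcode, a, b, c = decode("0001001000110100")
if int(opcode, 2) == 0b0001:      # 0001 = "add" on this imaginary chip
    memory[int(c, 2)] = memory[int(a, 2)] + memory[int(b, 2)]

print(memory[0b0100])             # 12, i.e. c = a + b
```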

From there, those basic instructions, we can layer abstractions. If I tell you to take out the trash, that's a pretty basic statement. If I were to detail all the steps needed to do that, it would get a lot longer -- take the lid off the can, pull the bag up, tie the bag, go to the big garbage can, open the lid, put the trash in. Right? Well, if I tell you to take out the trash, it rolls up all those sub actions needed to do the task into one simple command.

In programming, it's not all that different. We layer abstractions to a point where we can call immense functionality with relatively little code. Some of that code might control the video signal being sent to the screen. Some of that code might control the logic behind an app or a game. All of the code though, is getting turned into 1's and 0's and processed by your cpu in order to make the computer do what is asked.

If you want to learn more, I highly recommend Code by Charles Petzold for a much more in depth but still layman friendly explanation of all this.

u/cholland89 · 27 points · r/compsci

I just finished reading Code: The Hidden Language of Computer Hardware and Software and will state unequivocally that this book is the most satisfying read I've experienced. It starts with flashlights blinking through windows, moves to Morse code, introduces electrical relays and demonstrates how they can be connected to form logic gates, then uses those gates to construct an ALU/counter/RAM and multiplexors. It goes on to describe the development of an assembly language and the utilization of input and output devices.

This book can be described as a knowledge hose flooding the gaps in my understanding of computer hardware/software at an extremely enjoyable pace. It may help satisfy your interest in the concepts and technology that led to modern computers. Check out the reviews for more info.

If you haven't already studied logic gates in depth in your formal education, I would suggest using a logic simulator to actually build the combinational logic structures. I now feel very comfortable with logic gates and have a strong understanding of their application in computing from my time spent building the described logic.

I went through the book very slowly, rereading chapters and sections until I felt confident that I understood the content. I can not recommend this book enough.

After reading CODE, I have been working through The Elements of Computing Systems: Building a Modern Computer from First Principles. If you are looking to gain a better understanding of the functions of hardware components, this is the book to read. This book's companion site http://www.nand2tetris.org has the first chapters free along with the entire open source software suite that is used in the book's projects. Starting with Nand gates, you will build in a hardware description language each logic gate and every part of a computing system, up to a modern high-level language: you can program custom software of your own design, compile it with a compiler you designed into an assembly language you specified, which is turned into binary that runs in a processor you built from Nand gates and flip-flops. This book was very challenging before reading CODE; now I feel like I'm simply applying everything I learned in CODE with even more detail. For somebody that hasn't attended college for computing yet, this has been a life-changing experience.

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319


http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686

u/stevenxdavis · 27 points · r/compsci

I just started reading CODE by Charles Petzold and I've really enjoyed it so far. It's an accessible take on the basics of computer science that doesn't just focus on computers themselves.

u/bpikmin · 26 points · r/programming

I highly recommend the book Code. I read it in middle school and it was absolutely fascinating. Pretty short too.

u/devilbunny · 23 points · r/explainlikeimfive

That's a pretty interesting course. I've read the book and done exercises up until you actually have to start building the CPU.

However, I would strongly recommend reading Charles Petzold's CODE first. It's a little less technical, but explains the general concepts much better than nand2tetris.

u/abstractifier · 22 points · r/learnprogramming

I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit. I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone.

Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know.

u/nsfmc · 21 points · r/programming

he lost me when he said this about GEB
> If you only read one book on this list, it should be this one.

seriously? it's not that i don't appreciate the sentiment, but things douglas hofstadter thinks are neat is no substitute for any single book on the rest of the list unless you

  • have no other way to explain at cocktail parties what you studied at school
  • try to sound smart at cocktail parties by talking about things in GEB without actually referencing the book.

for my part, i'd add sipser's computation book and why not throw in some ken thompson in there as an amuse bouche?
u/neutronfish · 20 points · r/cscareerquestions

One book that helped me a lot while starting out and which I highly recommend to any new student of computer science is Code: The Hidden Language of Computer Hardware and Software by Charles Petzold, which starts out as a general interest book about the history of computing and then very quickly ratchets up into how modern computers, compilers, operating systems, and hardware drivers are built. You basically have to learn some discrete math and assembly language just to follow along, and by the end you have a really good idea of what happens under the hood when you run your programs and why.

u/shivasprogeny · 20 points · r/learnprogramming

How deep do you want to go? Code: The Hidden Language of Computer Hardware and Software goes all the way from binary to computer code.

If you don't really care about the hardware, you might start dabbling in assembly on a Raspberry Pi.

u/KernlPanik · 20 points · r/learnprogramming

I'm a ~10 year sysadmin that has decided to rebuild my software dev skills that I haven't used since college. Here's what I did to reawaken that part of my brain:

  1. Harvard's CS50. I figured an entry level college course would be "beneath me" but it was a great experience and I learned a surprising amount. It's very entertaining as well, which made the "simple" parts fun to do.

  2. Read CODE by Charles Petzold. Great insight into the nuts and bolts of how computers work. Read through it on my lunch breaks while taking CS50 in the evenings.

  3. Read and do the problems in C Primer Plus. This is a great book for learning how to write in C, which is the basis for all modern languages and is still widely used today. Great starter book for anyone who wants to learn to program.

    3.5) After going through the last chapters of C Primer Plus, I realized that some of my math skills were not up to par, so I took this MOOC from MIT to supplement that. No idea if that's something you need.

  4. Here comes the fun one: The Structure and Interpretation of Computer Programs, aka The Wizard Book. This book is more about how to design software in general, and it is pretty difficult. That being said, if you can get through it then you have the chops to do this professionally.
u/yggdrasilly · 19 points · r/compsci
u/christianitie · 18 points · r/math

Without knowing much about you, I can't tell how much you know about actual math, so apologies if it sounds like I'm talking down to you:

When you get further into mathematics, you'll find it's less and less about doing calculations and more about proving things, and you'll find that the two are actually quite different. One may enjoy both, neither, or one but not the other. I'd say if you want to find out what higher level math is like, try finding a very basic book that involves a lot of writing proofs.

This one is aimed at high schoolers and I've heard good things about it, but never used it myself.

This one I have read (well, an earlier edition anyway) and think is a phenomenal way to get acquainted with higher math. You may protest that this is a computer science book, but I assure you, it has much more to do with higher math than any calculus text. Pure computer science essentially is mathematics.

Of course, you are free to dive into whatever subject interests you most. I picked these two because they're intended as introductions to higher math. Keep in mind though, most of us struggle at first with proofwriting, even with so-called "gentle" introductions.

One last thing: don't think of your ability in terms of your age. It's great to learn young, but there's nothing wrong with people learning later on. Thinking of it as a race could lead to arrogance or, on the other side of the spectrum, unwarranted disappointment in yourself when life gets in the way. We want to enjoy the journey, not worry about whether we're going fast enough.

Best of luck!

u/JustBesideTheWindow · 18 points · r/HowToHack
u/totemcatcher · 18 points · r/linux
  • CODE: The Hidden Language of Computer Hardware and Software by Charles Petzold

    A ground-up approach to understanding digital processing and transmission in a broad sense. I only recommend this book if you are looking for an intrinsic understanding of computing rather than merely a handle on using a particular programming language or operating system. By the end of the book you should have a handle on actually building your own computer; it's also an excellent "first book" for anyone interested in computing.
u/myrrlyn · 18 points · r/learnprogramming

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

This book is an excellent primer for a bottom-up look into how computers as machines function.

https://www.amazon.com/gp/aw/d/0123944244/ref=ya_aw_od_pi

This is my textbook from the class where we built a CPU. I greatly enjoy it, and it also starts at the bottom and works up excellently.

For OS development, I am following Philipp Oppermann's excellent blog series on writing a simple OS in Rust, at http://os.phil-opp.com/

And as always Wikipedia walks and Reddit meanders fill in the gaps lol.

u/Spasnof · 17 points · r/learnprogramming

Awesome book, Code really helps you understand from a bottom-up perspective. Super approachable without a CS background, and you don't need a computer in front of you to appreciate it. Highly recommended.

u/Afro-Ninja · 17 points · r/explainlikeimfive

It doesn't "know." Any logical operation (especially basic math calculations) can be broken down into binary digits, and a single binary digit (bit) can be represented as the presence or absence of electricity.

It's almost like if you were to build a sequence of pipes and valves and pour water into the opening, the water would end up flowing through the same way each time. The pipes don't "know" where the water goes; it just happens.

A computer does the same thing but on a tiny scale, with tiny electric pulses travelling through sequences of thousands of gates all connected to each other. Imagine that the buttons you hit on a calculator slightly change how the valves open and close (or which opening to dump the water into). You hit enter, the water is poured, and the result shows on screen.

fair warning: I am not a hardware guy so this explanation is probably not 100% accurate.
If you have more interest in the subject I HIGHLY recommend reading this book: http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319
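
For a concrete taste of the pipes-and-valves picture above, here's a small Python sketch (an added illustration, not taken from the book) that adds two numbers using nothing but simulated logic gates, one column of bits at a time:

```python
def full_adder(a: int, b: int, carry: int):
    """One column of binary addition, built from AND/OR/XOR gates."""
    s = (a ^ b) ^ carry
    carry_out = (a & b) | (carry & (a ^ b))
    return s, carry_out

def add(x: int, y: int, bits: int = 8):
    """Ripple-carry addition: the carry flows from column to column."""
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add(19, 23))  # 42, computed gate by gate
```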

u/mohabaks · 17 points · r/unixporn

Thanks ;). I'm not so skilled on that and my advice might be misleading, though I do have a background in CS. This would be my suggestion for someone beginning.

u/Rinnve · 16 points · r/learnpython

You should read this book. The best explanation of how computers work I know of.

u/chub79 · 15 points · r/algorithms

The Algorithm Design Manual by Skiena helped me a lot.

I was also curious about this one.

Also, this site may help :)

u/jhanschoo · 14 points · r/compsci

Google hasn't been helpful because no such algorithm exists. Check out Rice's Theorem for the impossibility.

edit:

Let S be the set of languages that you can reduce SAT to in polynomial time.

SAT is clearly in S, and we know some machine recognizes it.

The empty language is not in S, because a reduction must map satisfiable formulas to strings in the target language and the empty language has no strings at all (this holds even if P=NP, so that SAT is P-complete), and we know some machine recognizes it.

By Rice's Theorem, no machine decides, when given a machine as input, whether that machine recognizes a language in S.

(we assume that the "any custom problem" input is as a machine encoding)

edit2:

I see that you ask a lot of questions about computational complexity, but do not have a good foundation. Many things you propose or ask already have known impossibility results. May I suggest you have a look at Sipser (https://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X)? That will give you a better understanding of computability and complexity, and of the feedback you're getting.

u/goodbyegalaxy · 14 points · r/hardware

Code: The Hidden Language of Computer Hardware and Software

As the title implies, it's not just about hardware, it goes into how software is written for hardware as well. But it's a really cool book, takes you from the very basics of circuitry (a battery, a light bulb, and wire) in the first chapter, and building only on things taught in the book gets you to a fully working computer.

u/twopoint718 · 14 points · r/programming

My favorite example of a mind-bogglingly well-staffed company was "Thinking Machines Corporation":

(Taken from Wikipedia, not exhaustive!):

  • Greg Papadopoulos (Sun CTO)
  • Guy L Steele, Jr. (Scheme designer)
  • Brewster Kahle (Internet Archive, Founder)
  • Marvin Minsky (AI pioneer)
  • Doug Lenat (AI pioneer, Cyc project)
  • Stephen Wolfram (Mathematica creator)
  • Eric Lander (Human Genome Project, President Obama's council of science and technology advisors, Co-chair)
  • Richard Feynman (Nobel Prize, Physics, Manhattan Project)
  • Alan Harshman (High-performance computing, AI)
  • Tsutomu Shimomura (security expert, notable for his involvement in the arrest of Kevin Mitnick)

I found this when I was reading about Feynman one time. This isn't meant to disparage Google at all; it's an amazing list though.

EDIT: I forgot to mention what I started out writing. Feynman produced an excellent book, The Feynman Lectures on Computation, which, if you're familiar with the physics version of the same, is an incredibly lucid, short, and informative book. I think this would make an excellent textbook for a course in computer architecture.

u/expedient · 13 points · r/programming

Not a video, but Code: The Hidden Language of Computer Hardware and Software by Charles Petzold is really great.

u/maruahm · 13 points · r/compsci

Always liked Introduction to Automata Theory, Languages, and Computation by Hopcroft and Ullman as an intro text. Undergraduate-level but good treatment of TCS.

If that's too basic, I recommend Theory of Computation by Kozen. It's roughly 1st-year graduate level, intended for those already with some background.

If that's too basic, for a research-level survey of TCS, take a look at Wigderson's Mathematics and Computation.

u/MirrorLake · 12 points · r/learnprogramming

"Code" by Charles Petzold, if anyone wants the link.

u/mooshoes · 11 points · r/IWantToLearn

I'd recommend you start with the book "Code", which handles just this progression: http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

From there, investigate operating system development. The Minix OS is very well documented.

u/entropicone · 11 points · r/compsci

Riding on your top post coattails...

The Elements of Computing Systems and Code by Charles Petzold are exactly what you want.

Code goes through number systems, basic information theory, circuits (from gates on up), memory, machine code and programming languages, all with accessible diagrams and explanations.

TECS has you build an actual working computer from the ground up.

u/Shadowsoal · 11 points · r/compsci

In the theoretical field of complexity...

The 1979 version of Introduction to Automata Theory, Languages, and Computation by Hopcroft & Ullman is fantastic and used to be the canonical book on theoretical computer science. Unfortunately the newer versions are too dumbed down, but the old version is still worth it! These days Introduction to the Theory of Computation by Sipser is considered to be the canonical theoretical computer science text. It's also good, and a better "introduction" than H&U. That said, I prefer H&U and recommend it to anyone who's interested in more than getting through their complexity class and forgetting everything.

In the theoretical field of algorithms...

Introduction to Algorithms by Cormen, Leiserson, Rivest and Stein is dynamite, pretty much everything you need to know. Unfortunately it's a bit long-winded and is not very instructive. For a more instructive take on algorithms, take a look at Algorithms by Dasgupta, Papadimitriou and Vazirani.

u/toastisme · 11 points · r/IWantToLearn

A similar question was posted on Quora not long ago, and the main recommendation was Code by Charles Petzold:

http://www.amazon.co.uk/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1395088237&sr=8-1&keywords=code+charles+petzold

Having subsequently read the book, I think it's a fantastic introduction: it goes through everything from the importance of binary code and applying Boolean logic to circuits, to the details of the inner workings of the first microprocessors, all in an interesting and engaging way.

u/punctured-torus · 11 points · r/compsci
u/enteleform · 11 points · r/compsci

Check out:
Grokking Algorithms: An illustrated guide for programmers and other curious people
 
I'm also pretty rusty at math right now, and have been getting by with a try-different-things-until-it-works approach.  Even for the types of problems I've become efficient at solving, in many cases I don't know the actual terminology, so it makes it difficult to expand upon concepts or communicate them with others.  I'd like to get to a point where I can mentally reason about processes & formulas without having to execute them in order to see the results, and I feel like the first step to get there is to get reacquainted with terminology & foundational concepts.  Here are Some Resources I've queued up to work through for that purpose.

u/Pandasmical · 11 points · r/computerscience

I enjoyed this one!
Code: The Hidden Language of Computer Hardware and Software

Here is someone else's detailed review of it:

"Charles Petzold a does an outstanding job of explaining the basic workings of a computer. His story begins with a description of various ways of coding information including Braille, Morse code, and binary code. He then describes the development of hardware beginning with a description of the development of telegraph and relays. This leads into the development of transistors and logic gates and switches. Boolean logic is described and numerous electrical circuits are diagramed showing the electrical implementation of Boolean logic. The book describes circuits to add and subtract binary numbers. The development of hexadecimal code is described. Memory circuits are assembled by stringing logic gates together. Two basic microprocessors are described - the Intel 8080 and the Motorola 6800. Machine language, assembly language, and some higher level software languages are covered. There is a chapter on operating systems. This book provides a very nice historical perspective on the development of computers. It is entertaining and only rarely bogs down in technical detail."

u/Grazfather · 10 points · r/engineering

Anyone who likes this stuff should really read Code. The author goes from tin-can phones to building a computer, in language anyone could follow.

u/remembertosmilebot · 10 points · r/learnprogramming

Did you know Amazon will donate a portion of every purchase if you shop by going to smile.amazon.com instead? Over $50,000,000 has been raised for charity - all you need to do is change the URL!

Here are your smile-ified links:

https://smile.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

---

i'm a friendly bot

u/krunk7 · 10 points · r/programming

Absolutely.

Check out The Elements of Statistical Learning and Introduction to Machine Learning.

edit: those books are about practical applications of what we've learned to date from the neural network style of pattern classification. So it's not about modeling an actual biological neuron. For modeling of the biology, it's been a while since I futzed with that. But when I wrote a paper on modeling synaptic firing, Polymer Solutions: An Introduction to Physical Properties was the book for that class. Damned if I remember if that book has the details I needed or if I had to use auxiliary materials though.

u/sanedave · 10 points · r/learnprogramming

Three books I have been using:

The Definitive Guide to How Computers Do Math

Web page here: http://www.diycalculator.com/

Assembly Language Step-by-Step: Programming with Linux

Hacking: The Art of Exploitation, 2nd Edition

The first uses a virtual machine running on Windows, with 5 registers, 65K of virtual memory, a debugger, and will give you a good basic understanding of what is going on. The second book uses Intel X86 on Linux, and gives a solid foundation of the most used instructions. The third book is just good.

Other favorites of mine include "The Art of Debugging" by Norm Matloff (google for his excellent web page) and "Professional Assembly Language" by Richard Blum.

Have fun!

u/Monguce · 10 points · r/askscience

This is a really great book about the topic. It's much simpler than you might think but kind of tricky to explain unless you know a bit of background. The book costs less than a tenner and will give you a whole different appreciation of how computers work. Well worth a read even if it starts out seeming rather simple.

https://www.amazon.co.uk/Code-Language-Computer-Hardware-Software/dp/0735611319

u/[deleted] · 9 points · r/programming

You need to show that you know your stuff. Just because you're doing something more applied like Network Security in grad school doesn't mean that you won't have a base level of knowledge you're expected to understand. In that case, you need to learn some basic stuff a CS student at a good school would know. I'm not "dumbing down" anything on my list here, so if it seems hard, don't get discouraged. I'm just trying to cut the bullshit and help you. (:

  • Redo your introduction to Computer Science. If you finish this, picking up a new language is cake.

  • Discrete Mathematics, A.K.A. "Math for Computer Scientists". This is the standard text for this, but this is pretty good for a cheap book.

  • Algorithms

  • Compilers

  • Operating Systems

  • Networking

  • For basic CS theory, "Introduction to Theory of Computation by Michael Sipser" is what I used to recommend, but Amazon doesn't seem to have a sanely priced copy. Either buy that used, or get the classic "Cinderella Book". Get an older edition if you can!

Again, don't be discouraged, but you'll need to work hard to catch up. If you were trying for something like mathematics or physics while doing this, I'd call you batshit insane. You may be able to pull it off with CS though (at least for what you want to study). Make no mistake: getting through all these books I posted on your own is hard. Even if you do, it might be the case that no one will admit you even then! But if you do it, and you can retain and flaunt your knowledge to a sympathetic professor, you might be surprised.

Best of luck, and post if you need more clarification. As a side note, follow along here as well.

Netsec people feel free to give suggestions as well.
u/IamAlbertHofmann · 9 points · r/learnprogramming

here you go

It's the 'hidden language', not 'secret'. Sorry about that.

u/beaverjacket · 9 points · r/AskReddit

This book is a very good explanation of how computers work. It starts with explaining electromechanical switches, and how you can turn a couple switches into a logic gate. Then, it shows how you can put logic gates together to do arithmetic. It goes on like that until you reach programmable computers.

u/dhobsd · 9 points · r/askscience

Hooray, a question I can answer!

One of the problems here is that the question is worded backwards. Binary doesn't combine to give us programming languages. So the answer to your question is somewhat to the contrary: programming languages were invented to ease the tedium of interfacing using binary codes. (Though it was still arguably tedious to work on e.g. punched cards.) Early interfaces to programming machines in binary took the form of "front panels" with switches, where a user would program one or several instructions at a time (depending on the complexity of the machine and the front panel interface), using the switches to signify the actual binary representation for the processor functions they desired to write.

Understanding how this works requires a deeper understanding of processors and computer design. I will only give a very high level overview of this (and others have discussed it briefly), but you can find a much more layperson-accessible explanation in the wonderful book Code: The Hidden Language of Computer Hardware and Software. This book explains Boolean logic, logic gates, arithmetic logic units (ALUs) and more, in a very accessible way.

Basically, logic gates can be combined in a number of ways to create different "components" of a computer, but in the field of programming languages, we're really talking about the CPU, which allows us to run code to interface with the other components in the system. Each implementation of a processor has a different set of instructions, known as its machine code. This code, at its most basic level, is a series of "on" or "off" electrical events (in reality, it is not "on" and "off" but high and low voltages). Thus, different combinations of voltages instruct a CPU to do different things, depending on its implementation. This is why some of the earliest computers had switch-interfaces on the front panel: you were directly controlling the flow of electricity into memory, and then telling the processor to start executing those codes by "reading" from the memory.

It's not hard to see how programming like this would be tedious. One could easily write a book to configure a machine to solve a simple problem, and someone reading that book could easily input the code improperly.

So eventually as interfacing with the machine became easier, we got other ways of programming them. What is commonly referred to as "assembly language" or "assembler" is a processor-specific language that contains mnemonics for every binary sequence the processor can execute. In an assembly language, there is a 1:1 correlation between what is coded, and what the processor actually executes. This was far easier than programming with flip-switches (or even by writing the binary code by hand), because it is much easier for a human to remember mnemonics and word-like constructs than it is to associate numbers with these concepts.

Still, programming in assembly languages can be difficult. You have to know a lot about the processor. You need to know what side-effects a particular instruction has. You don't have easy access to constructs like loops. You can't easily work with complex datatypes that are simply explained in other languages -- you are working directly with the processor and the attached memory. So other languages have been invented to make this easier. One of the most famous of these languages, a language called "C," presents a very small core language -- so it is relatively easy to learn -- but allows you to express concepts that are quite tedious to express in assembler. As time has gone on, computers have obviously become much faster, and we've created and embraced many languages that further and further abstract any knowledge about the hardware they are running on. Indeed, many modern languages are not compiled to machine code, but instead are interpreted by a compiled binary.

The trend here tends to be making it easier for people to come into the field and get things done fast. Early programming was hard, tedious. Programming today can be very simple, fun and rewarding. But these languages didn't spring out of binary code: they were developed specifically to avoid it.

TL;DR: People keep inventing programming languages because they think programming certain things in other ones is too hard.

u/mcscottmc · 9 points · r/compsci

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=mp_s_a_1_1

This book explains how computers work from first principles (electricity and switches on up). Very easy to read. I am surprised it hasn’t been mentioned yet.

u/shwestrick · 9 points · r/compsci

Sipser's Introduction to the Theory of Computation is an excellent book with three major parts:

  1. Automata and Languages
  2. Computability Theory
  3. Complexity Theory

Each part builds on the previous. I highly recommend working through it.
u/galahadredgrave · 9 points · r/ArtificialInteligence

I'm just beginning this journey myself, so judge what I say accordingly.

Artificial Intelligence: A Modern Approach seems to be the most popular textbook.

This article has some seemingly good advice, though it seems to be geared more toward Machine Learning (ML) than AI in general.

I think you'll want to learn a programming language. The above article recommends Python as it is well suited to ML.

There is (was?) a free online course on ML from Stanford by Andrew Ng. I started to take it a couple years ago but never finished. It is very accessible. The lectures appear to be on YouTube.

Grokking Algorithms is a highly regarded book on algorithms.

Make a free Amazon Web Services account and start playing with Sagemaker.

There really is no well defined path to learning AI, in my opinion. It is a highly interdisciplinary endeavor that will require you to be a self-starting autodidact. It's very exciting though. There is still plenty of new ground to be broken. Some might argue it is difficult for the little guy to compete with big labs at the big tech companies with their ungodly amounts of data to feed their AI, but I am optimistic.

u/weasler · 9 points · r/compsci

Code is an absolute classic.

u/NotAGeologist · 9 points · r/computerscience
u/edwilli222 · 9 points · r/AskProgramming

This is kind of a weird one but I’d suggest Code. Very non-technical, no programming, but cool history and fundamentals.
Code: The Hidden Language of Computer Hardware and Software - https://www.amazon.com/dp/0735611319/ref=cm_sw_r_cp_api_i_IiH2DbWSNWHMW

u/McFuckyeah · 9 points · r/askscience

This is the book you want to read. It walks you through every bit of how a CPU works, in an incredibly approachable way. (If you can understand a light switch, you can understand this book.)

u/akmark · 9 points · r/programming

I'll recommend Code, even though it isn't specifically theoretical. However, it does go over how code (semaphore, Morse code) evolved over time. From someone who does program, this is about as 'human' a book as they come, which could fit exactly what you are looking for.

u/sleepingsquirrel · 9 points · r/ECE
u/Daganar · 8 points · r/programming

For anyone interested in this kinda stuff I would really recommend "Code: The Hidden Language of Computer Hardware and Software"
https://www.amazon.co.uk/Code-Language-Computer-Hardware-Software/dp/0735611319

u/MrPhantomZz · 8 points · r/6thForm

Could read a book based on your interests in computer science, e.g. AI, Machine Learning, data science etc.

A good book that I recently picked up was
[Code: The Hidden Language of Computer Hardware and Software by Charles Petzold](https://www.amazon.co.uk/dp/0735611319/ref=cm_sw_r_cp_api_qB4TBbG90ANHN)

u/prego_no_pao · 8 points · r/portugal

I just finished reading Code (1999). It's a good introduction to how a computer works, based on how computers evolved through history.

u/fiskfisk · 8 points · r/compsci

Code: The Hidden Language of Computer Hardware and Software from Charles Petzold does just that, starting from the simplest form and going through all the different steps we took to get to where we are today. Well worth a read!

u/codeificus · 8 points · r/programming

The 86 stands for the instruction set of the CPU. Basically, every chip designed in the world accepts input and output, but in different ways (different numbers of connections, ordering). All of those chips have more or less backwards compatibility with regard to that, so it makes it easier for others to develop around that.

So there is a meaning conveyed, though it probably isn't important to you if you aren't developing hardware or writing assembly.

I strongly recommend Code by Charles Petzold which explains the origins of these chipsets. Basically Intel put out the 8080 in 1974 which was an 8-bit processor, then the 8086 in 1978 was a 16-bit processor, so they just ran with the number scheme (6 for 16 bit). The "80" from 8080 probably came from IBM punchcards which were used for the US census (since the 1920s!), which is actually how IBM started, basically as the child of Herman Hollerith who built automated tabulating machines in the late 19th century. Also this is to blame for the 80-character terminal convention. Blame IBM.

u/lightforce3 · 8 points · r/tech

What you seem to be asking is "how do computers work?" At any rate, the interaction of hardware and software is fundamental to any computer system, whether it's your fitness band or your cell phone or a supercomputer or the computer in your car engine or The Next Big Thing.

How that works is a really, really big question. Rather than attempt to answer it, I'll suggest you check out the book Code by Charles Petzold. It explains how computer hardware and software work, starting with basic electrical circuits and building up layer by layer until you're running an operating system and application software. That might seem like a lot to cover, but Code does it simply and cleanly, in a way that just about anybody can digest and understand.

u/jeykottalam · 8 points · r/compsci

Introduction to Algorithms by CLRS

TAOCP is a waste of time and money; it's more for adorning your bookshelf than for actually reading. Pretty much anyone who suggests TAOCP and is less than 55 years old is just parroting Standard Wisdom™.

Gödel, Escher, Bach is a nice book, but it's not as intellectually deep in today's world as it was when first published; a lot of the memes in GEB have been thoroughly absorbed into nerd culture at this point, and the book should be enjoyed more as a work of art than expected to be particularly informative (IMO).

If you're interested in compilers, I recommend Engineering a Compiler by Cooper & Torczon. Same thing as TAOCP applies to people who suggest the Dragon Book. The Dragon Book is still good, but it focuses too much on parser generators and doesn't really cover enough of the other modern good stuff. (Yes, even the new edition.)

As far as real programming goes, K&R's The C Programming Language is still unmatched for its quality of exposition and brevity, but these days I'd strongly suggest picking up some Python or something before diving into C. And as a practical matter, I'd suggest learning some C++ from Koenig & Moo's Accelerated C++ before learning straight C.

Sipser's Introduction to the Theory of Computation is a good theory book, but I'd really suggest getting CLRS before Sipser. CLRS is way more interesting IMHO.

u/Zaemz · 8 points · r/programming

This is awesome! I've been slowly getting more and more interested in hardware, and this is something I would absolutely love to do. I just don't know where to start.

I've been reading a couple of books about learning lower level stuff, and planned on working my way up.

I'd really like to get out of webdev and into low-level programming, or even hardware design and implementation. There's sooooo goddamn much to learn, that I doubt I'll be ready without getting a BS in Comp. Engineering, and maybe a master's as well.

(I'm absolutely a beginner, and if anyone is interested in the books I've been reading, these are they:

  1. Code by Charles Petzold

  2. The Elements of Computing Systems: Building a Modern Computer from First Principles by Noam Nisan and Shimon Schocken

  3. Computers as Components by Marilyn Wolf)
u/azimuth · 8 points · r/compsci

Also Code: The Hidden Language of Computer Hardware and Software. It literally starts out with telegraphs, and shows how, if you are sufficiently crazy, they can be assembled into a working computer. Then it shows how you can write software for your telegraph-relay-cpu. A great read.

u/Sigb · 8 points · r/emulation

Hyperthreading is also a way to utilize each core more effectively. Not all programs can run as many instructions simultaneously as emulators can, so hyperthreading is there to let you use two different threads on one core. It goes without saying that it's not as effective as having more cores, but it helps a lot for mid-range laptops doing typical user workloads.

State switching is also a feature that helps with typical workloads. The processor changes (grossly simplified) between higher performance, lower performance, and everything in between, more quickly and effectively. So when games suddenly give the CPU more tasks to do, the CPU will be quicker to adjust and give the necessary "power". This improves times for things like wake from sleep and opening programs, and reduces power usage and heat buildup.

There is also a large number of minor things like branch prediction, and things we don't hear about as much (industry trade secrets), that do add up to very tangible improvements. Besides me not knowing much about them, many of them are by nature kept secret, since Intel has a lot of competition.

My reason for writing just a few words about these things was so you can understand that there is a plethora of ways CPUs can improve. And they do. And it might be impossible to get the typical consumer to do more than compare clock speed (just getting them to factor in core count is hard). In short, the topic is much less simple than it looks at first glance.

If you want to learn more about processors, starting with a simple one can help a huge amount. A guy called J. Clark Scott designed a fully working theoretical 8-bit CPU architecture, similar to actual 8-bit CPUs, just so he could explain in detail how CPUs "know" stuff. He starts off by showing you how you can build logic gates using transistors, then how you can build all of the components out of logic gates. And the book is very easy to follow. amazon link

u/bjzaba · 7 points · r/ProgrammingLanguages

I thought the extreme use of comic sans on this site was pretty amusing. :D

The Lambda Cube is pretty cool - http://www.rbjones.com/rbjpub/logic/cl/tlc001.htm - I'm not sure the page does the best job at explaining it though. The first time I saw a good explanation was in Type Theory and Formal Proof: An Introduction.

u/dogewatch · 7 points · r/learnprogramming

The Grokking Algorithms Book is good for beginners or for those who want a fun refresher. Obviously not too much depth but teaches it in a nice illustrated way. Code is also available on github.

u/InvalidGuest · 7 points · r/computerscience

I'd recommend this one: https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

It's very enjoyable to read and definitely increases one's understanding of how computers conceptually function. Petzold also makes it very easy to understand what he is saying in his explanations.

u/SidewaysGate · 7 points · r/compsci

When I was a young teenager I read (most of) a book called Code.

This was absolutely fantastic. It didn't just talk about programming or about software, it explained the concept of a code and the work that we do from the ground up. It literally started from light bulbs and switches and went to microprocessors and programming languages. This is the book that helped me bridge the software-hardware cognitive gap. Eventually it got to be too much for me, but in my defense I was 12-13 at the time. Even so, the parts that I did get through stuck with me.

I'm going to go back and reread it.

The book isn't going to cover design patterns or microservices, but IMO it's the best thing to give computer scientists context on what we're doing here from an engineering perspective (with Sipser as the book for the mathematical perspective).

u/unknowngp · 7 points · r/AskComputerScience

>I want to be able to understand how computers work

Code: The Hidden Language of Computer Hardware and Software

I was on the search for the same thing as you a couple of weeks ago, and people recommended the book above. I just recently started reading it, but hopefully someone who has read it can chime in with their opinion.

u/deiphiz · 7 points · r/learnprogramming

Okay, I'm gonna plug this in here. I hope this doesn't get buried because when I saw someone asking about low level stuff here, I couldn't help but make this recommendation.

Anyone who wants to learn more about low level computer stuff such as assembly code should read the book Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. I've been reading it recently and I'm really glad I picked it up. It really delves into how a computer works, like how languages relate to the circuits on a board.

So yeah, /u/DEVi4TION, I recommend picking this up if you wanna know about more stuff like this. Then maybe try your hand at actual 6502 programming later :P

u/groundshop · 7 points · r/math

It's an introduction to some of the major concepts in Computer Science theory. If you have no background in CS, and a bit of background in math (mid-undergraduate level) it's an enjoyable way to get exposed to a few concepts from CS theory.

If you're really looking to put your nose to the grindstone and learn CS theory, there are better books though. I learned from M. Sipser's Intro to Comp. Theory.

P.S. I did walk away from it with a novice appreciation for Bach.

u/zoombikini · 6 points · r/programming

Ah...Sipser.

u/llimllib · 6 points · r/programming
u/boojit · 6 points · r/learnprogramming

Having never read a Deitel book myself, I'm honestly curious why 65x finds them so compelling.

I would point out that based on some of 65x's previous posts, he or she is not an expert. This isn't a criticism at all, but people should take 65x's endorsement as coming from a beginning programmer and not a teacher or professional.

(Full disclosure: I'm a professional software developer with over 20 years in the industry.)

EDIT: As far as recommendations, if you can give me a general area of interest and it's in my wheelhouse, I'd be happy to oblige. I often recommend this book, although it's not really germane to this discussion.

u/nobody102 · 6 points · r/AskElectronics

There are plenty of tutorials on the net - http://computer.howstuffworks.com/pc.htm I would also recommend this book - https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/bhrgunatha · 6 points · r/AskComputerScience

A famous artefact of early computing is the boot-strapping process, where the goal is a self-hosting compiler - which lets you write the compiler for a new language in the new language. However, to get to that point a lot of earlier innovations were needed.

Take all of this with a pinch of salt - the order and the details may be wildly inaccurate, but the overall ideas, viewed from afar, give an idea of how we got to the point that we can choose our own language to write a compiler for another language.

To start with, raw binary values had to be set in order to define and run a program. Those raw binary values represent instructions that tell the hardware what to do and data that the program needed to operate. This is now usually referred to as machine code.

At first you would enter values into computer storage using switches.

Since that's so tedious and error prone, punched cards were developed along with the necessary hardware to read them, so you could represent lots of values that could be read together. They had their own problems but it was a step forward from switches.

After some time, symbolic instructions were defined as a shortcut for several machine code instructions - now usually called assembly language. For example, "put the value 8 into memory location 58" could be written as ST 8, [58]. This might take three machine code values: one represents the store instruction, one the value 8, and one the location 58. Since assembly language could now be written down, it was easier to understand what the computer is being instructed to do. Naturally someone had the bright idea to make the translation automatic, so that, for example, you could write down the instructions by hand, create punched cards representing those instructions, convert them to machine code and then run the program. The conversion from the symbolic instructions to machine code was handled by a program called an assembler - people still write programs in assembly code and use assemblers today.
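
As a toy illustration of what such an assembler does, here's a minimal Python sketch (the ST mnemonic and the three-value encoding are just the hypothetical example above, not a real instruction set):

```python
# A made-up two-instruction ISA, mirroring the ST 8, [58] example
OPCODES = {"ST": 0x01, "LD": 0x02}

def assemble(line):
    """Translate one 'ST 8, [58]'-style line into three machine-code values."""
    mnemonic, operands = line.split(maxsplit=1)
    value, address = operands.split(",")
    return [OPCODES[mnemonic],            # the store instruction itself
            int(value),                   # the value 8
            int(address.strip(" []"))]    # the location 58

print(assemble("ST 8, [58]"))  # [1, 8, 58]
```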

The next logical step is to make the symbolic instructions more useful: less aimed at the mundane physical processes that tell the computer exactly how to operate, and more friendly for people to represent ideas. This is really the birth of programming languages. Since programming languages allowed you to do more abstract things symbolically - like saving the current instruction's location and branching off to another part of the same program to return to later - the conversion to machine code became more complex. Those programs are called compilers.

Compilers allow you to write more useful programs - for example, the first program that allowed you to connect a keyboard that lets you enter numbers and characters, one connected to a device to print numbers and characters, and later to display them on another device like a screen. From there you are quite free to write other programs. More languages and their compilers developed that were more suitable for representing abstract ideas like variables, procedures and functions.

During the whole process, both hardware (the physical electronic machines and devices) and software (the instructions to get the machines to do useful work) were developed, and that process still continues.

There's a wonderful book called Code by Charles Petzold that details all of these developments, but actually researched and accurate.



u/gingenhagen · 6 points · r/programming

Try this book: Code, which is a bottom-up approach. Depending on how rigorous your college CS curriculum was, it'll be either a good review of your college classes or mind-blowing, but I think that the approach that the book takes is really great.

u/jadae · 6 pointsr/compsci

I'd also recommend Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. I recently finished reading this book after having it recommended by a post on Reddit a year or two ago. It starts off with a lot of basic information, covering Morse code and braille, and moves along in the development of code and hardware up until you actually create a functioning computer in the book. The later chapters were harder to get interested in, but the first 3/4 was excellent and actually covered more than my computer architecture class in undergrad.

u/nbksndf · 6 pointsr/haskell

Category theory is not easy to get into, and you have to learn quite a bit and use it for stuff in order to retain a decent understanding.

The best book for an introduction I have read is:

Algebra (http://www.amazon.com/Algebra-Chelsea-Publishing-Saunders-Lane/dp/0821816462/ref=sr_1_1?ie=UTF8&qid=1453926037&sr=8-1&keywords=algebra+maclane)

For more advanced stuff, and to secure the understanding better I recommend this book:

Topoi - The Categorical Analysis of Logic (http://www.amazon.com/Topoi-Categorial-Analysis-Logic-Mathematics/dp/0486450260/ref=sr_1_1?ie=UTF8&qid=1453926180&sr=8-1&keywords=topoi)

Both of these books build up from the basics, but a basic understanding of set theory, category theory, and logic is recommended for the second book.

For type theory and lambda calculus I have found the following book to be the best:

Type Theory and Formal Proof - An Introduction (http://www.amazon.com/Type-Theory-Formal-Proof-Introduction/dp/110703650X/ref=sr_1_2?ie=UTF8&qid=1453926270&sr=8-2&keywords=type+theory)

The first half of the book goes over lambda calculus, the fundamentals of type theory and the lambda cube. This is a great introduction because it doesn't go deep into proofs or implementation details.
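As a taste of that first half, here is a toy type checker in Python for the simply typed lambda calculus. The term and type encodings are mine, invented for illustration; the book develops the formal treatment.

```python
# Toy type checker for the simply typed lambda calculus.
# Terms: ("var", name), ("lam", name, arg_type, body), ("app", f, x)
# Types: "base", or ("->", t1, t2). These encodings are invented here.
def typecheck(term, env):
    kind = term[0]
    if kind == "var":
        return env[term[1]]
    if kind == "lam":
        _, name, arg_ty, body = term
        body_ty = typecheck(body, {**env, name: arg_ty})
        return ("->", arg_ty, body_ty)
    if kind == "app":
        f_ty, x_ty = typecheck(term[1], env), typecheck(term[2], env)
        assert f_ty[0] == "->" and f_ty[1] == x_ty, "type error"
        return f_ty[2]

# \x:base. x  has type  base -> base
identity = ("lam", "x", "base", ("var", "x"))
print(typecheck(identity, {}))  # ('->', 'base', 'base')
```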

u/PM_ME_UR_OBSIDIAN · 6 pointsr/compsci

The first step to doing research is ingesting whatever knowledge already exists. With that in mind:

u/shred45 · 6 pointsr/gatech

So, when I was younger, I did attend one computer science related camp,

https://www.idtech.com

They have a location at Emory (which I believe I did one year) that was ok (not nearly as "nerdy"), and one at Boston which I really enjoyed (perhaps because I had to sleep on site). That being said, the stuff I learned there was more in the areas of graphic design and/or system administration, and not computer science. They are also quite expensive for only 1-2 weeks of exposure.

I felt it was a good opportunity to meet some very smart kids though, and it definitely lead me to push myself. Knowing and talking to people that are purely interested in CS, and are your age, is quite rare in high school. I think that kind of perspective can make your interests and hobbies seem more normal and set a much higher bar for what you expect for yourself.

On the other side of things, I believe that one of the biggest skills in any college program is an openness to just figure something out yourself if it interests you, without someone sitting there with you. This can be very helpful in life in general, and I think was one of the biggest skills I was missing in high school. I remember tackling some tricky stuff when I was younger, but I definitely passed over stuff I was interested in just because I figured "that's for someone with a college degree". The fact is that experience will make certain tasks easier but you CAN learn anything you want. You just may have to learn more of the fundamentals behind it than someone with more experience.

With that in mind, I would personally suggest a couple of things which I think would be really useful to someone his age, give him a massive leg up over the average freshman when he does get to college, and be a lot more productive than a summer camp.

One would be to pick a code-golf site (I like http://www.codewars.com) and simply try to work through the challenges. Another, much more math heavy, option is https://projecteuler.net. This, IMO, is one of the best ways to learn a language, and I will often go there to get familiar with the syntax of a new language. I think he should pick Python and Clojure (or Haskell) and do challenges in both. Python is Object Oriented, whilst Clojure (or Haskell) is Functional. These are two very fundamental and interesting "schools of thought" and if he can wrap his head around both at this age, that would be very valuable.

A second option, and how I really got into programming, is to do some sort of web application development. This is pretty light on the CS side of things, but it allows you to be creative and manage more complex projects. He could pick a web framework in Python (flask), Ruby (rails), or NodeJS. There are numerous tutorials on getting started with this stuff. For Flask: http://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-i-hello-world. For Rails: https://www.railstutorial.org. This type of project could take a while; there are a lot of technologies which interact to make a web application, but the ability to be creative when designing the web pages can be a lot of fun.

A third, more systems level, option (which is probably a bit more opinionated on my part) is that he learn to use Linux. I would suggest that he install VirtualBox on his computer, https://www.virtualbox.org/wiki/Downloads. He can then install Linux in a virtual machine without messing up the existing OS (also works with Mac). He COULD install Ubuntu, but this is extremely easy and doesn't really teach much about the inner workings. I think he could install Arch. https://wiki.archlinux.org. This is a much more involved distribution to install, but their documentation is notoriously good, and it exposes you to a lot of command line (Ubuntu attempts to be almost exclusively graphical). From here, he should just try to use it as much as possible for his daily computing. He can learn general system management and Bash scripting. There should be tutorials for how to do just about anything he may want. Some more advanced stuff would be to configure a desktop environment; he could install Gnome by default, which is pretty easy, but a lot of people really get into this with more configurable ones (https://www.reddit.com/r/unixporn). He could also learn to code and compile in C.

Fourth, if he likes C, he may like seeing some of the ways in which programs which are poorly written can be broken. A really fun "game" is https://io.smashthestack.org. He can log into a server and basically "hack" his way to different levels. This can also really expose you to how Linux maintains security (user permissions, etc.). I think this would be a much more involved approach, but if he is really curious about this stuff, I think this could be the way to go. In a similar vein, he could watch talks from Defcon and Chaos Computer Club. They both have a lot of interesting stuff on youtube (it can get a little racy though).

Finally, there are textbooks. These can be really long, and kinda boring. But I think they are much more approachable than one might think. These will expose you much more to the "Science" part of computer science. A large portion of the classes he will take in college look into this sort of stuff. Additionally, if he covers some of this stuff, he could look into messing around with AI (Neural Networks, etc.) and Machine Learning (I would check out Scikit-learn for Python). Here I will list different broad topics, and some of the really good books in each. (Almost all can be found for free...)

General CS:
Algorithms and Data Structures: https://mitpress.mit.edu/books/introduction-algorithms
Theory of Computation: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X
Operating Systems: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/0470128720

Some Math:
Linear Algebra: http://math.mit.edu/~gs/linearalgebra/
Probability and Stats: http://ocw.mit.edu/courses/mathematics/18-05-introduction-to-probability-and-statistics-spring-2014/readings/

I hope that stuff helps. I know you were asking about camps, and I think the one I suggested would be good, but this is stuff that he can do year round. Also, he should keep his GPA up and destroy the ACT.

u/TheJonesJonesJones · 6 pointsr/BlackPeopleTwitter

This is the joke behind the title of this book about how computers work!

u/developero · 6 pointsr/learnprogramming

Code is a great book that helped me understand programming on an abstract level

u/tech-ninja · 6 pointsr/ProgrammerHumor

Depends what you want to learn. Some of my favorites are

  • Code by Charles Petzold if you want to know how your computer works under the hood.

  • Peopleware if you want to learn how to manage knowledge workers.

  • Clean Code by Uncle Bob if you want to learn about good practices and program structure. Impressive content, covers much more than I expected.

  • Don't Make Me Think if you want to learn about usability.

  • Algorithms by Robert Sedgewick if you want to learn about DS & algorithms.

  • The Art of UNIX Programming by Eric S. Raymond if you want to learn about the unix philosophy. Lots of hidden gems in there. Have you ever heard "write programs that do one thing and do it well" or "don't tune for speed until you've measured"? Imagine all this knowledge distilled to you in one book.

    This is a good list to get you started :) most of my favorite books are not language specific.
u/Omnipotent0 · 6 pointsr/educationalgifs

This is the best video on the subject I've ever seen. http://youtu.be/VBDoT8o4q00
If you want to learn more I very very very strongly recommend this book. http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/novembersierra · 6 pointsr/cscareerquestions

Code: The Hidden Language of Computer Hardware and Software

This book starts with flashlights and Morse code and Braille, goes to telegraphs and electricity, works its way up to Boolean logic gate circuits (still using the components from telegraphs!) and then goes all the way to programming languages and computer graphics.
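To give a flavor of those chapters, here is a small Python sketch in the same spirit: every gate is derived from a single NAND primitive, much as the book composes everything from telegraph relays (the function names are mine, not the book's).

```python
# Building Boolean logic from a single NAND primitive, in the spirit of
# the book's relay circuits. Function names are mine, not the book's.
def nand(a, b): return not (a and b)
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

# A half adder: adds two bits, producing a sum bit and a carry bit.
def half_adder(a, b):
    return xor(a, b), and_(a, b)

print(half_adder(True, True))  # (False, True) -> 1 + 1 = binary 10
```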

u/yoodenvranx · 6 pointsr/de

If you want to know how a CPU works, what assembly is, and above all where all of this comes from and why it works, then I can recommend Charles Petzold - Code: The Hidden Language. He starts with simple Morse signals, then derives logic gates from them, and by the end of the book you have a working CPU.

u/armchair_viking · 6 pointsr/smashbros

"Code" by Charles Petzold is a great book that does that. Starts with a simple flashlight switch, and builds on that example until you have a working processor.

Code: The Hidden Language of Computer Hardware and Software https://www.amazon.com/dp/0735611319/ref=cm_sw_r_cp_awd_FeDzwbRQ2ZDDP

u/Rikkety · 6 pointsr/AskComputerScience

Check out The Annotated Turing by Charles Petzold. It's Turing's paper on the Entscheidungsproblem which introduces Turing Machines, annotated with a lot of background information and some stuff about Turing's career. Very interesting stuff.

I can also recommend Code, by the same author, which describes how a computer works from basic principles. It doesn't have a lot of material on Turing, but it's certainly an interesting read for anyone interested in Comp Sci.

u/com2kid · 6 pointsr/learnprogramming

Read Code by Petzold

You'll have a far greater understanding of how things work at a basic level than everyone else.

u/gineton2 · 6 pointsr/ComputerEngineering

For a gentle introduction, CODE: The Hidden Language of Computer Hardware and Software is a really pleasant read. It works its way up gradually, so maybe not the best fit for a physics student or someone who already understands the fundamentals. For someone new to the subject it's a great fit, however. Otherwise I see Patterson and Hennessy recommended.

u/allforumer · 6 pointsr/programming

You might like this book -

Feynman Lectures on Computation

u/ephrion · 6 pointsr/haskell

TAPL can be really hard to grok if it's your first exposure to the notation. I recommend Type Theory and Formal Proof as a first step; the first 100 pages will get you introduced to all of the notation and concepts you need for typing and understanding extensions to the lambda calculus.

TAPL takes over from there with more details on implementing type system features in languages.

u/umib0zu · 5 pointsr/AskComputerScience

You should probably start with this book called Code and work your way up from there. It's actually pretty hard to find a single book that describes the history and the concepts, and even if you did find one, most of the topics would be hard to grasp on a first read. Code is usually a great starter book and might give you a few pieces of what you're looking for. After you finish it, maybe check out a software book and dive into some of the concepts.

u/addcn · 5 pointsr/explainlikeimfive

All of these answers answer your question on a general level, but I would really recommend reading Code: The Hidden Language of Computer Hardware and Software by Charles Petzold for a deeper understanding. He talks about how the first computers were built and how they were programmed, and he does it in a way that's understandable even to a person that doesn't know a thing about computers.

u/intertroll · 5 pointsr/compsci

If you don't want to read an actual textbook, this one will do the job (without skimping on details) and is more laypeople friendly:
https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

Just as an aside, I had a non techy friend who had a similar sense of mystification as OP's but really wanted to understand computers better, so I pointed him at this book. The next time I saw him we had a great conversation about logic gates and data representation. So it works! It was actually almost a cathartic experience, taking a person who doesn't get it to someone who does, since as a developer you often have to deal with users who don't know and don't care.

u/serimachi · 5 pointsr/computerscience

It's so great you're being so proactive with your learning! It will definitely pay off for you.

I like others' suggestion of Clean Code, but I fear that as a first year it may have mostly flown over my head - not that it would at all hurt to read. For a first year student specifically, I'd recommend either of two books.

Structure & Interpretation of Computer Programs, also known as The Wizard Book and free on the link I just sent you, is a famous textbook formerly used in MIT's Intro to Computer Science course. However, it's conceptually useful to programmers on any level. If you really, seriously read it and do the exercises, it's gonna give you a rock-solid foundation and shoot you ahead of your peers.

It uses Scheme, a quote-unquote "useless" programming language for any real-world purpose. That's arguable, but the important thing about the book is that it's really edifying for a programmer. The skill it helps you develop is not the kind that will directly show on your resume - it's nothing you can point to - but it's the kind of skill that will show in your code and how you think and approach problems in general. That said, the book has exercises and the MIT site I linked you to has labs that you could potentially show off on your github.

Code: The Hidden Language of Hardware and Software is much more approachable, is not marketed specifically for programmers, and does not contain any exercises. Read it, though, and you'll find you have a huge boost in understanding the low-level computing classes that your classmates will struggle with. What it basically does is show the reader how one can build a computer, step by step, from the very basics of logic and switches. It's readable and written for a casual audience, so you may find it easier to motivate yourself to finish it.

SICP and Code, despite both being extremely popular, can be a bit difficult conceptually. If you don't fully understand something, try reading it again, and if you still don't understand it, it's fine. Everyone experiences that sometimes. It's okay to move forward as long as you feel like you mostly get the topic. Don't let the perfect be the enemy of the good.

Best of luck to you, and be excited! It's thrilling stuff.

u/cyberbemon · 5 pointsr/hardware

This is a great start, as it explains and goes into great detail regarding cpu/gpu architectures: Computer Architecture, Fifth Edition: A Quantitative Approach

Another one that goes to low level is: Code: The Hidden Language of Computer Hardware and Software

>"He starts with basic principles of language and logic and then demonstrates how they can be embodied by electrical circuits, and these principles give him an opening to describe in principle how computers work mechanically without requiring very much technical knowledge"

-wiki

u/Not0K · 5 pointsr/learnpython

If you would like a really in-depth explanation, check out Code.

u/ewiethoff · 5 pointsr/books

Petzold's Code: The Hidden Language of Computer Hardware and Software. Fascinating book about logic gates, character encoding, and so on.

u/el3r9 · 5 pointsr/explainlikeimfive

I would post this in a top level comment but it's against the rules of the sub to do so. OP can check out this book, called "Code", which is a great, truly ELI5 intro to computers. If someone is interested they can check it out.

u/AlSweigart · 5 pointsr/learnprogramming

Patterson's Computer Architecture: A Quantitative Approach was a pretty good book. I remember mostly teaching myself from that textbook since the prof I had wasn't a great lecturer.

You can probably find a PDF of it online easily enough.

EDIT: If you want a reasonable sized book instead of a big textbook, I'd recommend reading Petzold's Code, it's a fun read.

u/tolos · 5 pointsr/IWantToLearn

First, there are two requests: one from your title, and one from your description. The request from your title is a bit easier, though I'm afraid I won't be able to answer it satisfactorily. As far a real world example, that may be a bit harder because modern CPUs are pretty complicated. (When I was learning computer architecture at university, we never really discussed how an intel or AMD cpu worked -- just learned about the MSP430 and a couple of hypothetical CPUs. If you really want to see how a real CPU works, I'd suggest looking into a microprocessor to get started.)

A quick and dirty summary of the MIPS CPU datapath, from Computer Organization and Design, 4th ed., page 315 (a toy Python sketch follows the notes below):

  1. The Program Counter (PC) loads the next instruction
  2. PC is incremented
  3. Instruction is parsed and the correct registers are loaded
  4. Registers are fed into ALU if necessary or
  5. Registers are passed into data memory (for read/write)
  6. Results from ALU/data memory are loaded back into registers
  7. next instruction

    Note that the MIPS example doesn't use pipelining, which generally makes things a bit faster. And there's only one code path executing. And there's no look ahead. Which isn't the case for modern CPUs.
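To make those steps concrete, here is a minimal fetch-decode-execute loop in Python. The three instructions and their encodings are invented for illustration; they are not MIPS, and a real datapath does all of this in hardware rather than in a software loop.

```python
# A toy fetch-decode-execute cycle. Instructions and encodings are
# invented for illustration; this is not a real instruction set.
memory = {0: ("LOADI", "r1", 5),              # r1 <- 5
          1: ("LOADI", "r2", 7),              # r2 <- 7
          2: ("ADD", "r0", "r1", "r2"),       # r0 <- r1 + r2
          3: ("HALT",)}
regs = {"r0": 0, "r1": 0, "r2": 0}
pc = 0

while True:
    instr = memory[pc]          # 1. fetch the next instruction
    pc += 1                     # 2. increment the program counter
    op = instr[0]               # 3. decode and load registers
    if op == "LOADI":
        regs[instr[1]] = instr[2]
    elif op == "ADD":           # 4./6. ALU computes, result to a register
        regs[instr[1]] = regs[instr[2]] + regs[instr[3]]
    elif op == "HALT":
        break

print(regs["r0"])  # 12
```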

    For further reading I highly recommend Code by Charles Petzold. I think it helped prepare me before going to college (for computer engineering).

    For video learning, a quick google search shows some videos that would probably be helpful (I haven't watched any of these).

    Sorry for the rushed response, I can expand on this more later if there's interest.
u/charlesbukowksi · 5 pointsr/learnprogramming

I liked it. I would also recommend reading CODE: http://www.amazon.com/exec/obidos/ASIN/0735611319

Between MIT's Python course, CS50 and that you'll have an excellent grounding in CS

u/PunsForHire · 5 pointsr/math

It sounds like you might perhaps want a background in Number Theory and/or Basic Logic and/or Set Theory. The thing about math is that there is a lot...

My advice for a text that might serve you well is N.L. Biggs' Discrete Mathematics (http://www.amazon.com/Discrete-Mathematics-Norman-L-Biggs/dp/0198507178). If you are at all interested in computer science, this is also a great book for that because it introduces some of the mathematical rigor behind it. Some people have a smidgen of difficulty with this text because it doesn't give some names to proofs/algorithms that maybe you've heard whispered (e.g. Dijkstra's shortest path and Prim's minimal spanning tree). A text that I tend to think is on par with Biggs', but many think is vastly superior (I love both, but for different reasons) that covers some (most) of the same topics is Eccles' An Introduction to Mathematical Reasoning (http://www.amazon.com/Introduction-Mathematical-Reasoning-Peter-Eccles/dp/0521597188/ref=pd_sim_b_4?ie=UTF8&refRID=1BB6VKRP59S2420M132F). This book has a wonderful focus on building from the ground up and emphasizes clearly worded and mathematically rigorous proofs.

You seem genuinely interested in mathematics, but I do want to warn you about some more ahem esoteric (read: improperly worded, perhaps?) problems that ask such things as why 1 is greater than 0. The mathematics here is largely armchair - lacking any fundamental logic. There would be no issue with redefining a set of bases such that "0" is greater than "1". However, if you want to have rationale of the concept of things being greater than another, that's more like number theory. You can learn the 10 axioms of natural numbers and then build from there.

Both of the books I mentioned will cover stuff like this. For example, they both (unless I'm not remembering correctly) delve into Euclid's proof of infinite primes, something which may interest you.

Briefly (and not so rigorously), assume that the number of primes, p1, p2, p3, ..., pN, is finite. Then there exists a number P which is the product of these primes. Based on the axioms of natural numbers, since all primes p1,p2,...,pN are natural numbers, P is a natural number and so is P+1. Consider S = P+1. If S is prime then our list is incomplete; assume S isn't prime. Then some number in our list, say pI, divides S because any natural number can be written as a product of primes. pI must also divide P because P equals the product of all the primes. Therefore if pI divides S and pI divides P, then pI divides S-P = 1. That's a contradiction because no prime evenly divides 1.
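One concrete footnote to that argument (my numbers, not from the books): S itself need not be prime; it only needs a prime factor outside the assumed list. A quick check in Python:

```python
from math import prod

primes = [2, 3, 5, 7, 11, 13]   # pretend this list were complete
S = prod(primes) + 1             # 30031
print(S % 59, S % 509)           # 0 0, so S = 59 * 509 and is not prime...
# ...but 59 and 509 are primes missing from the list: the contradiction.
```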

Stuff like this is super cool, super simple, and super beautiful and you absolutely can learn it. These two books would be a great place to start.

u/vogonj · 5 pointsr/compsci

I'm quite a fan of Sipser's Introduction to the Theory of Computation: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/053494728X

It's not a full-on algorithms book but formal models were always the most interesting part of theoretical computer science to me. vOv

u/mpdehnel · 5 pointsr/computerscience

How formal do you mean? If you're interested in the theory of computer science, have a read of Sipser's Introduction to the Theory of Computation (or on Amazon - get it 2nd hand). This is a very theoretical book though, and most CS undergrad courses will only cover this type of content as a small part of the subject matter taught, so don't be put off if it doesn't immediately appeal or make sense!

Edit - links.

u/bonesingyre · 5 pointsr/webdev

Sure! There is a lot of math involved in the WHY component of Computer Science; for the basics, it's Discrete Mathematics, so any introduction to that will help as well.
http://www.amazon.com/Discrete-Mathematics-Applications-Susanna-Epp/dp/0495391328/ref=sr_sp-atf_title_1_1?s=books&ie=UTF8&qid=1368125024&sr=1-1&keywords=discrete+mathematics

This next book is a great theoretical overview of CS as well.
http://mitpress.mit.edu/sicp/full-text/book/book.html

That's a great book on computer programming, complexity, data types etc... If you want to get into more detail, check out: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/0534950973

I would also look at Coursera.org's Algorithm lectures by Robert Sedgewick, that's essential learning for any computer science student.
His textbook: http://www.amazon.com/Algorithms-4th-Robert-Sedgewick/dp/032157351X/ref=sr_sp-atf_title_1_1?s=books&ie=UTF8&qid=1368124871&sr=1-1&keywords=Algorithms

another Algorithms textbook bible: http://www.amazon.com/Introduction-Algorithms-Thomas-H-Cormen/dp/0262033844/ref=sr_sp-atf_title_1_2?s=books&ie=UTF8&qid=1368124871&sr=1-2&keywords=Algorithms




I'm just like you as well; I'm pivoting. I graduated law school specializing in technology law and patents in 2012, but I love comp sci too much, so I went back to school for Comp Sci, jumped into the tech field, and got a job at a tech company.

These books are theoretical, and they help you understand why you should use x versus y; those kinds of things are essential, especially on larger applications (like Google's PageRank algorithm). Once you know the theoretical info, applying it is just a matter of picking the right tool, like Ruby on Rails, .NET, Java etc...

u/jonride · 5 pointsr/askscience

If you're interested to learn the basic physicality of a computer, I'd recommend checking out a book by Charles Petzold: "Code: The Hidden Language of Computer Hardware and Software."

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

It's easy to read and provides a lot of insight into how circuitry embodies and propagates information!

u/Mattakinz · 5 pointsr/compsci
u/atommclain · 5 pointsr/apple

For the 'computers in general' side of things: Code: The Hidden Language of Computer Hardware and Software

u/UncleMeat · 5 pointsr/compsci

I cannot recommend the book Code by Charles Petzold highly enough. This is the book that solidified my love of computer science, and it hits most of the major topics in CS in an easy to understand and thoroughly entertaining way. By the end of the book you have walked through the fundamentals of how to build and program a rudimentary computer and had fun while doing it!

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/DrAmbulanceDriver · 5 pointsr/learnprogramming

I'm assuming you just want to learn the basic information about how computers work and the principles behind programming them, right?

In that case, I'd recommend Code by Charles Petzold

Are you looking to actually learn how to program and write code in a specific language? If so, then I'd recommend Automate the Boring Stuff with Python by Al Sweigart. It covers the basic principles of writing functions and how computer logic works, and you'll actually be able to apply it to some practical uses. And since it's Python, it'll run on a lot of different platforms. If you like it, you may want to get into working with the Raspberry Pi. Javascript is another good language to start with, but as a book, I really like this one.

If you already know a bit about programming, and just want a general reference book, then Computer Science Illuminated by Dale and Lewis is pretty good.

u/CharlieBlix · 5 pointsr/askscience

You should give this book a read: Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.

It does a great job of explaining how it all works. Loved it and I don't know how to program (Yet).

u/ArthurAutomaton · 5 pointsr/math

It's Problem 7.29 in the third edition of Michael Sipser's Introduction to the Theory of Computation (you can find it by searching for "coloring" when using the preview function on Amazon). It's also in The design and analysis of computer algorithms by Aho, Hopcroft and Ullman as Theorem 10.12, though the proof there seems a little different from what you've sketched. (I hope that the Google Books link works. Sometimes it won't show the right preview.)

u/fbhc · 5 pointsr/AskComputerScience

My compilers course in college used the Dragon Book, which is one of the more quintessential books on the subject.


But you might also consider Basics of Compiler Design which is a good and freely available resource.


I'd also suggest that you have familiarity with formal languages and automata, preferably through a Theory of Computation course (Sipser's Introduction to the Theory of Computation is a good resource). But these texts provide a brief primer.

u/tbid18 · 5 pointsr/math

What do you mean by "I want to be computer scientist?" Do you want to do research for a living, e.g, work in academia or for a lab? Or is your goal more along the lines of, "I want to learn more about computer science?" If the former, you're not going to get far without a degree, usually a Ph.D is necessary. If the latter is your goal then the 'traditional' math subjects would be 'discrete' subjects like probability and combinatorics. Linear algebra is heavily used in machine learning, and I believe PDEs are used as well.

On the computer science side, computability theory and computational complexity are necessary. Sipser is the standard for computability theory, and I like Arora/Barak for complexity (I don't necessarily recommend buying on amazon; that price for Sipser is outrageous).

u/proproseprowess · 5 pointsr/adventuretime
u/akame_21 · 5 pointsr/learnprogramming

Despite their age, the MIT lectures were great. If you're good at math and enjoy proofs this is the class for you. Same thing with the CLRS book. One of the best books on DS & Algos out there, but it's so dense it'll make your eyes glaze over, unless you love proofs and highly technical reading.

To get your feet wet, Grokking Algorithms is a good book.

A lot of people recommend Princeton's Algorithm Course. I took Algorithms in school already, but I'm probably going to take this course to round out my knowledge.

EDIT: special shout out to geeks for geeks. Great Website

u/chakke_ooch · 4 pointsr/mbti

> Would you say there's more opportunity working exclusively front end and design to exercise nfp creativity or novelty?

NFP creativity and novelty in the sense that Ne has free range, period? Sure, you get more of that in web design and even more of that as you step further and further away from the sciences. There is tons of creativity in real software engineering where you can be creative to solve actually challenging problems, not figuring out what color you'd like a button to be. To me, that's not creativity – or it's a lesser version. Creativity in problem solving is much more interesting. The way I see it is like when I was in music school and all the SFs were bitching about music theory and how they thought it limited their ability to "be creative". Such bullshit. It only exposes their lack of creativity. So you're saying that someone like Chopin, who wrote amazing pieces and abided by the rules of music theory, wasn't being creative? Hardly.

> Are you a web dev?

No, I'm a software engineer at an astrodynamics company; I do a lot of orbital mechanics, back-end work with web services, high performance computing, etc.

> By hardcore I meant requiring being meticulous, detail oriented.

I think that the lack of attention to detail is never permissible in either back-end software engineering or front-end web development, honestly.

> One thing I've realized is how shit my high school was at explaining math conceptually. Which I think lead to misconceptions about its use in programming

Well, then read some books on computer science and/or mathematics like this.

u/Birkal · 4 pointsr/UCSD
u/DashAnimal · 4 pointsr/compsci

I know this is a pretty common recommendation and you've probably already heard of it
or even read it, but can I recommend the book Code: The Hidden Language of Computer Hardware and Software? I think having a history of how we got to where we are today (written in an entertaining way) is a good starting point, even though it barely scratches the surface of computer science.

u/_9_9_ · 4 pointsr/learnpython

This book is supposed to be good: https://www.amazon.com/dp/B00JDMPOK2/

I've yet to read it. I've been messing with computers for a long, long time. But, at some point I think most people agree that they are magic.

u/AnalyzeAllTheLogs · 4 pointsr/learnprogramming

Although more about product delivery and lifecycle management, I'd recommend:

https://www.audible.com/pd/Business/The-Phoenix-Project-Audiobook/B00VAZZY32

[No audiobook, but worth the read] The Mythical Man-Month, Anniversary Edition: Essays On Software Engineering https://www.amazon.com/dp/B00B8USS14/

[No audiobook, but about 1/3 the price at the moment for kindle and really good]
Code: The Hidden Language of Computer Hardware and Software (Developer Best Practices) https://www.amazon.com/dp/B00JDMPOK2/


https://www.amazon.com/Dreaming-Code-Programmers-Transcendent-Software/dp/B00AQ5DOCA

https://www.amazon.com/Scrum/dp/B00NHZ6PPE

u/MrAckerman · 4 pointsr/AskProgramming

I really enjoyed Code.

I feel it's a really accessible summary of what is going on under the hood of a computer.

u/Bizkitgto · 4 pointsr/learnprogramming

The book you are looking for is called Code: The Hidden Language of Computer Hardware and Software by Charles Petzold!

u/neop · 4 pointsr/compsci

I'm also a math major who turned into CS. There are already a lot of good recommendations here so I won't add much, but I suggest reading Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.

It's not very technical and it's not in-depth, but I think it's an amazing book. You probably won't learn anything you're actually going to use by reading it, but I think this book has a unique ability for expressing the underlying facts that make us all find computer science so fascinating. It's a very fun read and it will give you a very broad overview of how computers work and how software gets compiled and ultimately ends up moving electrons around to make the magic happen.

u/ismtrn · 4 pointsr/compsci

This book (Code: The Hidden Language of Computer Hardware and Software): http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1333490779&sr=8-1

It explains it all! The title makes it sound like it is about code, but it is really about how a computer works (code is of course a part of it). It is very easy to read and does not really require any prior knowledge; it actually starts by explaining how a flashlight works and builds on that.

I simply can't describe how awesome it is - you should really read it!

u/mattandersen · 4 pointsr/compsci

You may be beyond this book, or it may not be full of the harder science of logic design but every CE or CS student should have a copy of CODE from Charles Petzold. It will probably fill in a lot of gaps of a formal classroom discussion of processor architecture, and it provides a great set of tools to explain the concepts to others. Which for me has always been the benchmark of understanding. http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/ZeroPintEnergy · 4 pointsr/ProgrammerHumor
u/claytonkb · 4 pointsr/AskComputerScience

I'm partial to Hopcroft and Ullman. I'm usually not the "open a book, read the explanations..." type of learner but Hopcroft&Ullman is very clear and concise, so I found myself doing just that. No need to skip ahead to figure out the point of it all, they just explain it in logical order. This text goes quite a bit deeper than most algorithms courses will go but it's a great way to pump iron with your brain and everything you learn will be applicable to whatever curriculum is taught in your algorithms course.

u/Isenhatesyou · 4 pointsr/compsci

Sipser's Introduction to the Theory of Computation is somewhat of a classic in the field. I just really hate his notation.

u/diablo1128 · 4 pointsr/cscareerquestions

Yes I took "Theory of Computation" as well. It was one of those classes where the average grade is an F and everything is just scaled up. It really kicked my ass, sadly. I think I took it the same semester as compilers, as it was not a pre-req at my school.

I believe this is the book we had: https://www.amazon.ca/Introduction-Theory-Computation-Michael-Sipser/dp/053494728X

u/awj · 4 pointsr/programming

It may be a bit of a tough slog, but Sipser's Introduction to the Theory of Computation is great. The stuff on computability theory might be right up your alley, and even if you only make it through the chapter on deterministic finite automata you will likely be better at crafting a regular expression than many of my CS student peers.

Surprisingly enough, the book should be able to help you make sense out of that last sentence within 100 pages requiring only a bit of understanding of proofs. I think if you've got predicate logic under your belt you pretty much have all you need.
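For a flavor of what that chapter covers, here is a toy deterministic finite automaton in Python that accepts binary strings containing an even number of 1s; the state names and encoding are mine, not Sipser's.

```python
# A deterministic finite automaton accepting binary strings with an
# even number of 1s. The table below is the machine's entire definition.
transitions = {("even", "0"): "even", ("even", "1"): "odd",
               ("odd", "0"): "odd",   ("odd", "1"): "even"}

def accepts(s, state="even"):
    for ch in s:
        state = transitions[(state, ch)]
    return state == "even"   # "even" is both the start and accept state

print(accepts("1011"))  # False (three 1s)
print(accepts("1001"))  # True (two 1s)
```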

u/rohit275 · 4 pointsr/hardware

I haven't read it, but it looks pretty good. I can personally vouch for this book however:

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=pd_sim_14_2?_encoding=UTF8&psc=1&refRID=C8KMCKES83EHGXS3VWSQ

It's truly amazing. I'm currently an EE PhD student but I had a pretty limited background in digital hardware and computer architecture, and I read most of this book just out of interest a little while ago and frankly learned quite a bit. It's written at a very readable level for anyone with almost no prior knowledge, yet gets technical when it needs to. It's very thorough, but approaches the topics at a wonderful and easy pace with very clear explanations. The author even says you can skip some of the more technical details if they're not of interest to you, and you'll still end up learning quite a lot. The book you posted looks pretty similar, so I'd say it's worth a shot.

u/mwassler · 4 pointsr/webdev

Everyone seems to have good things to say about khan academy's comp sci courses.

A few good lower-level books, in my opinion, are this one, which is maybe less technical but a good foundation, and From Mathematics to Generic Programming by Alexander Stepanov.

I think you're probably just experiencing outliers in your job search. If you keep at it your luck will probably turn around.

u/dr_dalek · 4 pointsr/explainlikeimfive

Take a look at this book: Code. The book starts off with a switch and builds a whole computer from there.

u/nattoninja · 4 pointsr/learnprogramming

Code is a really good book that goes into how it all works, from the basics of binary and electrical signals and builds from there. The text is very straightforward and there are lots of good illustrations.

u/p7r · 4 pointsr/NoStupidQuestions

I've taught a lot of people how computers work, or more precisely how to program them. I am sure you can learn too.

First, let's make it fun.

There is a lot of material for people who like the Raspberry Pi out there that is fun and simple. You don't even need to own a Raspberry Pi to understand what they're talking about.

It's fun and simple because it's designed for youngsters who find long/complex books a bit too boring. I think you might enjoy it, because you've said you've found the books you've tried too boring.

Here is a load of magazines about the Pi - on each issue you can click on "Get Issue" and then under the cover "download the PDF" and read it and see if you enjoy that.

Next, have a play with Scratch. It's designed for kids but the exact same concepts are in professional programming languages.

The reason I recommend it is not because I think you are a child, but because it's a lot of fun and makes a lot of the dull and boring bits of programming go away so you can focus on the fun bits.

You have to remember that the things going on in Scratch are basically the same things going on inside a computer - just a lot more complex.

If you ever want to learn a programming language that professional developers use, I think you'll like Ruby.

It's very forgiving for new developers, but still lets you do what we would call "production grade" code. It's what I work in most days.

Also, why's poignant guide is quite funny, but you might find it a bit weird and confusing - I know I did the first time I read it. :-)

I also recommend this book to you: Code by Charles Petzold. The first few chapters don't seem like they're about computers, because they talk about flags and electrical circuits - that's because you need to understand those things first.

If you can read and understand the whole thing you will know more about how computers work than half of the professional software engineers out there. And they're normally quite a clever bunch.

If you find it too difficult, slow down and think. Each paragraph has something in it worth thinking about and letting it mull over in your mind.

IQ is not a measure of how much you can learn, but perhaps of how quickly you can see patterns and understand things.

You having a lower IQ than somebody else does not mean you can't see those patterns or understand things; it just means it might take you a little more thinking to get there. I'm sure you will.

If you ever have any questions about computers, I'd love to try and help answer them - feel free to ask me.

u/autophage · 4 pointsr/IWantToLearn

Lots of people are recommending ways to learn a language, and I just want to pop in to say: most popular languages are much more alike than they are different. Learn any one of them for a few months (until you're no longer looking up references for how to write a for loop or getting confused by the language's comparison operators), then try your hand at a different language.

If you find something hard to grasp in one language, it's probably about equally hard to grasp in another language - so don't just think "Hmm, well, maybe this is easier in other-language" and switch over to that one instead. (There are a few exceptions - for example, you don't have to worry about memory management in Java the same way that you do in C. You can still get memory leaks in Java, but the fact that you've got garbage collection makes memory management on the whole far simpler.)

In terms of getting into hacking - the first step, hands down, is to read this book. It will teach you the really really basic stuff, on a far deeper level than most laymen ever think about, in a very gentle and even fun way. After that, start getting your hands on networking texts, security texts, and just plain writing a lot of code. Get the source to some popular open source projects (Apache, for example) and run it in a debugger, watching how the values change and looking for unexpected things.

u/-___I---I-___ · 4 pointsr/learnprogramming
  1. topic name: Fundamentals, discrete math, algorithms, a good book to start with; there are tons of free courses and lectures on the internet, but you will have to type in the specific search terms

  2. idk

  3. idk
u/obscure_robot · 3 pointsr/occult

HTML represents a particular line of thinking about how best to present information for machine processing and ultimately rendering so that a human can read it. The weight that HTML carries is entirely due to its success in the marketplace; it isn't a particularly good or bad exemplar of how things should be done.

Some languages use paired symbols to indicate the beginning and end of a block of code, such as { and } or even begin and end. Others don't have an explicit beginning, but whatever valid command is first encountered is the beginning, and the end-of-file marker (a thing that no one but programmers and other very curious people ever sees) signifies the end.

Hopcroft & Ullman's book on Automata Theory is a great place for someone with a magickal background to dive into computer science.

If you just want to see the many different ways of composing a very simply program (one that prints "hello, world!"), check out this list at Wikipedia.

I'm happy to answer more specific questions here or via private messages.

u/ffualo · 3 pointsr/askscience

Hi RandomNumber37,

So here's a little bit about me first; I don't want to misrepresent myself. My background is in economics and political science, where I was interested in statistical models that predict rare international events like war and state failure. It's here I became obsessed with statistics, machine learning, etc. Also, I've been programming in many languages since I was a kid, so after my undergraduate work in the social sciences and statistics, I took a job with a bioinformatics group doing coding. I thought this would be a temporary job until graduate school in economics or quantitative political science.

However, working with large-scale biological and sequencing data was way more awesome than I expected. This caused me to shift focus. I also did a fair amount of work on computational statistics, i.e. ways of trying to make R better, understanding compiler technologies, etc. So after, I became more purely interested in statistics and computational biology, and I thought I would go to graduate school for pure statistics so I could also devote some time to computational statistics. However, now I work in a plant breeding lab (which I absolutely love). I will do this for about another 2-3 years before I transition into a graduate program. This would mean I've worked in the field about 6 years before applying to graduate programs.

So, with that out of the way, here are answers to your questions and some advice I offer:

  1. How much of your time is spent working with the plants themselves vs with computer-organized data?

    Being that my background isn't in biology, I don't currently work with plants much. However, this is why I moved towards plant biology. Before getting obsessed about social science methods, I loved plants. I worked at an orchid greenhouse, and actually went to UC Davis thinking I'd study plant biology (until an awesome political science professor got me excited about science applied to political data). However, the scientists I work with are often not doing too much work with plants: many grow the plants, do the wet lab work, then spend more than half the time (sometimes up to 90%) analyzing the huge amount of data. I spend my full day in front of a computer, except when a colleague wants me to check out something cool in the lab, etc.

  2. With what kind of operations does your computer aid you?

    Everything. We get raw sequencing data, I have to analyze it from start to finish. Or, from raw sequencing files until the point where the numbers behind it tell a story. I also spend a huge amount of my time writing programs that do certain things for biologists in our group. Everything — protein prediction, data quality analysis, statistical modeling, etc.

  3. Do you see a full cycle... from plant, to data, to application of knowledge to your specimens (and back to data)?

    Yes, at this current position I am starting to (which I why I sought work in plant biology). It depends on what plant you work with (Arabidopsis = short life cycle, you can do lots of stuff, vs citrus tree = long life cycle, you can't do lots of stuff). But some of the more awesome longer term projects will take 4 years to fully materialize.

    So now, what steps were more important? I will tell you the three things that have helped me the most. As a point of how much they've helped me, I'll just mention that despite not having a PhD (yet), or much of a background in biology other than what I've taught myself or learned on the job (which is actually quite a lot after 4 years in the field), I've had (and continue to receive) really nice job offers.

  4. Learn programming really, really, really well. If you want to be a step above the rest, learn python and R. Perl is huge in bioinformatics, but it's a disgusting ugly language that's dying out in my opinion. It sucks for reproducibility; no one can read anyone else's code. It was great when everyone was racing to get the human genome sequenced and had to write quick scripts constantly. Now, we have larger software platforms for that stuff, and what will count most in the future is the distribution of your scientific code. Reproducibility problems will soon be primarily dry lab, not wet lab. If you doubt that, read the "Forensic Bioinformatics" paper (http://projecteuclid.org/euclid.aoas/1267453942) which was a game changer for me. I've always been passionate about open science and reproducibility, but that made me realize that we'll have a huge problem in a few years if we're not careful.

    Anyways, I'd recommend learning:

  • Python (with BioPython). Also, with Django if you're building web apps to interface with scientific databases.
  • R (with Bioconductor).
  • Unix command line (sed/awk, bash)
  • Know your editor. I use emacs. Even if it takes you 80 hours to learn emacs or your editor well, you will regain that time over a year of work. I promise. People watch me use emacs and they say it makes them dizzy because they can't keep up. That's dozens of hours saved each week.

    Now, optionally (but highly, highly recommended):

  • C. Absolutely necessary for debugging compiled programs or writing high-usage programs that need to be fast.
  • SQL. You'll be storing biological data in databases. SQL is important. Use SQLite a lot. People like huge PostgreSQL or MySQL databases for even small things, but this is a waste of time IMO if you're just going to be the one accessing it. Bioconductor leverages huge amount of SQLite because it's so easy and awesome.

    Now, even more optionally:

  • Lisp. Lisp will change the way you think about programming. It's also used with AraCyc, MetaCyc, and PlantCyc data. I've used it extensively in these applications. The ratio of how Lisp has changed my thinking to how much I use it in production code is HUGE. Learn functional programming concepts; then concepts like map/reduce will fall easily into place. Know object orientation too.

  • Javascript. I love JS. It's doing amazing things too. And part of being a very effective bioinformatician/statistician is being able to easily convey your data. There is no easier and more interactive medium than a browser. Check out d3.js. Even old scientists can click a link and interact with data via Javascript. In contrast, they wouldn't want to install some old dusty Java application. Of course, with this comes HTML, XML, JSON, etc, etc. so learn those too.

  5. Learn statistics REALLY WELL. Honestly, try to pick up a statistics minor (over a CS minor IMO). Lots of brilliant programmers buy the Cormen algorithm book and are set for data structures and algorithms. But understanding statistics at a deeper level — that takes intimate study via courses. I would recommend taking courses on probability theory and mathematical statistics. I took two courses as part of our mathematical statistics series and I cannot even begin to emphasize how helpful they were. I heard a quote once: at Google they use Bayes' theorem like other programmers use "if" statements. Same thing in bioinformatics. Look at the best SNP callers, software, etc, and they're using population genetics models and Bayes approaches. Know math stats early, and it will permeate your thinking in the best ways.

    Another quick story: I had a statistics graduate student come tell me he was working for a rather well known genomics professor on campus. He asked me how to analyze RNA-seq data. He said he wanted to use ANOVA. Even though he was a statistics graduate student, he went immediately to normality-assuming models, which were definitely not appropriate for this data. So know your Poisson, negative binomial, gamma, etc. distributions. A probability course should introduce them all to you. It will also mean that when you start learning more theoretical population genetics, you'll be set.
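A small sketch of why the normality assumption misleads here, with toy numbers rather than real RNA-seq data: read counts are discrete, non-negative and overdispersed, none of which a normal model captures.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "read counts" drawn from a negative binomial, a common model
# for RNA-seq count data (the parameters here are arbitrary).
counts = rng.negative_binomial(n=2, p=0.1, size=10_000)
print(counts.mean(), counts.var())   # variance far exceeds the mean

# A normal model matched to that mean/sd puts real mass below zero,
# even though negative read counts are impossible.
normal = rng.normal(counts.mean(), counts.std(), size=10_000)
print((normal < 0).mean())           # a clearly nonzero fraction
```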

    Also, buy a book on machine learning (The Elements of Statistical Learning, 2nd edition, is good, and a free PDF is available). It's dense, but don't let that discourage you. I also like this book, which is likewise dense; again, don't let that discourage you.

  6. Learn data structures and algorithms well. I think a single course, or doing this on your own, is sufficient. However, if you want to do what Heng Li does (author of BWA, samtools, and the fermi assembler) you need much, much more. Compression-based data structures are huge in bioinformatics now. I love this stuff, but it's too removed from the biology to be very interesting to me. But if that's the direction you want to move into, hang around the CS department more.

  7. Learn to code well. This is vastly underemphasized in the sciences. Learn about test-driven development. Get in the habit of writing unit tests early, and writing good documentation. Learn Git too — this is a must.

u/inataysia · 3 pointsr/compsci

course name: randomized algorithms

course link: don't have one

text book: randomized algorithms (link)

My professor was terrible but the material blew my mind. I was looking forward to possibly taking this course again someday from the author (as a non-degree course at stanford), but he passed away in 2009.

fun fact: after staring at this book for too long I can only see the words "orgasm" and "hamster"

u/wilywes · 3 pointsr/programming

The goto theory book by Sipser.
Excellent for C programming.
Programming in general.
My favourite.
You can probably find all of these at a library.

u/shimei · 3 pointsr/compsci

Michael Sipser's Introduction to the Theory of Computation is another good book on this topic. Very readable and short.

u/tryx · 3 pointsr/math

I would recommend (and I find that I recommend this book about every 3rd thread which should say something) a book on theoretical computer science. The book is all about the beautiful mathematics that underlie all of computing.

Computers keep getting faster and faster, but are there any questions that we can never answer with a computer, no matter how fast? Are there different types of computers? Can they answer different types of questions?

What about how long it takes to answer a question? Are some questions fundamentally harder than others? Can we classify different problems by how hard they are to solve? Is it always harder to find a solution to a problem than to check that a solution is correct? (This is the gist of the famous P=NP problem.)

The book has essentially no prerequisites and will take you through (initially) basic set theory moving on to logic and then the fun stuff. Every proof is incredibly clearly written with a plain English introduction of what the author is about to do before the technical bits.

The only downsides are that it is quite expensive and you might have trouble finding it unless you have access to a university library/bookshop.

Good luck with your love of mathematics!

Edit: lol the book... Introduction to the theory of computation - Sipser

u/space_lasers · 3 pointsr/answers

I'm sure any computational theory book will work for you. Here's the one I used: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/0534950973

It goes through deterministic and nondeterministic automata, context free grammars, turing machines, and all that stuff.

u/CSHunter33 · 3 pointsr/cscareerquestions

Congrats! I'm on such a programme at the University of Bath in the UK right now. Bath's programme teaches C and Java, with a little Python in some elective modules. There are also theory of computation modules with a bunch of discrete maths in.


If you let me know which specific course you're attending, what your undergrad was, and how technical a career you might want I can give more tailored advice.


I did the following for prep, after researching what modules I'd be taking:

  • MITx's Intro to CS MOOC (amazing)
  • Read and did all exercises from several relevant chapters from a "Discrete Maths for Computing" textbook I got second hand for a fiver (helped a lot in a maths-heavy module)
  • read the oft-recommended book Code (useful for awareness, but not essential, especially since we do no Comp Arch at Bath)
  • did some algorithm challenges at places like leetcode.com and firecode.io once I had done the Intro to CS MOOC


    Conversion masters are generally very intense so doing prep now and over summer is a great idea. The stuff I listed helped immensely and I would do the same again - perhaps I would switch the MITx MOOC to Harvard's CS50 instead since CS50 has a bigger spread of languages. If I had had more time on my hands, proceeding from here to do a Java MOOC would have been really useful also.

    Working on the Leetcode/Firecode challenges helped a lot for general programming practice, and will also be helpful prep for job hunting later.
u/lespea · 3 pointsr/programming

I can easily recommend the book Code: The Hidden Language of Computer Hardware and Software ; fascinating stuff!

u/shrapnull · 3 pointsr/programming

For anyone that hasn't read it, [Charles Petzold's "Code"] (http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1407846258&sr=8-1&keywords=code+charles+petzold) is a great read through the history of computing and programming.

u/curious_webdev · 3 pointsr/compsci

Not all of these are on-topic as "CS" books; some are more general programming. But here's a short list. I also suggest reading the opening chapter or two of a lot of books for stuff you don't know but are interested in. They're generally just nice, easy-to-read introductions.

u/Helix_van_Boron · 3 pointsr/compsci

Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. This is the book that made me decide to switch my college major to CS. It gave me great insight into what I was manipulating inside of a computer. It might not be very helpful to an experienced computer scientist, but I recommend it to anybody that's interested in getting into CS. And even if you understand all of the concepts in it, it's still an interesting read.

u/pgvoorhees · 3 pointsr/c_language

As /u/Shok3001 said, your course textbook is going to be a good place to start.

As an additional note, it might be really beneficial to know how computers work in general. To this end, read Code by Charles Petzold. Once you see the underlying mechanics of a computer, you will see more easily why things are the way they are in the language and how to manipulate data inside a computer.

u/Summerdown · 3 pointsr/askscience

I think this book is exactly what you're looking for. I bought it recently and am now half-way through it, and it's fascinating.

u/ULICKMAGEE · 3 pointsr/AskEngineers

Honestly, just purchase this book. It's exactly what you're looking for and will do a far better job than a bunch of condensed replies to your inbox.

It's a really good book.

u/emcoffey3 · 3 pointsr/learnprogramming

Definitely take as many web-related classes as possible and at least one database class. The other two focus areas you mentioned are important, but probably not as important as the first two. Maximize your time on learning the important stuff; the other stuff can be learned later.

If you want to learn about UML diagrams, check out Martin Fowler's UML Distilled; it's an easy read and handy as a reference. Likewise, Charles Petzold's Code is one of my favorite hardware-related books.

u/Cogniphile · 3 pointsr/learnprogramming

What is their approximate age?

Highest education level achieved?

Any prior experience in programming or computer science?

What is your budget?

Do you have experience programming?

The hardest part is finding something that is interesting and will keep an inexperienced reader motivated. Otherwise you could throw some undergrad level textbooks at them and they'd come out better than most college students who don't actually study.

I highly suggest you stay away from front-end programming, because it will be very frustrating to learn about making interfaces when you can't make an interface. This means staying away from HTML, CSS, and such.

One possibility is having the person write programs on paper and forward them to you. You can type them into a pc, check for bugs, and even print out the results to send back.

Also I've had this book recommended but haven't read it myself:

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319


This also might work:

https://www.amazon.com/gp/aw/reviews/1593276664/ref=cm_cr_dp_mb_see_rcnt?ie=UTF8&s=sd

u/CaRDiaK · 3 pointsr/learnprogramming

If you're interested in the history, then Code by Charles Petzold is great. It explains how code has existed for hundreds if not thousands of years in many different forms. He takes you right the way through, from people signalling with lights, to Morse code, to relays, to modern-day processors. What's cool about it is that you can just pick it up and put it down, which I get the feeling is what you're looking for: https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1467960451&sr=8-1&keywords=code+charles+petzold

u/fj333 · 3 pointsr/compsci

Read either of these books, and your CS education will become much much easier. The first book is lighter reading, the second book is intended to be studied over two full semesters, but I finished it in about 2 months because I enjoyed it so much.

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/

https://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686/

Truth tables should be fairly easy to understand conceptually. Understanding how they relate to hardware may be more difficult at first. In the second book above, you are told at the start that a NAND gate is a magical piece of hardware that can do one very simple Boolean operation. From that magical gate, you can then learn how to build every other kind of Boolean gate, from which you can build ALUs, RAM, CPUs, and a computer. A NAND gate can be made out of many things; you could make one out of Legos. The reason NAND in computer hardware is so cool and so scalable is that it is tiny.
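If you want a preview before opening either book, here's a rough Python sketch of mine (not from the books) of that NAND-only construction; the assertions check each derived gate against Python's built-in operators.

```python
def nand(a, b):
    # The one primitive: true unless both inputs are true.
    return not (a and b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return nand(nand(a, b), nand(a, b))

def or_(a, b):
    # De Morgan: a OR b == NAND(NOT a, NOT b)
    return nand(nand(a, a), nand(b, b))

def xor(a, b):
    ab = nand(a, b)
    return nand(nand(a, ab), nand(b, ab))

for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert xor(a, b) == (a != b)
```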

u/Great_Lord_Kek · 3 pointsr/EngineeringStudents

If they just explain them as a logic table it makes no sense. Looking at how they perform in a circuit is much more intuitive and clear; I'd recommend a book (it's pretty old at this point) by Charles Petzold called "Code: The Hidden Language of Computer Hardware." https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

That book goes over circuits from the bare bones (lighting up a lightbulb) all the way to latches and RAM arrays. It's not dumbed down either.

u/johnsibly · 3 pointsr/programming

I'd certainly recommend Charles Petzold's "Code" as a source of material:
http://www.amazon.co.uk/Code-Hidden-Language-2nd-DV-Undefined/dp/0735611319/ref=sr_1_1?ie=UTF8&s=books&qid=1265836950&sr=1-1
Great explanations of all the points you mention

u/KyleRochi · 3 pointsr/ComputerEngineering

Codecademy! I recommend Python or Ruby. They are pretty easy languages to pick up, so you will have a good understanding of programming concepts when you start doing C/C++ or Java. Also, for digital logic I recommend picking up a copy of [Code](https://www.amazon.com/dp/0735611319/ref=cm_sw_r_cp_api_nxOAyb12J4N87) by Charles Petzold. It is by no means a comprehensive guide, but you will be familiar with everything when you take a logic class, and while most of the class is trying to figure out what an adder is, you will already know and be focusing on how and why it works.

u/wannabeproprogrammer · 3 pointsr/computerscience

Computer processors have historically been built around powers of 2 in their instruction sets because of Boolean logic and how it relates to binary numbers. If you were building a very rudimentary computer, i.e. a circuit which is just on or off, then you can say that the circuit represents only 2 states. This can be encoded as just 0 or 1. In fact this is what transistors do: they hold either an on or an off state. Now let's say you want to represent more than two states. How would you go about that with what you already have? Well, you introduce another transistor. Now you can represent 4 states. This follows the same logic as before, so these 4 states can be 00, 01, 10, 11, where each digit corresponds to the on or off state of one of the transistors. You can repeat this pattern ad infinitum and keep adding more and more on-off transistors. What you'll find when you do this is that the number of states your rudimentary CPU can represent always corresponds to the amount of memory that can be addressed at any time: with N transistors it is 2^N, hence the 32-bit and 64-bit numbers, which are themselves powers of two. The 32 or 64 is the number of on/off units (bits), and 2^32 or 2^64 is the amount of memory that can be addressed at any time.
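You can watch that 2^N growth happen in a few lines of Python (a quick illustration of my own):

```python
from itertools import product

# Every combination of N on/off transistors is one distinct state.
for n in (1, 2, 3):
    states = ["".join(bits) for bits in product("01", repeat=n)]
    print(f"{n} bit(s) -> {len(states)} states: {states}")

# The same arithmetic sets the size of an address space:
print(2**32)   # 4294967296 addresses for a 32-bit machine
print(2**64)   # 18446744073709551616 addresses for a 64-bit machine
```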


Now in modern CPUs this can get a lot more complicated in terms of architecture, as in reality CPUs may have millions of transistors and multiple cores but are still considered 32-bit or 64-bit. In reality, the 32-bit and 64-bit in this context relate to the instruction set architecture of the CPU. This concerns the format of the instructions the CPU handles to perform operations; i.e. in a 32-bit architecture, the CPU will handle instructions 32 bits long, which determine whether you're accessing memory, writing a value, triggering an interrupt signal, and so on. If you really want to understand how CPUs work I recommend reading this book. It explains how CPUs work from a very rudimentary base all the way up to how machine code translates to actual CPU instructions. Hope this helped.

u/redcalcium · 3 pointsr/webdev

I'm not really good at explaining stuff, but this book is a great resource regarding Turing machines and Turing completeness: Introduction to the Theory of Computation

A Turing machine is a model first proposed by Alan Turing. It's similar to a finite automaton, but it has unlimited tape memory. Basically, modern computers are an implementation of the Turing machine. One characteristic of a Turing machine is that it's possible to use one Turing machine to simulate another.

A programming language is said to be Turing complete if it can be used to simulate a Turing machine. If you've written a program in a Turing-complete language, you should be able to translate that program to any other Turing-complete programming language.

So if CSS became a Turing-complete language, I imagine people would use it to create crazy stuff such as an x86 emulator.
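To make the model concrete, here's a minimal Turing machine simulator of my own in Python (the rule format and tape encoding are just illustrative assumptions); this particular rule set flips every bit and halts at the first blank:

```python
def run_tm(rules, tape, state="start", blank="_"):
    """Minimal Turing machine: a finite rule table plus an unbounded tape."""
    tape = dict(enumerate(tape))  # sparse tape, grows on demand
    head = 0
    while state != "halt":
        symbol = tape.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Rules for a machine that flips every bit, then halts on the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_tm(rules, "10110"))  # 01001
```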

u/falafel_eater · 3 pointsr/AskComputerScience

Computer Science is a pretty big field, so "strong foundation" can mean different things to different people.
You will definitely want the following:

  1. Introduction to Algorithms and Data Structures
  2. Introduction to Computability
  3. Introduction to Operating Systems

    For algorithms and data structures, a very commonly used textbook is Cormen.
    For computability, Sipser.

    Operating Systems I don't remember off the top of my head.
    That said, you are probably much better off finding a high-quality university course that is based on these textbooks instead of trying to read them cover-to-cover yourself. Check out lecture series from places like MIT on youtube or whatever.

    After that, you can take an Intro to Artificial Intelligence, or Intro to Communication Networks, or any other intro-level course in a more specific sub-area. But if you lack a basis in computability to the point where you don't know what an NP-complete problem is, or have no idea what a Binary Search Tree is, or do not know what an Approximation Algorithm is, then it would be hard to say you have a strong foundation in CS.
u/dionyziz · 3 pointsr/cryptography

Hi,

You should already know most of the math you need to know from your math major. It helps to know number theory, group theory, and algebraic curves, depending on what you do. Important knowledge is also discrete mathematics: Discrete probability theory, Markov chains, graph theory, logic, proof methods, solving recurrences, etc. are all helpful tools.

In terms of computer science, it's imperative to know how to program. You can learn a programming language such as Python. Project Euler is a good place to start for a mathematician. Knowledge of algorithms is also important, and you must understand computability and complexity theory.

Stanford's Cryptography I is a good place to start learning cryptography. I think you can go ahead and start the course without attempting prerequisites and see where you get stuck to go and learn what is required of you.

u/ogrisel · 3 pointsr/MachineLearning

Of course R is used for machine learning. It's probably the most popular language for interactive exploratory and predictive analytics right now. For instance, most winners of kaggle.com machine learning competitions use R at one point or another (e.g. packages such as randomForest, gbm, glmnet and of course ggplot2). There is also a recent book specifically teaching how to use R for machine learning: Machine Learning for Hackers.

Myself, I am more of a Python fan, so I would recommend Python + numpy + scipy + scikit-learn + pandas (for data massaging and plotting).
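As a taste of what that stack looks like in practice, here's a minimal sketch (assuming a reasonably recent scikit-learn is installed; the dataset and parameters are just placeholders):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)         # learn from the training split
print(model.score(X_test, y_test))  # accuracy on held-out data
```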

Java is not bad either (e.g. using Mahout or Weka, or more specialized libraries like libsvm / liblinear for SVMs and OpenNLP / Stanford NLP for NLP).

I find working in C directly a bit tedious (esp. for data preparation and interactive analysis), so it's better to use it in combination with a scripting language that has good support for writing C bindings.

u/root_pentester · 3 pointsr/blackhat

No problem. I am by no means an expert in writing code or buffer overflows but I have written several myself and even found a few in the wild, which was pretty cool. A lot of people want to jump right into the fun stuff but find out rather quickly that they are missing the skills to perform those tasks. I always suggest that people start from the ground up when learning to do anything like this. Before going into buffer overflows you need to learn assembly language. Yes, it can be excellent sleep material but it is certainly a must. Once you get an understanding of assembly you should learn basic C++. You don't have to be an expert or even intermediate level; just learn the basics of it and be familiar with it. The same goes for assembly. Once you get that, writing things like shellcode should be no problem. I'll send you some links for a few books I found very helpful. I own these myself and they helped me tremendously.

Jumping into C++: Alex Allain

Write Great Code: Volume1 Understanding the Machine

Write Great Code: Volume2 Thinking Low-Level, Writing High Level

Reversing: Secrets of Reverse Engineering

Hacking: The Art of Exploitation I used this for an IT Security college course. Professor taught us using this book.

The Shellcoders Handbook This book covers EVERYTHING you need to know about shellcodes and is filled with lots of tips and tricks. I use mostly shells from metasploit to plug in but this goes really deep.

.

If you have a strong foundation of knowledge and know the material from the ground-up you will be very successful in the future.

One more thing: I recently took and passed the course from Offensive Security to get my OSCP (Offensive Security Certified Professional). I learned more from that class than years in school. It was worth every penny spent on it. You get to VPN into their lab and run your tools using Kali Linux against a LOT of machines ranging from Windows to Linux and find real vulnerabilities of all kinds. They have training videos that you follow along with and a PDF that teaches you all the knowledge you need to be a pentester. Going in I only had my CEH from EC-Council and felt nowhere close to being a pentester. After this course I knew I was ready. At the end you take a 24-hour test to pass. No questions or anything, just hands-on hacking. You have 24 hours to hack into a number of machines and then another 24 hours to write a real pentest report like you would give a client. You even write your own buffer overflow in the course and they walk you through step by step in a very clear way. The course may seem a bit pricey but I've got to say it was really worth it. http://www.offensive-security.com/information-security-certifications/oscp-offensive-security-certified-professional/

u/DevilsWeed · 3 pointsr/darknetplan

As someone with zero programming experience, thank you for the reading list. I was just planning on trying to learn python but I don't know if that's the best language to start with. Would you recommend just reading those books and starting with C?

Also, since I have no experience a technical answer would probably go right over my head but could you briefly explain how someone would go about messing around with an OS? I've always wondered what people meant by this. I have Linux installed on a VM but I have no idea what I could do to start experimenting and learning about programming with it.

Edit: Are these the books you're talking about? Physical Computing, C programming, and Writing Great Code?

u/ospatil · 3 pointsr/algorithms

Learning JavaScript Data Structures and Algorithms - Second Edition is a really good book with clear explanations and code examples.

Grokking Algorithms is also a wonderful book especially for non-CS people and beginners. The examples are in Python but it shouldn't be a problem given your Ruby and JavaScript background.

u/rispe · 3 pointsr/javascript

Congratulations! That's a big step. Be proud that you were able to make the switch. Not many people manage to transform ideas into results.

I think there are four areas on which you need to focus, in order to go from mediocre to great. Those areas are:

  1. Theoretical foundation.
  2. Working knowledge.
  3. Software engineering practices.
  4. Soft skills.

    Now, these areas don't include things like marketing yourself or building valuable relationships with coworkers or your local programming community. I see those as being separate from being great at what you do. However, they're at least as influential in creating a successful and long-lasting career.

    Let's take a look at what you can do to improve yourself in those four areas. I'll also suggest some resources.


    1. Theoretical foundation

    Foundational computer science. Most developers without a formal degree have some knowledge gaps here. I suggest taking a MOOC to remediate this. After that, you could potentially take a look at improving your data structures and algorithms knowledge.

  • CS50: Introduction to Computer Science
  • Grokking Algorithms
  • Algorithms by Sedgewick


    2. Working knowledge.

    I'd suggest doing a JavaScript deep-dive before focusing on your stack. I prefer screencasts and video courses for this, but there are also plenty of books available. After that, focus on the specific frameworks that you're using. While you're doing front-end work, I also suggest you explore the back-end.


  • FunFunFunction on Youtube
  • You Don't Know JS
  • JavaScript Allonge
  • JavaScript Design Patterns

    3. Software engineering practices.

    Design patterns and development methodologies. Read up about testing, agile, XP, and other things about how good software is developed. You could do this by reading the 'Big Books' in software, like Code Complete 2 or the Pragmatic Programmer, in your downtime. Or, if you can't be bothered, just read different blog posts/Wikipedia articles.


    4. Soft skills.

  1. Actively seek to mentor and teach others (perhaps an intern at work, or someone at a local tech community, or create blog posts or videos online).
  2. Get mentorship and learn from others. Could be at work, or open source.
  3. Go to programming meetups.
  4. Try public speaking; go to a Toastmasters meetup.
  5. Learn more about and practice effective communication.
  6. Learn more about business and the domain that you're working in at your company.
  7. Read Soft Skills or Passionate Programmer for more tips.


    Some closing notes:

    - For your 'how to get started with open source' question, see FirstTimersOnly.

    - If you can't be bothered to read or do large online courses, or just want a structured path to follow, subscribe to FrontendMasters and go through their 'Learning Paths'.

    - 4, combined with building relationships and marketing yourself, is what will truly differentiate you from a lot of other programmers.


    Sorry for the long post, and good luck! :)
u/WineEh · 3 pointsr/WGU_CompSci

Since the course is in Python, you could also take a look at Grokking Algorithms. I think this is a great book that people overlook. It's easy to understand and a (fairly) quick read. https://www.amazon.com/Grokking-Algorithms-illustrated-programmers-curious/dp/1617292230

Common Sense is also a great book. If you really want to brush up on DS&A you could check out Tim Roughgarden's Coursera courses and the related books.

I will point out, though, that at least in my experience you can access the course materials even for courses you transferred in, so once you get access to your courses, or even when you start DS&A at WGU, you can always refer back if you're struggling.

u/k4z · 3 pointsr/UCONN

probably is. don't have it handy.

since you're learning data structures and algorithms in python, any general data structures and algorithm course should work; just implement them in python.

it's hard to suggest [a good resource] off the top of the head that isn't a mere udemy shill or incredibly dense like stanford's algo course. grokking algorithms was okay, while people might suggest introduction to algorithms (but there's a reason why it's 1k pages and pure madness to "refresh" your knowledge).

doing projects (crash course, automate) would help to refresh using python.

u/ThinqueTank · 3 pointsr/programming

I've actually been getting some interest lately because I just started to attend meetups last week and let some engineers see what I've done so far.

Really appreciate the algorithms/data structures advice. I picked up this book to get a light overview of it first before I really dive into something more formal:
Grokking Algorithms: An illustrated guide for programmers and other curious people

I also have enough college credits to take a Data Structures course called Discrete Structures for Computer Science and my math up to Linear Algebra completed. Here's the description for the community college course:

> This course is an introduction to the discrete structures used in
Computer Science with an emphasis on their applications. Topics
covered include: Functions; Relations and Sets; Basic Logic; Proof
Techniques; Basics of Counting; Graphs and Trees; and Discrete
Probability. This course is compliant with the standards of the Association
for Computing Machinery (ACM).

Is the above what you're referring to more or less?

Are there any books and/or online courses you'd personally recommend for data structures and algorithms?

u/KlaytonWade · 3 pointsr/java

Grokking Algorithms.
This is the best one yet.

u/schreiberbj · 3 pointsr/compsci

This question goes beyond the scope of a reddit post. Read a book like Code by Charles Petzold, or a textbook like Computer Organization and Design or Introduction to Computing Systems.

In the meantime you can look at things like datapaths which are controlled by microcode.

This question is usually answered over the course of a semester long class called "Computer Architecture" or "Computing Systems" or something like that, so don't expect to understand everything right away.

u/FearMonstro · 3 pointsr/compsci

Nand to Tetris (coursera)

The first half of the book is free. You read a chapter, then you write programs that simulate hardware modules (like memory, an ALU, registers, etc.). It's pretty insightful, giving you a richer understanding of how computers work. You could benefit from just the first half of the book. The second half focuses more on building assemblers, compilers, and then a Java-like programming language. From there, it has you build a small operating system that can run programs like Tetris.

Code: The Hidden Language of Hardware and Software

This book is incredibly well written. It's intended for a casual audience and will guide the reader to understanding how a microcontroller works, from the ground up. It's not a textbook, which makes it even more impressive.

Computer Networking Top Down Approach

One of the best-written textbooks I've read. Very clear and concise language. This will give you a pretty good understanding of modern-day networking. I appreciated that the book is filled to the brim with references to other books and academic papers for a more detailed look at subtopics.

Operating System Design

A great OS book. It actually shows you the C code used to design and code the Xinu operating system. It's written by a Purdue professor. It offers a top-down look but backs everything up with C code, which really solidifies understanding. The Xinu source code can be run on emulators or real hardware for you to tweak (and the book encourages that!)

Digital Design Computer Architecture

Another good "build a computer from the ground up" book. The strength of this book is that it gives you more background on how real-life circuits are built (it uses VHDL and Verilog), and provides a nice chapter on transistor design. A lot less casual than the Code book, but easily digestible for someone who appreciates this stuff. It culminates in designing and describing a microarchitecture that implements a MIPS microcontroller. The diagrams used in this book are really nice.

u/YuleTideCamel · 3 pointsr/learnprogramming

Sure I really enjoy these podcasts.

u/nonenext · 3 pointsr/explainlikeimfive

If you want to know how a computer is made, this amazing book explains everything clearly from scratch, in an order where each chapter prepares you to understand the next, all the way to the end.

It starts from scratch with Morse code, then a simple electric circuit with a battery and flashlight, then telegraphy and relays with more advanced circuits, then how numbers are understood in a logical sense, then binary digits (0 and 1), then how much you can do with just binary digits and how barcodes work, then logic and switches in Boolean algebra with more advanced circuits, then logic gates, more advanced circuitry, then bytes and hex, how memory functions, automation... and that's only halfway through the book.

His writing is very clear and understandable, and everything he writes has a purpose: it prepares you to understand what comes later in the book.

After you finish this book, you'll know EVERYTHING about electricity and what happens behind the scenes: how a computer works, how RAM works, how a hard drive works, how a CPU works, how a GPU works, everything.

u/TheAdventMaster · 3 pointsr/learnprogramming

Something like Code: The Hidden Language of Computer Hardware and Software may be up your alley.

So may be From NAND 2 Tetris, a course where you build a computer (hardware architecture, assembler, OS, C-like compiler, and programs to run on the OS / written in the compiler) starting with just NAND.

At the end of the day though, the way things work is like this: Protocols and specifications.

Everything follows the same published IPO (input, processing, output) standards. Stuff is connected to and registers expected values on expected peripherals. The CPU, motherboard, graphics card, wireless modem, etc. all connect in the right, mostly pre-ordained places on the hardware.

In this vein, there are firmware-level APIs for communicating with all of these at the BIOS level. Although, as far as I'm aware, "actual" "BIOS" is no longer used; UEFI is used instead: https://en.wikipedia.org/wiki/Unified_Extensible_Firmware_Interface

This is what firmware is / is built on top of. Operating systems build on top of these. System calls. Operating systems communicate under the hood and expose some number of system calls that perform low-level actions like talking to devices for things like file access or network I/O. A lot of this stuff is asynchronous / non-blocking, so the OS or system will then have to respond to an interrupt or continuously check a register or some other means of getting a response from the device to see if an operation completed and what its result was.

Loading the OS is one thing the BIOS is responsible for. This is through the bootstrapping process. The OSs are located at very specific locations on the partitions. In the past, the only command you had enough room for within BIOS / pre-operating system execution was to load your OS, and then the OS's startup scripts had to do everything else from there.

Once you have an operating system, you can ask the OS to make system calls and invoke low-level API requests to get information about your computer and computer system, such as the file system, networks, connected drives and partitions, etc. These calls are usually exposed via OS-specific APIs (think the win32 API) as well as through a command-line interface the OS provides.
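To see that layering from userland, here's a small Python sketch of my own (POSIX-only; Python's os module is a thin wrapper over these system calls, and the filename is just a placeholder):

```python
import os

# Each call below maps closely onto a kernel system call.
fd = os.open("example.txt", os.O_WRONLY | os.O_CREAT, 0o644)  # open(2)
os.write(fd, b"hello from userland\n")                        # write(2)
os.close(fd)                                                  # close(2)

print(os.getpid())   # getpid(2): the OS's handle for this process
print(os.uname())    # uname(2): kernel name, release, architecture
```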

New devices and I/O from/to those devices communicate through firmware, and interrupts, and low-level system calls that are able to communicate with these firmware APIs and respond to them.

Just about anything you can think of - graphics, audio, networking, file systems, other i/o - have published standards and specifications. Some are OS-specific (X windowing system for Linux, DirectX win32 API or GDI on Windows, Quartz on Mac, etc.). Others are vendor-specific but don't seem to be going anywhere (OpenGL, then nVidia vs AMD driver support which varies across operating systems, etc.).

The biggest hardware vendors and specification stakeholders will work with the biggest operating system vendors on their APIs and specifications. It's usually up to device manufacturers to provide OS-compatible drivers along with their devices.

Drivers are again just another specification. Linux has one driver specification. Windows has another. Drivers are a way that the OS allows devices and users to communicate, with the OS as a middle-manager of sorts. Drivers are also often proprietary, allowing device manufacturers to protect their intellectual property while providing free access to use their devices on the OS of your choice.

I'm not an expert in how it all works under the hood, but I found comfort in knowing it's all the same IPO and protocol specifications as the rest of computing. No real hidden surprises, although a lot of deep knowledge and learning sometimes required.

When we get to actually executing programs, the OS doesn't have too much to work with, just the hardware... So the responsibility of slicing up program execution into processes and threads is up to the OS. How that's done depends on the OS, but pretty much every OS supports the concept in some sense.

As far as how programs are multitasked, both operating systems and CPUs are pretty smart. Instructions get sent to the chips, batched and divided by them, and the computational results placed into registers and RAM. Again, something I'm not a huge expert in, and it honestly surprised me to find out that the OS is responsible for threading etc. I for some reason always thought this was done at the chip level.

When you include libraries (especially system / OS / driver libraries) in your code, you're including copies of or references to OS native functions and definitions to help you reference these underlying OS or system calls to do all the cool things you want to do, like display graphics on the screen, or play audio. This is all possible because of the relationship between OS's and device manufacturers and the common standards between them, as well as the known and standard architectures of programs designed for OS's and programs themselves.

Inter-program compatibility is where many things start to become high level, such as serialization standards like JSON or XML, but not always. There are some low-level things to care about for some programs, such as big- vs little-endian. Or the structure of ASM-level function calls.

And then you have things like bytecode, which programs in languages like Java or JavaScript compile to: a system-independent representation of code that most often uses a simple heap or stack to describe things that might instead be register access or a low-level heap or stack if it had been written in ASM. Again, just more standards, and programs are written according to specifications and know how to interface with these.

The modularity of programming thanks to this IPO model and the fact that everything follows some standards / protocols was a real eye opener for me and made me feel like I understood a lot more about systems. What also helped was not only learning how to follow instructions when setting up things on my computer or in my programs, but learning how to verify that those instructions worked. This included a lot of 'ls' on the command-line and inspecting things in my debugger to ensure my program executed how I expected. These days, some might suggest instead using unit tests or integration tests to do the same.

u/mivfx · 3 pointsr/programming

Yes. The best book I've read that explains the "computer" from the "ground up" is Charles Petzold's Code. Even my literature-graduate girlfriend understood it.

u/dwitman · 3 pointsr/learnprogramming

> C.O.D.E

This book?

u/frostmatthew · 3 pointsr/WGU

tl;dr version:

  1. yes
  2. no, but that will be the case at any school

    Quick background to validate the above/below: I was a 30y/o banquet manager when I decided to change careers. I had no prior experience [unless you want to count a single programming class I took in high school] but did get a job in tech support at a medium size startup while I was in school and wrote a couple apps for our department. Just before I graduated I started working at a primarily Google & Mozilla funded non-profit as their sole software engineer. I moved on after a little over two years and am now a software engineer at VMware.

  3. The degree is a huge boost in getting past HR and/or having [good] recruiters work with you. You'll also learn the skills/knowledge necessary to get hired as a developer, which is obviously the more important part - for the most part this is all stuff you can learn on your own, but you'll greatly reduce the number of places that will even give you a phone screen if you don't have a degree [I'm not saying this is how it should be, but this is how it is].

  4. I typed out a lot before remembering New Relic had a great blog post a few months ago about all the stuff you don't learn in school [about software development], ha. So I would highly recommend you not only read it but also try to learn a little on your own (especially regarding SQL and version control) http://blog.newrelic.com/2014/06/03/10-secrets-learned-software-engineering-degree-probably-didnt/ Being a good developer (or good anything) takes time/experience - but knowing what they don't cover in school (and trying to learn it on your own) will help.

    Two books I'd suggest reading are The Pragmatic Programmer and Code: The Hidden Language of Computer Hardware and Software. Pragmatic Programmer is one of those classics that every good dev has read (and follows!). Code is great at giving you some insight into what's actually happening at a lower level - though it gets a bit repetitive/boring about halfway through so don't feel bad about putting it down once you reach that point.

    The best thing you can do to help you land a job is have some open-source side-projects (ideally on GitHub). Doesn't have to be anything major or unique - but it will help a lot for potential employers to see what your code looks like.

u/audionautics · 3 pointsr/videos

For the latter half, "the CPU executes instructions", there's a fantastic book called Code: the hidden language of computers, that, through a series of scenarios, takes you all the way from talking with your friend through a string and two tin cans, to flash lights, to morse code, to logic gates, transistors, and finally encoding information in bits and executing it on a CPU.

It's a super fun read.

u/nekochanwork · 3 pointsr/learnprogramming

> I assumed calculus would somehow be the building block of where all computer systems are based.

I'm afraid I don't know what expression "building block of where all computer systems are based" means, but if it helps at all, Petzold's book The Hidden Language of Computer Hardware and Software explains how computers work from the ground up.

If you were to ask me, I would say the "building blocks" of computers are found in Boolean algebra. Boolean algebra defines simple logic gates, which in turn can be realized physically through circuits and relays. You can combine gates to form simple adders and multipliers; you can feed the output of a relay back into itself as an input to create a flip-flop, which can be used to store state; you can set up a relay to disconnect from the circuit as soon as it receives power, and reconnect when there is no power, resulting in a simple oscillator; etc.
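For instance, here's a rough Python sketch of mine (not from the book) of the first rung on that ladder, the half adder: the sum bit is XOR and the carry bit is AND, so binary addition really is just gates.

```python
def half_adder(a, b):
    # Sum bit is XOR, carry bit is AND.
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    # Two half adders plus an OR make a full adder; chain N of
    # these together and you can add N-bit numbers.
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

print(full_adder(1, 1, 0))  # (0, 1): 1 + 1 = binary 10
print(full_adder(1, 1, 1))  # (1, 1): 1 + 1 + 1 = binary 11
```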

I'm positive a lot of smart people used calculus to shrink circuits and relays down to solid-state transistors, which in turn implement a von Neumann machine architecture that all modern software depends on. But at the root of it all, the movement of information through your computer is modeled by naive propositional logic and the rules of Boolean algebra.

u/SevenGlass · 3 pointsr/learnprogramming

Petzold's Code is the book you are looking for.

u/jimschubert · 3 pointsr/csharp

I recommend starting by teaching some version control basics. Focus on git and quickly cover others like TFS and subversion. You can read Pro Git for free.

If you teach a hardware/software course, CODE is an excellent book.

I also recommend C# in Depth. I would also think it'd be cool to offer points for contributing to StackOverflow or the new open source .NET projects on GitHub.

If you teach design/analysis or other classes focused on architecture, Adaptive Code via C# is pretty good. I'm only a few chapters in, but it discusses Scrum methodology, layering and tiers, as well as how to follow practices for writing great code.

I would also suggest a course on JavaScript. I have had to train too many Junior and Senior developers on how to write JavaScript. It's scary that many web developers don't even understand fundamentals.

u/Triapod · 3 pointsr/askscience

Consider implementing an ALU which does various things like add, subtract, and bit shifts. Say you have inputs A and B from your memories and your program instructs the computer to add them. The instruction for "add" is also sent to the ALU. So how does the ALU "change" its function to add this time and subtract the next? Look at how a multiplexer works. For a simple implementation, your ALU can compute both A+B and A-B (in parallel, using separate logic gates) and then, at the end, select which to output based on the instruction. You can also try to imagine how a multiplexer can be used to implement various Boolean operations by thinking about the truth tables :).
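In software terms, the trick looks something like this rough Python sketch of mine (the op names are arbitrary assumptions): every result is computed, and the instruction merely selects one, just like a multiplexer's select lines.

```python
def alu(a, b, op):
    # Compute every candidate result "in parallel", as the
    # separate banks of logic gates would...
    results = {
        "add": a + b,
        "sub": a - b,
        "shl": a << 1,
        "shr": a >> 1,
    }
    # ...then the opcode acts as the multiplexer's select input.
    return results[op]

print(alu(6, 3, "add"))  # 9
print(alu(6, 3, "sub"))  # 3
print(alu(6, 3, "shl"))  # 12
```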

So, if we can build the kind of ALU logic above, the key becomes addressable memory (note that addresses are themselves numbers, which can be manipulated using the ALU). When you compile and run a program, it is loaded into memory. The instructions and data are read and then fed into logic like the above.

If you are interested and have the time, the book Code presents the material quite well and accessibly.

u/katyne · 3 pointsr/learnprogramming

First of all, you will never understand everything on the level that you think you want to understand it. That just doesn't happen. Even with an advanced degree, even after working in the industry for 10+ years, you'll end up specializing in this or that and having only some idea about the other things. There's just too much stuff to learn. Those black boxes people talk about - they're called "abstractions", a way to simplify complex details in order to understand a concept. And in the beginning all you'll be doing is trying to understand concepts. When you were learning to drive a car you didn't need to know every last mechanical detail of its engine, right? You just had an abstract idea of it; you knew you had to put in fuel and turn on the ignition and why you had to shift gears and stuff, and that was enough. Same here. First learn to hold the wheel and steer, then choose the field you'd like to specialize in. But it will take you a lifetime to learn everything about everything - and that's if all you'll be doing is learning, not making anything of your own.

If you're like me and still need some introduction to the "why" before you start learning the "how", I would recommend this book - it's very approachable and sort of encompasses the general topics without going into much detail. Other than that it's hard to say anything, because we don't know what your goal is. Do you want to work as a web dev, a system programmer, or make games? Write for mobile maybe? Front end, back end, databases? It's like saying "I want to learn how to sport" - well, what kind of sport? They're all very different directions of virtually unlimited depth.

u/chunyukuo · 3 pointsr/TranslationStudies

First of all, congrats on the promotion and the learning spirit. I wish more managers had your attitude.

I had a similar situation where I went from in-house linguist to loc manager, and I wonder if my experiences might be of use to you. Like you, I definitely did not describe myself as "into programming." I'm still not into that sort of thing. But learning as much of it as I could had a direct benefit to a lot of my daily tasks, and I would recommend at least giving the more learner-friendly tutorial sites a try.

I finished a lot of modules on codecademy.com and genuinely enjoyed them because they were not particularly difficult and also allowed me to automate a lot of things and gain a deeper understanding of how things work. I went through Learn Python the Hard Way and gained a lot from that, especially since subsequent projects included quite a lot of assets in Python. I went so far as to plow through the first half of Code: The Hidden Language of Computer Hardware and Software (the latter half was too arcane for me) and found that quite useful as well, although in hindsight it was a bit overkill.

Even after my department was given an actual programmer to code up solutions for us, I at least was able to understand how a good amount of it worked. Coding aside, a localization manager is the person that the linguists and testers go to when things break, and man do they break a lot. That said, I would also recommend training of some sort in SDL and Kilgray's products if you use them. In my experience as manager, both broke often or were fussy at best.

A few years later, I haven't really read much about code, but I still try to ask developers as many questions as I can about the technical aspects of their products and find it really helpful to follow up on Stack Overflow or just Wikipedia.

Good luck with your new position!

u/rekav0k · 3 pointsr/blackhat
u/herky_the_jet · 3 pointsr/math

You might enjoy the book "Code" by Charles Petzold (http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319), in combination with the "nand 2 tetris" course (http://www.nand2tetris.org/), whose companion book is published by MIT Press

u/wgren · 3 pointsr/dcpu_16_programming

Code: The Hidden Language of Computer Hardware and Software, The Elements of Computing Systems, and Inside the Machine were recommended on Hacker News.

I have the last one, I will re-read it over Easter holidays...

u/Psylock524 · 3 pointsr/compsci
u/ummonommu · 3 pointsr/technology

Code: The Hidden Language of Computer Hardware and Software

A good read if you want history and understanding binary code, among others. Not exactly about politics, but more on the easy-to-read technical side.

u/Lesabotsy · 3 pointsr/learnprogramming
u/anachronic · 3 pointsr/AskNetsec

> I have zero Linux experience. How should I correct this deficiency?

First, install a VM (Oracle VirtualBox is free), download a Linux ISO, and boot from it. Debian and Ubuntu are two of my favorites. Both are totally free (as are most Linux distros). Once installed, start reading some beginner Linux tutorials online (or get "Linux in a Nutshell" by O'Reilly).


Just fuck around with it... if you screw something up, blow it away and reinstall (or restore from a previous image)

> Is it necessary? Should I start trying to make Linux my primary OS instead of using windows, or should that come later?

It's not necessary, but will help you learn faster. A lot of security infrastructure runs on Linux and UNIX flavors. It's important to have at least a basic understanding of how a Linux POSIX system works.

> If you can, what are some good books to try to find used or on PDF to learn about cissp and cisa? Should I be going after both? Which should I seek first?

You don't need to worry about taking & passing them until you've been working in the field for at least 3-5 years, but if you can get some used review materials second-hand, it'll give you a rough idea what's out there in the security landscape and what a security professional is expected to know (generally)


CISSP - is more detailed and broader and is good if you're doing security work day-to-day (this is probably what you want)


CISA - is focused on auditing and IT governance and is good if you're an IT Auditor or working in compliance or something (probably not where you're headed)


> What are good books I can use to learn about networking? If you noticed I ask for books a lot its because the only internet I have is when I connect my android to my laptop by pdanet, and service is sketchy at my apartment.

O'Reilly is a reliable publisher of quality tech books. An Amazon search for "O'Reilly networking" pulls up a bunch. Also, their "in a nutshell" series of books are great reference books for Windows, Linux, Networking, etc. You can probably find older/used copies online for a decent price (check eBay and half.com too)

> How would you recommend learning about encryption? I just subscribed to /r/crypto so I can lurk there. Again, can you point me at some books?

Try "The Code Book" for a very accessible intro to crypto from ancient times thru today
http://www.amazon.com/The-Code-Book-Science-Cryptography/dp/0385495323


Also, for basics of computer architecture, read "CODE", which is absolutely excellent and shows how computers work from the ground up in VERY accessible writing.
http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/Packet_Ranger · 3 pointsr/askscience

You'd be well served by reading a book called "CODE - the Hidden Language of Computer Hardware". It starts all the way from the simplest electronic circuits, explains how a powered signal amplifier can be turned into an electronic switch (e.g. telegraph relays, and later, transistors), how those switches can be chained together to form logic gates per /u/Corpsiez, how those logic gates can be chained together to form arithmetic units and memory, and finally how to make a simple 8080 CPU and implement ASCII inputs and outputs.

u/randrews · 3 pointsr/csbooks

Code, by Charles Petzold is pretty much exactly what you want.

u/Mazer_Rac · 3 pointsr/compsci

Code by Charles Petzold. It starts with Morse code and works up to a fully functional computer processor. All while written in a prose style. Very nontechnical and a great read.

http://www.amazon.com/gp/aw/d/0735611319?pc_redir=1410687129&robot_redir=1

u/Xavierxf · 3 pointsr/explainlikeimfive

Code is what helped me wrap my head around this.

You might have to read it a couple times to understand it, but it's really good.

u/luciano-rg · 3 pointsr/IWantToLearn

For an introduction to how computers work, the book "Code" by Charles Petzold is very informative. It goes from rudimentary circuits of blinking lights all the way to the complexity of modern computers. I found this book closes the gap between the concepts of software and hardware.
Amazon link: https://amzn.com/0735611319

u/Augur137 · 3 pointsr/compsci

Feynman gave a few lectures about computation. He talked about things like reversible computation and thermodynamics, quantum computing (before it was a thing), and information theory. They were pretty interesting. https://www.amazon.com/Feynman-Lectures-Computation-Frontiers-Physics/dp/0738202967

u/factorysettings · 2 pointsr/pics

I'm a self-taught programmer, so I don't know what CS degrees entail, but I highly recommend the book Code and also another one called The Elements of Computing Systems.

The former pretty much teaches you how a computer physically works and the latter teaches you how to build a processor and then write an OS for it. After reading those two books you pretty much know how computers work at every level of abstraction. I think that's the way programming should be taught.

u/autisticpig · 2 pointsr/Python

Welcome! Have you read Code: The Hidden Language?

u/thegunn · 2 pointsr/learnprogramming

This is only 8 pages. If you're wanting to do some more reading I can't suggest this book enough. It keeps everything fun and accessible while letting you know what's going on and why.

u/Harkonnen · 2 pointsr/programming

This book is awesome. I learnt a lot from it.

u/ljcoleslaw · 2 pointsr/learnprogramming

There is a great book called Code: The Hidden Language of Computer Hardware and Software which I give to people who are just starting out and are struggling with the bigger picture on computers.

It works from the bottom up without assuming you know anything. It's not a textbook on computer architecture, but it should satisfy your need for a high-level picture of what's going on and will be much easier to read and enjoy than looking up definitions people give you and piecing it together yourself.

u/Serpilliere-a-biere · 2 pointsr/france

I'm reading this right now: https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

So far I love it. It aims for a deep understanding of how a computer works, starting from nothing (it explains what electricity is and how it works at the start). It's concise and very informative.

u/ryantriangles · 2 pointsr/webdev

> Recently released books? Udemy courses? Free stuff online like W3Schools or CodeAcademy?

Mozilla Developer Network has fantastic documentation for JS, HTML, CSS, and the browser APIs, and a section intended to guide you through them for the first time in a comfortable order.

If JS is your first language, I'd recommend checking out the book "Code" by Charles Petzold, a great and relatively short book that answers the fundamental beginner questions like "So what's a CPU actually doing in there?", "How does source code make stuff happen?" and "How can music be 1s and 0s?"

There's a great series of books, available for free online, called "You Don't Know JS", that teach you how the language operates. It might be too involved if it's the first thing you read, but definitely at least start it and bookmark the later volumes to come back to.

Marijn Haverbeke's book "Eloquent JavaScript" is available to read freely online. I'd really recommend that one, too. And when you want to stop reading theory and start working on actual projects, grab "JavaScript Cookbook" by Shelley Powers.

> Did I choose right languages for what I wish to create? Maybe I should also use something else, like Bootstrap, Node.js or Typescript?

Don't worry about those for now. Node will be useful if you decide you need a server component to your game. Bootstrap is nice, and it's helpful, but if you're still trying to learn HTML and CSS yourself, it will only get in the way and obscure things. TypeScript is something you'd look at only once you're comfortable and confident with regular old JS. Stick to plain old HTML/CSS/JS of your own to start.

u/Senipah · 2 pointsr/learnprogramming

This is a very broad question which requires a somewhat nuanced answer. If you are genuinely interested I highly recommend this book:

https://www.amazon.co.uk/Code-Language-Computer-Hardware-Software/dp/0735611319

It’s very accessible and does not require any technical knowledge.

That being said, perhaps the single most important concept to understand with regard to how new languages are made nowadays is the concept of bootstrapping.

https://en.m.wikipedia.org/wiki/Bootstrapping_(compilers)

u/Kerbobotat · 2 pointsr/videos

I highly, highly recommend [CODE - The hidden language of Hardware and Software](https://www.amazon.com/dp/0735611319/) (amazon link, non-referral)

This book explains how computers work from the fundamentals: communicating with flashlights, Morse code, tin cans on string, all the way up to modern computer hardware. It's really a great read.

I hope this helps a little bit on the way to understanding stuff. If you have any others you'd like to share as well, let me know!

u/forzrin · 2 pointsr/technology

It was a few years ago, but lots of entire series of lectures are online. I got through most of the "core" CS concepts in ~2 years of intense academic learning alongside actually writing code (and releasing some stuff to small groups or to app stores). I picked straightforward, small projects with a specific academic challenge, like a game with a simple concept that needs pathfinding algorithms (implementing them myself instead of using a framework; see the sketch at the end of this comment).

e.g. Data Structures (YouTube)

You can also find series of lectures like the above on algorithms. Then do basic research on what the most common/industry-standard textbooks for these topics are, like Introduction to Algorithms (Amazon link), and buy them or download PDFs or whatever.

The important thing is to actually do the work, suggested tasks/projects, etc. Personal accountability is the driver, here.

Then there are one-off books like Code: Hidden Language (Amazon link) that explore specific topics or walk you through certain ideas and concepts at an introductory level. If you find the topic interesting or it is important for your work, it's a good starting point for learning about the lowest-level stuff.
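(As promised above, here's a hypothetical example of the kind of pathfinding exercise I mean, written from scratch rather than pulled from a framework: a breadth-first search over a small grid.)

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest path on a grid where '#' marks walls."""
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:  # walk the parent links back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] != "#" and nxt not in came_from):
                came_from[nxt] = cell
                queue.append(nxt)
    return None

grid = ["....",
        ".##.",
        "...."]
print(bfs_path(grid, (0, 0), (2, 3)))
# [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (2, 3)]
```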

u/bwhauf · 2 pointsr/cscareerquestions

If you want to understand how a computer works from the ground up, this book is pretty great: https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/pixel_sharmana · 2 pointsr/compsci

Code: The Hidden Language of Computer Hardware and Software
by Charles Petzold

The book starts with very simple binary logic, then shows how to create your own relay out of bent wire and batteries, builds logic gates out of those switches, creates your own assembly language, then implements BASIC on top of that, and finally covers screens and pixels.

u/fossuser · 2 pointsr/compsci

If you're looking for a detailed, but approachable overview I'd highly recommend picking up this book: https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

I did CS and this book does a great job explaining how it all works from the ground up, as well as how it was figured out. Putting everything in a historical context makes things a lot easier to understand and remember. It doesn't suffer from the usual high-level explanations that are useless for understanding how something really works.

It's also just a fun read.

u/Lemminsky · 2 pointsr/compsci
u/lethargilistic · 2 pointsr/IWantToLearn

I am not great with hardware, so I can't help with actually building a computer.

Nonetheless, a book that's been really helpful to me and a lot of other people is Charles Petzold's Code: The Hidden Language of Computer Hardware and Software. Basically, it's about machine organization. It walks you through how computers work by building a theoretical computer with you, starting from how to encode information and building to logic gates, relays, and circuitry.

It's also hilarious, written for the layman, and informative enough for serious students. It's incredible. Absolute, unreserved recommendation. Petzold is a master. I recommend the (published later) paperback version because its preface gives the book great context.

u/Tibio · 2 pointsr/learnprogramming

I found this book to be a great overview from the ground up. I have the physical copy, but if you want to get it for free, you know how...

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

It's a fairly easy read except for the middle 40% about logic gates. Prepare to read through that part multiple times if you want to really understand it. But as you said, that would be unnecessary; it's just for fun. I read it for the same purpose you did: wanting some foundation.

u/anossov · 2 pointsr/learnprogramming

This very clearly explains everything from relays to a whole computer:
https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/

u/michael0x2a · 2 pointsr/learnprogramming

I've seen a lot of people recommending Code: The Hidden Language of Computer Hardware and Software

It is about 15 years old though, so it might seem a little out-of-date in some places/might appear to omit some modern developments in technology and computer science, but the existing content should still be pretty solid.

That being said, I do agree that the best thing to do is to just jump straight in. The best way to gain a good mindset for programming is to just start programming. You'll run head-first into obstacles and bugs, and figuring out how to fix those bugs/avoid those bugs is pretty much how you acquire that sort of mindset.

u/Scoutdrago3 · 2 pointsr/pcmasterrace

Here are some books I have bought in the past on programming. You can just download a PDF/e-reader file if you don't want to spend the money, but I would recommend supporting the authors.

Programming:

Python Game Development

Pro Python

Java For Dummies

How to Program Java

Networking:

Networking All-In-One

Networking: A Beginner's Guide

General:

Computer Repair with Diagnostic Flowcharts

Code: The Hidden Language of Computer Hardware and Software

u/mcflufferbits · 2 pointsr/hardware

I'm looking to buy this one if I can't find any decent websites

https://www.amazon.ca/Code-Language-Computer-Hardware-Software/dp/0735611319

u/Truth_Be_Told · 2 pointsr/explainlikeimfive

You absolutely need to read this book (used copies are just a couple of bucks):

Code: The Hidden Language of Computer Hardware and Software

The book is very, very accessible and brilliantly written. The only thing it doesn't cover is the physics behind the implementation of electronics, but you have probably studied the basics of that in high-school and undergraduate classes. What you are looking for is the logical abstraction behind the application of electronics.

The above book will clarify that like no other book I have read.

u/jaydoors · 2 pointsr/learnpython

Other libraries and functions!

I think I have a similar perspective to you, of having loose ends in my mind until I know what's underneath all of it. It helped me a lot to learn about what computers are, underneath. It's 'just' simple electronics - circuits which can be powered on or off, and which can affect other circuits. I highly recommend this book for a good explanation right from the bottom, all the way up.

In terms of making windows - when you get down to it: the screen is a bunch of pixels, each of which is represented in the computer's memory, and which the screen hardware effectively reads in order to generate an image (e.g. first pixel colour white, second colour blue, etc). Some program on your machine will have arranged for this memory to hold information that gives the right colour pixels in the right place to give the window you expect.
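
(A minimal Python sketch of that "screen is just memory" idea; the names and sizes here are made up purely for illustration:)

    # A framebuffer is just an array of pixel values that the display
    # hardware scans out. "Drawing a window" means writing a colour
    # into a rectangle of that array.
    WIDTH, HEIGHT = 16, 8
    framebuffer = [["." for _ in range(WIDTH)] for _ in range(HEIGHT)]

    def fill_rect(x, y, w, h, colour):
        for row in range(y, y + h):
            for col in range(x, x + w):
                framebuffer[row][col] = colour

    fill_rect(2, 1, 10, 5, "#")   # a 10x5 "window" at position (2, 1)
    for row in framebuffer:
        print("".join(row))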

u/bdol · 2 pointsr/ECE

This is an amazing book that describes how computers work from the ground up. Petzold basically starts with switches and relays and moves all the way up to processors and displays. At the end of that book, you'll have the same general knowledge as a second year EE/CE.

u/Tyaedalis · 2 pointsr/learnprogramming

I just began reading CODE and it talks about the lowest level of computing mechanisms. This could be something of interest, although it won't teach you how to program specifically.

For that, I propose to you -- as others have -- Learn to Code the Hard Way. I would recommend the Python version, but he is working on a C version that is being completed. Another great contribution is How to Think Like a Computer Scientist, another book that focuses on Python.

I guess I could best help if I knew what your goals and intentions are. If you want to learn the basics, you can't go wrong with installing a virtual machine with some simple virtual hardware and coding at the hardware level. You could even go so far as to build a computer from individual components connected in a specific circuit and program the hardware directly. If you want to learn the more modern, abstract methods, I would strongly suggest Python, C#, or Java. There are many good books on each subject.

u/Thunderducky · 2 pointsr/IWantToLearn

If you're interested in looking at how computers work on a fairly deep level, I'd recommend finding a copy of Code: The Hidden Language of Computer Hardware and Software. I thought it seemed very approachable.

u/MoxMono · 2 pointsr/computerscience

If you want a really good basic grasp of the concepts, CODE by Charles Petzold is a great read.

http://www.amazon.co.uk/Code-Language-Computer-Hardware/dp/0735611319

u/geoff- · 2 pointsr/answers

This book explained it in a way that finally made sense to me, and got me started in IT.

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

He begins with two neighborhood kids shining flashlights at each other's windows to communicate, then advances through Morse code and Braille to the telegraph system, and finally teaches you about a basic adder (a foundational building block of a CPU)

This question isn't easily answerable in just a few paragraphs. This book, while obviously not a quick little pamphlet, is not a long read by any means and I blasted through it and felt like I finally "got it."
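
(For anyone curious what that basic adder looks like in practice, here's a hedged little Python sketch, not taken from the book: a full adder built from AND/OR/XOR, chained into a 4-bit ripple-carry adder:)

    # One full adder: adds two bits plus a carry, like a single column
    # of pencil-and-paper binary addition.
    def full_adder(a, b, carry_in):
        total = a ^ b ^ carry_in
        carry_out = (a & b) | (carry_in & (a ^ b))
        return total, carry_out

    # Chain four of them, least significant bit first.
    def add_4bit(x, y):
        carry, result = 0, 0
        for i in range(4):
            bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= bit << i
        return result, carry   # carry is the overflow bit

    print(add_4bit(0b0101, 0b0011))   # (8, 0): 5 + 3 = 8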

u/JMV290 · 2 pointsr/promos

I dislike asking for things but uh:

Code: The Hidden Language, something by Philip K. Dick, or some other book that I'm blanking on idklol.

u/RoamingChromeLoam · 2 pointsr/networking

"Code" starts by explaining binary and ends with a tour of the undersea cable network. It's a fascinating 10000-foot look at the concepts and tech underlying the Internet

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/sky111 · 2 pointsr/AskTechnology

There's a book about that:
http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

It has very good explanations about everything computer related.

u/ho11ywood · 2 pointsr/hacking

My one-liner answer:
It was a slow process of steady improvement that started with physical logic gates and/or commands directly to your hardware.

My longer answer:
Go read this book....
https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/

u/SarasaNews · 2 pointsr/askscience

If you're really interested in this you might want to check out the book "Code"; it leads you from the discovery of electricity up to how computer programming languages work, step by step, including the stuff that's being talked about on this thread.

u/j_s_lebach · 2 pointsr/lectures

For a lighter resource, I would recommend Code: The Hidden Language of Computer Hardware and Software.

u/le848dave · 2 pointsr/compsci

This book does a great job of explaining the innards of how a CPU works from the ground up. It is a fantastic read.

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1422718405&sr=8-1&keywords=Code

u/abudabu · 2 pointsr/programming

Lots of good suggestions here. Let me add What the Dormouse Said.

And Petzold's "Code":

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319
u/Serious_Callers_Only · 2 pointsr/learnprogramming

Read this book. It basically goes from assuming no previous knowledge of computers whatsoever to how to build your own computer from scratch, and does it in such a way that my parents could understand.

After that, your goal is to work your way up to Structure and Interpretation of Computer Programs

u/IrishTheHobbit · 2 pointsr/explainlikeimfive

If you are truly interested in how the computer performs these functions, this is a GREAT book. I found it easy to understand, and I think it will answer the question you have.

u/SteelNeckBeard · 2 pointsr/programming

I feel like Structured Computer Organization could be replaced with this book, Code: The Hidden Language of Computer Hardware and Software:

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/shhh-quiet · 2 pointsr/learnprogramming

Read the FAQ.

Also, I've heard good things about this book, Code: The Hidden Language of Computer Hardware and Software.

Seems like a solid 400 pages' worth of framing your entire experience with programming and computers in a useful context.

It's very easy to go down useless rabbit holes while you're busy learning-by-doing. Sometimes it's wise to sit down with an effective book and try to understand more deeply from the ground up.

Same with Python. The FAQ I believe contains some good book recommendations.

u/rocketraider · 2 pointsr/learnprogramming

CODE.

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1369240171&sr=8-1&keywords=CODE

It's the first book that is not dry like a textbook, but gives a thorough understanding of computer architecture. My professor in my Computer Architecture and Assembly class gave the recommendation. I read it in 5 days following the Spring semester.

u/DuoThree · 2 pointsr/learnprogramming

'Code: The Hidden Language of Computer Hardware and Software' by Charles Petzold

http://www.amazon.com/gp/aw/d/0735611319/ref=mp_s_a_1_1?qid=1414026845&sr=8-1&pi=SY200_QL40

u/Mirrory · 2 pointsr/synthesizers

Yes, books! Stacks of them. This is one of the best, IMO

That book will teach you everything about how hardware/software works and, more importantly, why it is the way it is. Technically, you know you could use a synth to make a modem squeal and ring up a dial-up server. Remember that old dial-up noise? You were actually listening to the raw handshake. If you play that through a speaker, it's the voltage of the handshake that generates that unique sound. It's probably nothing new to you, but synths are exactly that: machines to control voltage to speakers. They have a bit of logic built into them as well, to handle timing and everything. There's only a handful of transistors in most analog synths I've seen. Much different than a modern computer with billions of transistors.

u/binary_search_tree · 2 pointsr/movies

Actually, I was just reading a great book called Code: The Hidden Language of Computer Hardware and Software, in which the reader builds an imaginary computer composed of nothing but telegraph relays and light bulbs - all tech from the 1800s.

And Morse code is actually very interesting. It's a very clever way to encode data, and quite an efficient means of communication, especially if you happen to live in the 1800s. It's touched upon in the book.
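
(To see why the encoding is clever, here's a tiny illustrative Python sketch; the dictionary covers only a few letters for brevity. Notice that the most frequent letters get the shortest codes, the same idea Huffman coding later formalized:)

    # Common letters get short codes: E is a single dot, T a single dash.
    MORSE = {
        "E": ".", "T": "-", "A": ".-", "N": "-.",
        "S": "...", "O": "---", "D": "-..", "C": "-.-.",
    }

    def encode(text):
        # Separate letters with a space, as an operator's pause would.
        return " ".join(MORSE[ch] for ch in text.upper())

    print(encode("tact"))   # -> "- .- -.-. -"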

So yeah, I suppose I am fascinated by those things.

u/TezlaKoil · 2 pointsr/compsci

I think he may have meant the other Feynman lectures.

u/donald-pinckney · 2 pointsr/types

Type Theory and Formal Proof covers metatheory all the way up to calculus of constructions, but I think it is at a fairly introductory level and tbh not that insightful from a mathematical perspective. However, it does give a pretty clear exposition of the type checking rules, and is easily converted to an implementation.

u/IjonTichy85 · 2 pointsr/compsci

I think before you start you should ask yourself what you want to learn. If you're into programming or want to become a sysadmin you can learn everything you need without taking classes.

If you're interested in the theory of cs, here are a few starting points:

Introduction to Automata Theory, Languages, and Computation

The book you should buy

MIT: Introduction to Algorithms

The book you should buy


Computer Architecture <- The intro alone makes it worth watching!

The book you should buy

Linear Algebra

The book you should buy <- Only scratches the surface but is a good starting point. Also, it's extremely informal for a math book. The MIT channel offers many more courses and is great for autodidactic studying.

Everything I've posted requires no or only minimal previous education.
You should think of this as a starting point. Maybe you'll find lessons or books you'll prefer. That's fine! Make your own choices. If you've understood everything in these lessons, you just need to take a programming class (or just learn it by doing), a class on formal logic and some more advanced math classes, and you will have developed a good understanding of the basics of CS. The materials I've posted roughly cover the first year of studying CS. I wish I could tell you where you can find some more math/logic books, but I'm German and always used German books for math because they usually follow a more formal approach (which isn't necessarily a good thing).
I really recommend learning these thing BEFORE starting to learn the 'useful' parts of CS like sql,xml, design pattern etc.
Another great book that will broaden your understanding is Bertrand Russell's Introduction to Mathematical Philosophy.
If you've understood the theory, the rest will seem 'logical' and you'll know why some things are the way they are. Your working environment will keep changing, and 20 years from now we will be using different tools and different languages, but the theory won't change. If you've once made the effort to understand the basics, it will be a lot easier for you to switch to the next 'big thing' once you're required to do so.

One more thing: PLEASE, don't become one of those people who need to tell everyone how useless a university is and that they know everything they need just because they've been working with python for a year or two. Of course you won't need 95% of the basics unless you're planning on staying in academia and if you've worked instead of studying, you will have a head start, but if someone is proud of NOT having learned something, that always makes me want to leave this planet, you know...

EDIT: almost forgot about this: use Unix, use Unix, and I can't emphasize this enough: USE UNIX! Building your own Linux from scratch is something every computer scientist should have done at least once in his life. It's the only way to really learn how a modern operating system works. Also try to avoid Apple/Microsoft products, since they're usually closed source and don't give you the chance to learn how they work.

u/dfmtr · 2 pointsr/MachineLearning

You can read through a machine learning textbook (Alpaydin's and Bishop's books are solid), and make sure you can follow the derivations. Key concepts in linear algebra and statistics are usually in the appendices, and Wikipedia is pretty good for more basic stuff you might be missing.

u/flebron · 2 pointsr/compsci

I studied from Introduction to Automata Theory, Languages, and Computation, and used their definitions here :)

Certainly one can define FSMs to always have two tapes, and just not use the output one.

u/nhjk · 2 pointsr/AskComputerScience

Thanks, I already have a book on this I was planning on reading: Introduction to Automata Theory, Languages, and Computation. I just started reading CLRS, though; do you think it would be helpful to finish it first, or are the two mostly unrelated?

u/Noobflair · 2 pointsr/learnprogramming

Hey, don't underestimate the theoretical side of computer science. How about answering the age-old question [What are the fundamental capabilities and limitations of computers?](http://www.amazon.in/Introduction-Automata-Theory-Languages-Computation/dp/0321455363), or learning about the [design and workings of computers and software](http://www.goodreads.com/book/show/44882.Code)?

u/treerex · 2 pointsr/csbooks

Introduction to Automata Theory, Languages, and Computation is the standard text. Jeff Ullman has a page for the book, and has taught the subject a couple of times on Coursera.

u/sachal10 · 2 pointsr/learnmath

Since you are a computer science student, you can start with proofs in discrete mathematics. For this you can look at Kenneth Rosen's book; it can help you with a lot of basic concepts and with constructing proofs. It's a good book for those who want to go into algorithms or theoretical CS, or even want to work on pure maths problems. I had this same confusion: I wanted to do maths, but also CS alongside it. After this you can try "The Art of Computer Programming" (this has 4 volumes) by Donald Knuth, but CLRS is a must along with Rosen's if you want to take CS and maths side by side. If you want to explore further you can look at Design of Approximation Algorithms and Randomised Algorithms. These books can help you with concepts of probability, number theory, geometry, linear algebra, etc. But then if you want pure math problems, search for them: go through different journals. SIAM and Combinatorica are really good ones. Search them, pick a problem you like, then find texts relevant to the problem and try to give better solutions.

u/mahalo1984 · 2 pointsr/philosophy

This book will get you started:

Introduction to the Theory of Computation https://www.amazon.com/dp/053494728X/ref=cm_sw_r_other_apa_EuBuxbYS2QXF3

But if your understanding of the foundations of math and logic is not strong, you may wish to begin with Language, Proof, and Logic by Barwise, or a more historical treatment from the book A Profile of Mathematical Logic by Howard DeLong. For something a bit more light-hearted and thought-provoking, read Godel, Escher, Bach: An Eternal Golden Braid.

To connect this material to philosophy of mind, get David Chalmers' introductory textbook.

The scope of your question does not fit nicely into a reddit comment. But if you request, I will go into greater detail.

u/just_doug · 2 pointsr/learnprogramming

I highly recommend Michael Sipser's Introduction to the theory of computation. I found it engaging enough that I actually read it in bed. Had to stop because it kept me from sleeping.

u/tronadams · 2 pointsr/learnprogramming

I don't know about other schools but my CS program required discrete math and automata theory to complete the major. I really enjoyed automata theory but I can imagine it being kind of tough to get into outside of a classroom setting. Having said that, I would highly recommend this book if you're trying to learn some of this stuff on your own.

u/takemetothehospital · 2 pointsr/computers

Code: The Hidden Language of Computer Hardware and Software is a great book that starts from the bottom up, and explains the very basics in an understandable manner. It will give you an easily graspable outline of everything you need to build a basic computer from scratch. You may need to fill in some gaps if you actually want to go ahead with a homebrew computer project, but I find that it's more than enough to scratch the theoretical itch.

u/CSMastermind · 2 pointsr/AskComputerScience

Senior Level Software Engineer Reading List


Read This First


  1. Mastery: The Keys to Success and Long-Term Fulfillment

    Fundamentals


  2. Patterns of Enterprise Application Architecture
  3. Enterprise Integration Patterns: Designing, Building, and Deploying Messaging Solutions
  4. Enterprise Patterns and MDA: Building Better Software with Archetype Patterns and UML
  5. Systemantics: How Systems Work and Especially How They Fail
  6. Rework
  7. Writing Secure Code
  8. Framework Design Guidelines: Conventions, Idioms, and Patterns for Reusable .NET Libraries

    Development Theory


  9. Growing Object-Oriented Software, Guided by Tests
  10. Object-Oriented Analysis and Design with Applications
  11. Introduction to Functional Programming
  12. Design Concepts in Programming Languages
  13. Code Reading: The Open Source Perspective
  14. Modern Operating Systems
  15. Extreme Programming Explained: Embrace Change
  16. The Elements of Computing Systems: Building a Modern Computer from First Principles
  17. Code: The Hidden Language of Computer Hardware and Software

    Philosophy of Programming


  18. Making Software: What Really Works, and Why We Believe It
  19. Beautiful Code: Leading Programmers Explain How They Think
  20. The Elements of Programming Style
  21. A Discipline of Programming
  22. The Practice of Programming
  23. Computer Systems: A Programmer's Perspective
  24. Object Thinking
  25. How to Solve It by Computer
  26. 97 Things Every Programmer Should Know: Collective Wisdom from the Experts

    Mentality


  27. Hackers and Painters: Big Ideas from the Computer Age
  28. The Intentional Stance
  29. Things That Make Us Smart: Defending Human Attributes In The Age Of The Machine
  30. The Back of the Napkin: Solving Problems and Selling Ideas with Pictures
  31. The Timeless Way of Building
  32. The Soul Of A New Machine
  33. WIZARDRY COMPILED
  34. YOUTH
  35. Understanding Comics: The Invisible Art

    Software Engineering Skill Sets


  36. Software Tools
  37. UML Distilled: A Brief Guide to the Standard Object Modeling Language
  38. Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and Iterative Development
  39. Practical Parallel Programming
  40. Past, Present, Parallel: A Survey of Available Parallel Computer Systems
  41. Mastering Regular Expressions
  42. Compilers: Principles, Techniques, and Tools
  43. Computer Graphics: Principles and Practice in C
  44. Michael Abrash's Graphics Programming Black Book
  45. The Art of Deception: Controlling the Human Element of Security
  46. SOA in Practice: The Art of Distributed System Design
  47. Data Mining: Practical Machine Learning Tools and Techniques
  48. Data Crunching: Solve Everyday Problems Using Java, Python, and more.

    Design


  49. The Psychology Of Everyday Things
  50. About Face 3: The Essentials of Interaction Design
  51. Design for Hackers: Reverse Engineering Beauty
  52. The Non-Designer's Design Book

    History


  53. Micro-ISV: From Vision to Reality
  54. Death March
  55. Showstopper! the Breakneck Race to Create Windows NT and the Next Generation at Microsoft
  56. The PayPal Wars: Battles with eBay, the Media, the Mafia, and the Rest of Planet Earth
  57. The Business of Software: What Every Manager, Programmer, and Entrepreneur Must Know to Thrive and Survive in Good Times and Bad
  58. In the Beginning...was the Command Line

    Specialist Skills


  59. The Art of UNIX Programming
  60. Advanced Programming in the UNIX Environment
  61. Programming Windows
  62. Cocoa Programming for Mac OS X
  63. Starting Forth: An Introduction to the Forth Language and Operating System for Beginners and Professionals
  64. lex & yacc
  65. The TCP/IP Guide: A Comprehensive, Illustrated Internet Protocols Reference
  66. C Programming Language
  67. No Bugs!: Delivering Error Free Code in C and C++
  68. Modern C++ Design: Generic Programming and Design Patterns Applied
  69. Agile Principles, Patterns, and Practices in C#
  70. Pragmatic Unit Testing in C# with NUnit

    DevOps Reading List


  71. Time Management for System Administrators: Stop Working Late and Start Working Smart
  72. The Practice of Cloud System Administration: DevOps and SRE Practices for Web Services
  73. The Practice of System and Network Administration: DevOps and other Best Practices for Enterprise IT
  74. Effective DevOps: Building a Culture of Collaboration, Affinity, and Tooling at Scale
  75. DevOps: A Software Architect's Perspective
  76. The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology Organizations
  77. Site Reliability Engineering: How Google Runs Production Systems
  78. Cloud Native Java: Designing Resilient Systems with Spring Boot, Spring Cloud, and Cloud Foundry
  79. Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation
  80. Migrating Large-Scale Services to the Cloud
u/terryducks · 2 pointsr/computerscience

Start with this book CODE

It lays the groundwork to understand how everything works. From numbering systems to digital gates to how a computer works.

If you liked that, great, continue on. If not, CS may not be the right spot for you.

CS is algorithms and problem solving. It's working in teams and communicating. It's writing. It's dealing with complexity and decomposing it to very simple steps that the "idiot computer" can do.

I've spent 20+ years as a code slinger.

u/KingMaple · 2 pointsr/boardgames

If I had to recommend a book that can bridge the gap somewhat, it would be Code. This one: https://www.amazon.co.uk/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1518447991&sr=8-1&keywords=code

The reason I am recommending this is that it MIGHT (unsure, since I am not a mechanical engineer myself) bridge the gap between software and hardware and lead to next steps.

u/Shinigamii · 2 pointsr/mildlyinteresting

That book sounds interesting. Is it this one?

u/peschkaj · 2 pointsr/compsci

Check out Charles Petzold's Code. It starts with some basic ideas and moves through digital communication and then into the wonderful world of computering.

u/vincenz93 · 2 pointsr/learnprogramming

"Code" by Charles Petzold is a great resource.

Code: The Hidden Language of Computer Hardware and Software https://www.amazon.com/dp/0735611319/ref=cm_sw_r_cp_api_QrnuxbBB5A8CF

u/terminalmanfin · 2 pointsr/explainlikeimfive

The single best resource I've found for this is the book Code by Charles Petzold

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

He walks you through how computers work, from the development of telegraphs, to logic circuits, to small memory, math, and near the end a small computer with a custom assembly language.

u/cloakdood · 2 pointsr/learnprogramming

There's a fantastic book that explains how a computer works from the circuits up. I think it would greatly aid in your understanding of how computers work.

Code: The Hidden Language of Computer Hardware and Software

u/wcbdfy · 2 pointsr/askscience

If you really want to understand how all these things (binary coding, electronic representation, logic gates, microprocessors) come together, I cannot recommend this book enough - Code, by Charles Petzold

u/theinternetftw · 2 pointsr/askscience

The Turing machine answer is a fantastic theoretical one, but if you want to see a practical answer for "how do you build a computer (like most people would think of a computer) from scratch", which seems to be what you were looking for when you wrote this:

> What is going on at the lowest level? How are top-level instructions translated into zeroes and ones, and how does that make the computer perform an action?

...then this book is a fantastic, down-to-earth, extremely approachable first read for such things (and designed such that you don't need *any* prior knowledge to start reading it).

Seriously, if you want to dive a little bit deeper, I highly recommend it.


edit: seems someone already recommended Code. Still, can't give it enough praise. Or The Elements of Computing Systems (TECS), which is a (only *slightly*) more technical read designed around building everything that a computer "is", piece by piece...

Edit2: And as for "what's going on with the Minecraft ALU", TECS is a good read there as well, since the machine described in that book is what I based the ALU on. Also, the fact that Minecraft can simulate logic gates is what links the "real world" and the "minecraft world" together, because logic gates are all you need to build any computer (that's how Minecraft can let you build Turing Complete devices)

u/NotCoffeeTable · 2 pointsr/Minecraft

Yeah... if you want something outside of Minecraft I'd read "Code"

u/ceol_ · 2 pointsr/TheoryOfReddit

Fantastic! I've gotta be honest, though: you're not going to learn a lot of "programming"; you're going to learn a lot of computer science. That's not a bad thing. You'll learn things like sorting algorithms, complexity, and discrete math.

A great language to start out with for this kind of thing is Python. Read Dive Into Python and Think Python to get you started. If you're having trouble wrapping your head around some concepts, I'd suggest Code: The Hidden Language. It's a great introduction to how computers work which should give you a bit of a kick into development.

Here's a quick example of using reddit's API to grab the last comment someone posted using Python:

    # Python 2 code: urllib2 was split into urllib.request/urllib.error in Python 3.
    import urllib2
    import json

    # Public JSON feed of a reddit user's comments.
    url = 'http://www.reddit.com/user/ceol_/comments.json'

    # Fetch the feed and decode the JSON response.
    request = urllib2.Request(url)
    resource = urllib2.urlopen(request)
    content = resource.read()
    decoded = json.loads(content)

    # The first child in the listing is the most recent comment.
    print decoded['data']['children'][0]['data']['body']

You can fool around with the reddit API here and see what it returns in a nice hierarchy.

Hope this helps!

u/MrQuimico · 2 pointsr/compsci

Code: The Hidden Language of Computer Hardware and Software
by Charles Petzold


It's a great overview of CS, very easy to understand.

u/hackinthebochs · 2 pointsr/philosophy

I would suggest staying away from The Dragon Book (a CS book on compilers) or anything deeply technical. I don't think that's needed for what you're trying to accomplish. If you do have time I would suggest Code by Charles Petzold. It gives an introduction to modern computing from transistors on up that is understandable by the motivated layperson. I think this book will give you all the intuition you need to formulate your ideas clearly and accurately.

u/cletusjenkins · 2 pointsr/learnprogramming

Charles Petzold's Code is a fun read. It deals with very little actual code, but gives the reader a good understanding of what is going on in a computer.

u/kirang89 · 2 pointsr/AskComputerScience
u/Electric_Wizard · 2 pointsr/learnprogramming

My advice is to not worry too much about your programming experience or lack of it. While I didn't do a CS bachelor's myself, my understanding is that, in the UK at least, almost all courses will start with the basics and go from there. As there isn't much computing taught in schools here (until recently), there isn't much that courses can assume about people's background in programming, so they will normally start from Hello World in a language like Java or Python and build up from there.

In fact, given your background, you will probably be about average, if not above average, in the amount of experience you already have. Of course everything I'm saying here might not apply to you if you're taking a different course which does require some background, but for a regular degree this is normally the case.

That said, it's not going to hurt if you do some more programming or reading on the side, but don't stress too much about it as any extra work you do will be above and beyond what they'll expect you to know when starting.

Aside from programming, one thing which really helped my understanding of things was to read this book; it covers what's actually going on in terms of the hardware and software in a computer from first principles, and should help your understanding and complement what you'll be taught in your course.

u/bithush · 2 pointsr/explainlikeimfive

You want to read Code by Charles Petzold. It is a modern classic and takes you from a flashlight to a modern CPU. One of the best computer books I have ever read. It is so good it never leaves my desk, as I love to read it randomly. Pic!

u/ChrisAshtear · 2 pointsr/gaming

congrats on choosing something you want to do!

BTW, I recommend reading CODE by Charles Petzold.
http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1330580000&sr=8-1

It's a book that shows you how everything in a computer works at the electrical level, and then shows you machine code & assembly. Not that you really need to program in assembly, but it gives you a good mindset as to how programming works.

u/Thanks-Osama · 2 pointsr/learnprogramming

If you're not afraid of math, then I would recommend the books by Robert Sedgewick. His Java book really shows off the language. His Algorithms book is a religious experience. And if you're feeling masochistic, the Sipser book is well suited.

u/cbarrick · 2 pointsr/computing

Sipser's Introduction to the Theory of Computation is the standard textbook. The book is fairly small and quite well written, though it can be pretty dense at times. (Sipser is Dean of Science at MIT.)

You may need an introduction to discrete math before you get started. In my undergrad, I used Rosen's Discrete Mathematics and Its Applications. That book is very comprehensive, but that also means it's quite big.

Rosen is a great reference, while Sipser is more focused.

u/shared_tango_ · 2 pointsr/AskReddit

Here, if you can find it somewhere on the internet (cough), this book covers it nicely and is widely used (at least in German universities)

https://www.amazon.de/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X

u/nerga · 2 pointsr/math

This is the book you are talking about, I assume?

I think I am looking for a more advanced book than this. I have already learned most of these topics in my own classes in school. I suppose I would be looking for the books that would be recommended after having learned this book.

u/astebbin · 2 pointsr/computervision

I'd say that the answer to your question depends on the problem. For certain problems, such as detecting faces, there are functions out there that do everything for you. For other problems, such as circle detection, combinations of existing functions will get the job done (as MakingMacaroni describes in another comment). Then for some problems, such as abandoned luggage detection in airports, you really do need to be up on the current research and have a solid grasp of the mathematics involved.

I'd say that the task you're describing is probably in the second or third category. You might try thresholding optical flow over time, as RGKaizen suggests. Depending on how much training data you have to work with, you might also try training a machine learning classifier on one or more visual features to generate profiles of "normal" and "emergency" situations. If you expect big green tanks to appear or fires to break out, blob detection with color histogram analysis might even do the trick. The key is to make the problem as easy for the computer as possible, and figure out which of the functions OpenCV gives you are best suited for your particular situation.
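
(As a rough illustration of the "easy for the computer" approach, here's a minimal frame-differencing sketch, assuming OpenCV 4 and an input filename made up for the example; the threshold and area values are guesses you would tune for your footage:)

    import cv2

    # Difference consecutive grayscale frames, threshold the result,
    # and flag any large changed blob as "motion".
    cap = cv2.VideoCapture("surveillance.mp4")   # hypothetical input
    ok, prev = cap.read()
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if any(cv2.contourArea(c) > 500 for c in contours):
            print("motion detected")
        prev = gray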

Best of luck! If you go forward on this project, please let us know what you come up with!

EDIT: Here are a few resources for figuring out which functions to use, what math to apply, etc.:

u/restorethefourthVT · 2 pointsr/learnprogramming

Here is a really good book if you want to get into the nitty-gritty stuff.

Write Great Code Volume 1

Volume 2 is good too. It's not just a rewrite of Volume 1 either.

u/playaspec · 2 pointsr/embedded

Not a direct answer, but a fantastic resource for dealing with every datatype on small and large systems. Check out:

Write Great Code: Volume 1: Understanding The Machine

Of all the computer books I've read, this is my #1 MUST read for ANY programming discipline. Chapter 4 covers the various floating-point formats and includes C code examples. For more in-depth coverage, check out Don Knuth's "The Art of Computer Programming: Volume Two".
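
(As a taste of what "floating-point formats" means concretely, here's a short Python sketch, not from either book: it pulls apart the sign, exponent, and fraction fields of an IEEE 754 single-precision float:)

    import struct

    def dissect(value):
        # Reinterpret the float's 32 bits as an unsigned integer.
        bits = struct.unpack(">I", struct.pack(">f", value))[0]
        sign = bits >> 31                # 1 bit
        exponent = (bits >> 23) & 0xFF   # 8 bits, biased by 127
        fraction = bits & 0x7FFFFF       # 23 bits of mantissa
        print(f"{value}: sign={sign} exponent={exponent - 127} "
              f"fraction=0x{fraction:06x}")

    dissect(1.0)     # sign=0 exponent=0 fraction=0x000000
    dissect(-6.25)   # sign=1 exponent=2 fraction=0x480000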

u/clavalle · 2 pointsr/zeroxtenc

This is a great resource.

Ninja Edit: This book is also good, and free! (PDF warning)

u/rdguez · 2 pointsr/algorithms

There’s a book called Grokking Algorithms, which illustrates some of those concepts quickly. I also liked another book called Cracking the Coding Interview.

u/hilduff5 · 2 pointsr/OSUOnlineCS

I took 325 last winter and it was a bit rough. Join the Slack group for the class and it will be a tad bit easier and less frustrating. Also supplement the class with the Grokking Algorithms book (link below). A guy who graduated from the program a year ago wrote about it in his blog. Like him, I found the book helped me out tremendously.

Grokking Algorithms

u/jeebusfeist · 2 pointsr/learnprogramming

I'd recommend Grokking Algorithms

u/Mydrax · 2 pointsr/learnprogramming

Inside the Machine visually illustrates concepts within a computer system so beautifully that it will make you cry.
Also, TeachYourselfCS - not really a book, but a list of links to videos and books that will help you grasp various sections of CS.

Grokking Algorithms - not a theory-heavy CS book, but a really good book on algorithms with funny illustrations that will help you through.

The other books that have been mentioned, like Clean Code for example, are a must-read!

u/MrMoustach3 · 2 pointsr/portugal

This one is more suitable.

u/BookOfFamousAmos · 2 pointsr/IWantToLearn

I would suggest Grokking Algorithms in Python:

https://www.amazon.com/Grokking-Algorithms-illustrated-programmers-curious/dp/1617292230

It explains them in an accessible, easy-to-understand format. Whether or not you like or know Python, it's a good learning resource. I highly recommend it.

u/fernly · 2 pointsr/learnpython

On Amazon you can preview quite a bit, although not the recursion chapter, which looks pretty short:

https://www.amazon.com/Grokking-Algorithms-illustrated-programmers-curious/dp/1617292230

u/barlister · 2 pointsr/webdev

I really like the Write Great Code series. The first one contains pretty crucial information about low-level stuff that webdev folks frequently don't learn, which kind of leaves us hung out to dry when it comes to number theory and other stuff that comes up occasionally.

http://www.amazon.com/gp/product/B0096FEJGQ?btkr=1

All of the volumes are very good though.

u/kaluuee · 2 pointsr/learnprogramming

You need to read this
Code: The Hidden Language of Computer Hardware and Software (Developer Best Practices) https://www.amazon.com/dp/B00JDMPOK2/ref=cm_sw_r_cp_apa_i_v3xwDbV1177NS

u/Anaufabetico · 2 pointsr/brasil

I haven't read it, but I've already put it on my list, because I'm an avowed, enthusiastic programming nerd. Thanks for the tip.


"Code", do Charles Petzold faz a mesma coisa e também é muito bom.

u/BitterFortuneCookie · 2 pointsr/explainlikeimfive

The above answers were really good. I recommend a look at the book Code: The Hidden Language of Computer Hardware and Software to get a sense of the history of how computer languages evolved into how we build applications today.

https://www.amazon.com/dp/B00JDMPOK2/ref=dp-kindle-redirect?_encoding=UTF8&btkr=1

u/urnlint · 2 pointsr/computerscience

I do not read textbooks as a hobby like some people seem to, but this book seems to pack a large chunk of my 5 years of college (yeah, for a bachelor's) into a single book: Code

u/ConstantScholar · 2 pointsr/csbooks

Code: The Hidden Language of Computer Hardware and Software is a really good and very readable computer architecture book.

u/fav · 2 pointsr/argentina

There's nothing like university. If you want to learn on your own, follow a similar plan. Start with the most basic theory and work your way up in complexity. Once you have the concepts, learning a programming language is simple.

If you have no idea, or you think the computable is something more than a bunch of sets of natural numbers, start with something like code.

On algorithms and data structures there are a ton of books and courses (Coursera, Khan, etc. Start with the theoretical ones; stay away from those that teach you a particular language).

Then programming language paradigms, language theory, and you're set. :)

u/Notlambda · 2 pointsr/dataisbeautiful

Sure. Without anything to go on, I'll just recommend some of my favorites. :)

  • Godel Escher Bach - Mind-bending book that delves into connections between art, music, math, linguistics and even spirituality.
  • Code - The Hidden Language of Computer Hardware and Software - Ever wondered how the black box you're using to read this comment works? "Code" goes from transistor to a fully functioning computer in a sequential way that even a child could grasp. It's not at the "How to build your own computer from Newegg.com parts" level. It's more at the "How to build your own computer if you were trapped on a desert island" level, which is more theoretical and interesting. You get strong intuition for what's actually happening.
  • The Origin of Consciousness in the Breakdown of the Bicameral Mind - An intriguing look into the theory that men of past ages actually hallucinated the voices of their gods, and how that led to the development of modern civilization and consciousness.
u/semperlol · 1 pointr/web_design

Oh my god you're a fucking moron. Did you even read my comment? If you are discussing theory and this is your reply to my comment, you have a fundamental misunderstanding of the theory. The other explanation is you read something incorrectly, which wouldn't be such a problem but then you adopt such a cunt tone in your reply.

In theory

>Anything that can be done with a regex can be done with a finite automaton, and vice versa

Where did I state that recognising an email is impossible with finite automata? If something can be recognised by a finite automaton, it can be done with a regex.

Your original comment said that you cannot do this with regex but can with finite automata, but in theory

>They are equivalent in their expressive power, they both recognise the set of regular languages.

Anybody who has a semblance of an idea of what they're talking about will agree that they are in theory equivalent. So you can do it with regex, in theory.

The article that you linked but didn't read carefully states this same fact.

>And can you fully implement the complex grammars in the RFCs in your regex parser in a readable way?

It talks about the practical issues, e.g. being able to do it in a readable way with regex, because in fucking theory they are equivalent in their expressive power.
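
(Since "equivalent in expressive power" is easy to state and easy to doubt, here's a small self-contained Python sketch of my own making, not from any of the linked texts: a regex and a hand-built DFA that accept exactly the same language, strings over {a, b} ending in "ab", checked against each other exhaustively:)

    import itertools
    import re

    PATTERN = re.compile(r"[ab]*ab")

    # DFA states: 0 = start, 1 = last char was 'a',
    # 2 = last two chars were "ab" (the accepting state).
    TRANSITIONS = {
        (0, "a"): 1, (0, "b"): 0,
        (1, "a"): 1, (1, "b"): 2,
        (2, "a"): 1, (2, "b"): 0,
    }

    def dfa_accepts(s):
        state = 0
        for ch in s:
            state = TRANSITIONS[(state, ch)]
        return state == 2

    # Both recognisers agree on every string up to length 5.
    for n in range(6):
        for chars in itertools.product("ab", repeat=n):
            s = "".join(chars)
            assert dfa_accepts(s) == bool(PATTERN.fullmatch(s))
    print("regex and DFA agree")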

You may find the below useful:

https://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X

Alternatively:

https://www.amazon.com/gp/product/B00DKA3S6A/ref=s9_acsd_top_hd_bw_b292I_c_x_5_w?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=merchandised-search-3&pf_rd_r=DQJA7YYF6XRPQ9DCCW1S&pf_rd_t=101&pf_rd_p=b949820f-ff03-5be8-b745-f0a5e56b98c9&pf_rd_i=511394

https://www.amazon.com/gp/product/B001E95R3G/ref=s9_acsd_top_hd_bw_bFfLP_c_x_1_w?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=merchandised-search-4&pf_rd_r=MXQ2SVBM01QEAAET2X18&pf_rd_t=101&pf_rd_p=c842552a-f9c9-5abd-8c7d-f1340c84cb6d&pf_rd_i=3733851

u/eigenheckler · 1 pointr/compsci

>If you want to go more theoretical, look into set theory, regular languages, automata and state machines.

Sipser covers these with a theoretical-leaning book: https://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X

u/IAmNotFromKorea · 1 pointr/learnmath

Then you could take Linear Algebra, Real Analysis or Abstract Algebra.

Or you could read books like Introduction to the Theory of Computation by Michael Sipser

u/umaro900 · 1 pointr/math

Can I say Sipser's Introduction to the Theory of Computation? I know there are issues people have with the book, but in terms of accessibility and ease of reading, I think this text is second to none. I mean, though it says in the preface it's designed for upper-level undergraduates or fresh grad students, it makes no assumptions on the reader's level of knowledge, and as such I would feel comfortable recommending it even to some high school students.

u/cunttard · 1 pointr/C_Programming

Start with getting the XCode developer tools and the command-line package.

C is an important language in Computer Science because it is pretty much the language for heavy duty Operating Systems, the type you see in Desktop OSes, Network OSes (the type that runs on a networking router/switch), Server OSes (Linux, BSD, Windows, etc.).

I think C is a hard language to learn, but it is a great first serious language while also simultaneously learning an easier language like shell or Python to make yourself more efficient/productive.

However, fundamental to CS is the theory of computation, not really languages. Languages are just a way to express computation. Some languages are better than others for expressing computation to solve certain problems. I would highly encourage also looking into understanding computation from first principles; a great introduction is Theory of Computation (the 2nd edition is really, really cheap used). The only background knowledge you need is high-school mathematics.

u/Holy_City · 1 pointr/learnprogramming

You'd like this book a lot.

Another one that's more hardcore EE than computer science is The Mathematical Theory of Communication. Shannon's 1948 paper this book highlights is the foundation of information theory.

u/metaobject · 1 pointr/csbooks

Introduction to the Theory of Computation by Michael Sipser: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X

Edit: wow, that book is more expensive than I remember. I have the 2nd edition, which can be found for a fraction of the price of the latest new edition. I'm not sure how they compare in content, though.

u/white_nerdy · 1 pointr/learnprogramming

> I want to be able to create functional programs with a bucket transistors, a cup of magnets, pen, and paper

I've heard good things about nand2tetris, which goes from logic gates to a complete system with a simple assembler, compiler and OS.

One good exercise might be to create an emulator for a simple system, like CHIP-8 or DCPU-16.
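
(In that spirit, here's a hedged sketch of the smallest possible "emulator": a toy VM with an invented four-instruction set, not the real CHIP-8 or DCPU-16 opcodes, just the fetch-decode-execute shape of the exercise:)

    # Each instruction is a (opcode, operand_a, operand_b) tuple.
    def run(program):
        regs = [0] * 4   # four general-purpose registers
        pc = 0           # program counter
        while pc < len(program):
            op, a, b = program[pc]
            if op == "LOAD":            # regs[a] = literal b
                regs[a] = b
            elif op == "ADD":           # regs[a] += regs[b]
                regs[a] += regs[b]
            elif op == "PRINT":
                print(regs[a])
            elif op == "JNZ":           # jump to address b if regs[a] != 0
                if regs[a]:
                    pc = b
                    continue
            pc += 1
        return regs

    # Count down from 3: prints 3, 2, 1.
    run([("LOAD", 0, 3), ("LOAD", 1, -1), ("PRINT", 0, 0),
         ("ADD", 0, 1), ("JNZ", 0, 2)])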

If you want to go deeper:

  • If you want to build compilers, the dragon book is the go-to resource.

  • If you want to start learning about theory, I recommend Sipser.
u/_--__ · 1 pointr/compsci

Wow. That's quite a range - normally those would be covered in two, three, or even four courses. Any wonder the pass rate is so low.

It's difficult to recommend anything sensible for the whole course - I'd have suggested Rosen - I have found it somewhat useful as a reference text later on - but it seems to be unpopular with amazon reviewers. For the latter part (finite-state machines etc), Sipser is a common course textbook - but it may be too advanced for a starting point (though might be worth getting so you can get ahead before you get to that part of the course, which will almost certainly be the trickiest).

u/bitcoinagile · 1 pointr/Bitcoin

From a deleted Twitter account:

Dr_Craig_Wright tweeted at 2015-05-10 02:06:51.000:

For those who wonder just how far you can "push" the scripting language in Bitcoin... http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X

u/guiraldelli · 1 pointr/compsci

Excellent! I'm glad to know the concept is clear to you.

I would recommend using the Lewis & Papadimitriou book as well as the Sipser one: for me, the former is more formal than the latter, which is more didactic (especially for undergraduate students); however, both use simple language and are very didactic.

My advice is: take both books and keep studying from them. I learned theoretical computer science from the Lewis & Papadimitriou book, but whenever I couldn't get a concept, I went to Sipser. And vice versa.

Lastly, the 2012 (3rd) edition of the Sipser is so beautiful, with good automata drawings to help understand pushdown automata. :)

u/gnuvince · 1 pointr/programming

The classic Introduction to the Theory of Computation is quite excellent, though very pricey. Also, I had the pleasure of having a university class that used this book, so having a professor that can clear things up from the book was a big help.

u/shaggorama · 1 pointr/Python

Well, "machine learning" is a pretty big topic. For a lightweight crash course, I'd recommend checking out the coursera machine learning course provided through stanford taught by Andrew Ng, or check out the Will it Python? series of blog posts which ported the exercises in the book Machine Learning for Hackers from R to Python.

I'd say if you dedicated a full semester to the topic you'd still only barely be scratching the surface, so don't expect to come away with too much expertise in just a week. I definitely support pursuing machine learning though.

u/MicturitionSyncope · 1 pointr/MachineLearning

There have already been a few books listed focusing on theory, so I'll add Machine Learning for Hackers to the list.

It doesn't cover much of the theory, but it's a nice start to getting the programming skills you need for machine learning. When you start using these techniques on real data, you'll quickly see that it's almost never a simple task to go from messy data to results. You need to learn how to program to clean your data and get it into a usable form to do machine learning. A lot of people use Matlab, but since they're free I do all of my programming in R and Python. There are a lot of good libraries/packages for these languages that will enable you to do a lot of cool stuff.

u/stateful · 1 pointr/programming

Some great responses here everyone, thank you. The book Write Great Code: Volume 1: Understanding the Machine helped me understand.

u/bombsa · 1 pointr/learnprogramming

The UK Amazon doesn't have any versions directly from you, just resellers: https://www.amazon.co.uk/Grokking-Algorithms-illustrated-programmers-curious/dp/1617292230/ref=sr_1_1?ie=UTF8&qid=1464778753&sr=8-1&keywords=Grokking+algorithms
Any chance you will be adding new versions available through Amazon?

u/Viginti · 1 pointr/learnprogramming

Some have mentioned in replies to my post that the algorithms presented in the book are either too abstract or too simple. To that I say: read grokking algorithms: An illustrated guide for programmers and other curious people https://www.amazon.com/dp/1617292230/ref=cm_sw_r_cp_apa_KG7CAbE9E4243. It's well written and explains some more complex ideas in an easy-to-understand manner.

u/Ogi010 · 1 pointr/Python

can confirm, using this book in my MSCS program (my Uni doesn't have undergrad programs); I'm implementing a lot of the algorithms using custom objects in Python, so it's doing wonders for my object oriented game...

The book is also relatively straightforward to read (again, keep in mind I'm a graduate student), but other books such as the Algorithm Design Manual have been known to have easier-to-understand explanations and not be as theoretically deep.

Also, lastly, when I was dipping my toes in the water, I flipped through Grokking Algorithms, which, as a short summary with explanations of how things work, is a great place to start (but of course you won't see this book in academic environments).

u/frompdx · 1 pointr/gamedev

You bet. Although it uses Python rather than C/C++, I think Grokking Algorithms is a good read that covers both linked lists and hash tables: amazon link.

u/redditfend · 1 pointr/learnprogramming

There is this awesome book called Grokking Algorithms.

This should serve you really well. Every concept is taught in a clear, concise manner with lots of illustrations.

u/Tiramelacoma · 1 pointr/learnprogramming

I would recommend Wengrow's or Bhargava's to learn the basics in a more pleasant way, and then continue with others that dig deeper (Cormen, Sedgewick, etc.).

I'm just following that plan actually.


^((Sorry for my bad english))

u/opaz · 1 pointr/learnjava

Thank you for suggesting this! The style of learning the book serves is right up my alley. Do you have any other suggestions of books that are just like this? One book I enjoyed reading in a similar fashion was Grokking Algorithms

u/Haatveit88 · 1 pointr/learnprogramming

I understand how you feel, honestly - as someone who did poorly in school, and I am somewhat dyscalculic, I really feel like I can relate!

The important thing for you, in my opinion, based on your explanation there, is to look for learning materials that suit you. Some people learn easily from really academic materials, some (like me) don't - and fortunately, there are lots of materials out there trying different approaches to teaching this kind of stuff. It gets easier as you go, as well - once the ball starts rolling you find it much easier to grasp future concepts. I got a massive 1300-page book called "Introduction to Algorithms" many years ago... Introduction my arse. It might as well have been written in an alien language. But now, years later, I can actually understand its contents. It definitely was not an introduction (but it is a great book, both physically and literally).

A few recommendations for actual introductory books on these subjects:

"A Common-Sense Guide to Data Structures and Algorithms" by Jay Wengrow (2nd Edition coming 2020)

This book says the following in the opening chapter:

>"The problem with most resources on these subjects is that they're...well...obtuse. Most texts go heavy on the math jargon, and if you're not a mathematician, it's really difficult to grasp" . . . "Because of this, many people shy away from these concepts, feeling like they are simply not 'smart' enough to understand them."

It's not a perfect book, but it goes into a lot of basic data structures and explains them in a not-insane way. It helped me a lot! Understanding not just how they work, but why they are useful, is so helpful.

"Grokking Algorithms: An illustrated guide for Programmers and other curious people" by Aditya Y. Bhargava

A similar book, though more algorithm-focused and less data-structure-focused; it goes into somewhat more depth, although the extra material is usually considered optional. The author here expresses a similar concern that books and learning materials on these concepts are often very hard to understand, and it need not be that way!

You can learn these things, you just need to find the right book/method that works for you! It can take some searching to find it. I know from experience!

Read the books, try to implement some of their concepts, and then try applying those things to real problems (e.g. from HackerRank or similar sites; try more than just HR). Read the book again. Repeat. You will understand a bit more each time. That was what worked for me, at least.

u/ruski550 · 1 pointr/iastate

I highly recommend this book. It takes a lot of key concepts from 228 and 311 and dumbs them down into pictures with pseudocode. I would definitely use this along with the help room if you can.

https://smile.amazon.com/dp/1617292230/ref=cm_sw_r_sms_apa_i_6KATDbSJF44PD

u/ShenmeNamaeSollich · 1 pointr/cscareerquestions

Why stay at a school where you're not studying what you want, and which doesn't even offer what you want nor one of the most popular in-demand majors??

Anyway, there are any number of online courses/tutorials about Data Structures, and how to build/use them in various languages. You can use C++ for them, or try to learn something else too. For speed & simplicity in interviews, a lot of people seem to prefer Python for discussing DS&A, but by their nature the concepts are fairly language-agnostic.

Try VisuAlgo for one ... there are plenty of others.

Since a lot of algorithms require/suggest the use of specific data structures to make them work, it's probably better to learn what those are first, and then try to tackle the algorithms that rely on them.

Grokking Algorithms - illustrated & pretty basic intro to concepts

Common Sense Guide to Data Structures and Algorithms - slightly less so, but still pretty basic intro to concepts

CTCI - problems covering arrays, linked lists, stacks & queues, trees, graphs ... Actually kind of useless if you don't already know what those are though.

Introduction to Algorithms (CLRS) - 1 of 2 standard U.S. college-level algorithms textbooks

Algorithms, 4th Ed. - the other standard U.S. college-level textbook, w/free online "book site", code, and a free Coursera course to go along with it.

u/ell0wurld · 1 pointr/cscareerquestions

Grokking Algorithms: An illustrated guide for programmers and other curious people https://www.amazon.com/dp/1617292230/ref=cm_sw_r_cp_api_i_LPkTCb09XVDTR

u/loops_____ · 1 pointr/cscareerquestions

>The Algorithm Design Manual

Is this another algorithms book? How does it compare to Cormen's Introduction to Algorithms or Grokking Algorithms? I tried Cormen's and Grokking's, but they were hard reads, and I generally prefer YouTube videos (mycodeschool) and so on. Is The Algorithm Design Manual similar?

>EPI, and PIE

What are these?

u/MerlinTheFail · 1 pointr/gamedev

This is really cool! Thank you.

>A common question is whether the book is still relevant. After all it's over ten years old

I find that some old(ish) books can really hold some great significance. For example, Effective C++ and Clean Code have both given me some brilliant tips on making better code. I'm also reading Write Great Code. If you have any more books I'd love to see them :) Thank you again.

u/SofaAssassin · 1 pointr/cscareerquestions

What kind of jobs are you applying for? Low-level stuff is typically applicable for things like engine work, graphics, optimization, networking and audio. Okay, that covers a lot of the game development process, but there are certainly jobs that aren't deep into that, like peripheral tooling (making tools for developers to use) or working on stuff like the webservices that powers the online community.

However, if your goal really is core game development, then you need to be a lot more targeted in how you learn. I have interviewed for and was hired by a game company that worked in C++, and have also worked in distributed, networked military simulations (think of it like boring, more realistic Starcraft), so here is how I gained the various knowledge I had in getting through those types of interviews (including a 90-minute written test for the game company where I had to debug C++ code on paper, answer various gotchas, etc.).

I don't know how far you have covered, but this is how I would approach the learning now, were I to start over again.


  • Become really good at C++ - During my first job, I mostly used Java with Python/C++/Perl/TCL on the side. I learned a lot of C++ in short order to prepare for interviews and move jobs (to simulation).

  • Read Accelerated C++ and/or C++ Primer. These are probably the best books for getting introduced to C++ and starting off in a good place (as in, not learning C++ in the form of C), getting familiar with using the OO system of C++ and using the standard library. Also remember to do the exercises to really reinforce the concepts.

  • Read Effective C++ SUPER COLLECTION - In all honesty, you can make do with just Effective C++, Volume 1, but these cover good practices for using C++.

  • Read the C++ FAQ - lots of gotchas there and corner cases of C++.

  • If you want to go beyond those books and resources, there are Herb Sutter's Exceptional C++ books.

  • Understand the machine - this covers the low-level component, helping you to understand the machine itself: how your code runs, how it's executed.
  • Read Randall Hyde's Write Great Code - This is one of my favorite technical books, and is language agnostic.

    It covers low-level concepts like CPU pipelining, memory, and how code interacts with the machine. I read this years after I started my job building simulations, and it reinforced a lot of what I learned previously and in college. I also recommended this book to a friend of mine who credits it with giving him an edge over his fellow college grads (he's years younger than I am) in low-level knowledge. If you don't know concepts like cache locality, cache lines and how memory is allocated, this book will cover that and more.

  • Read Randall Hyde's Art of Assembly Language - I have only briefly touched upon this book, but it takes a unique approach to introducing you to x86 ASM (by using a higher-level form of ASM).


  • Understand the algorithms and data structures - I took multiple classes in this in college, and periodically read CLRS to refresh my knowledge. But CLRS is too mathematically rigorous and theoretical if you just want to get familiar with algorithms.

  • Skiena's Algorithm Design Manual is a more practical approach to refreshing yourself on algorithms and also learning complexity theory.

  • Skiena's Data Structures Lectures are helpful for data structures. In general, though, know these (I include whatever C++ has as well):
    • Dynamic array - std::vector<T> in C++.
    • Associative structures - std::map and std::unordered_map
    • Sets - std::set and std::unordered_set
    • Linked List - std::list<T> and std::forward_list<T>
    • Stacks and Queues - std::stack and std::queue
    • std::deque - The C++ implementation of a double-ended queue.
    • Trees - binary trees, red-black, heaps, tries (no standard C++ implementations of these, though stuff like std::set is typically implemented with a red-black tree behind the scenes)
    • Graphs

    • Understand the complexities of actions on each data structure (insertion, deletion, modification, searching, etc.)

  • Read the wiki on Pathfinding, because this class of algorithms is very important in game development, as well as network communication (a short BFS sketch follows this list).

    -----

    The above covers the 'core' stuff you'd have to learn. If you wanted to get into stuff like network programming or graphics programming rather than just core gameplay development, I can expound further.
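
To make the pathfinding bullet concrete, here is a minimal BFS sketch (the grid, function names, and layout are my own toy example, not from any of the books above) - breadth-first search is the simplest member of the pathfinding family, and it exercises std::queue and std::vector along the way:

    #include <iostream>
    #include <queue>
    #include <string>
    #include <utility>
    #include <vector>

    // Minimal BFS shortest-path sketch on a 4-connected grid.
    // '.' = walkable, '#' = wall. Returns -1 if the goal is unreachable.
    int bfsShortestPath(const std::vector<std::string>& grid,
                        int sr, int sc, int gr, int gc) {
        const int rows = (int)grid.size(), cols = (int)grid[0].size();
        std::vector<std::vector<int>> dist(rows, std::vector<int>(cols, -1));
        std::queue<std::pair<int, int>> frontier;  // FIFO => visit in distance order
        dist[sr][sc] = 0;
        frontier.push({sr, sc});
        const int dr[] = {1, -1, 0, 0}, dc[] = {0, 0, 1, -1};
        while (!frontier.empty()) {
            auto [r, c] = frontier.front();
            frontier.pop();
            if (r == gr && c == gc) return dist[r][c];  // first arrival is shortest
            for (int i = 0; i < 4; ++i) {
                int nr = r + dr[i], nc = c + dc[i];
                if (nr >= 0 && nr < rows && nc >= 0 && nc < cols &&
                    grid[nr][nc] != '#' && dist[nr][nc] == -1) {
                    dist[nr][nc] = dist[r][c] + 1;
                    frontier.push({nr, nc});
                }
            }
        }
        return -1;
    }

    int main() {
        std::vector<std::string> grid = {"....", ".##.", "....", ".#.."};
        std::cout << bfsShortestPath(grid, 0, 0, 3, 3) << "\n";  // prints 6
    }

Dijkstra and A* are the same loop with a priority queue (and, for A*, a heuristic) swapped in for the plain FIFO.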
u/Haghiri75 · 1 pointr/computerarchitecture

This book:

https://www.amazon.com/But-How-Know-Principles-Computers-ebook/dp/B00F25LEVC

is a great place to start. I read it two semesters before my computer architecture course, and it opened my eyes to the world of "Digital Electronics"; I learned a lot. I highly recommend it.

u/infinitelyExplosive · 1 pointr/pcmasterrace

Here are some different sources for different aspects of computers.

The book Code: The Hidden Language of Computer Hardware and Software is an excellent introduction into the low-level concepts which modern CPUs are built on.

Link hopping on Wikipedia is a totally viable method to learn many aspects of computers. Start at some page you know about, like Graphics Cards or Internet, and just keep reading and clicking links.

Hacking challenges are a great way to learn about how computers work since they require you to have enough knowledge to be able to deliberately break programs. https://picoctf.com/ is an excellent choice for beginner- to intermediate-level challenges. https://overthewire.org/wargames/ also has some good challenges, but they start off harder and progress quickly. Note that these challenges will often require some programming, so learning a powerful language like Python will be very helpful.

This site is not very active anymore, but the old posts are excellent. It's very complex and advanced though, so it's not a good place to start. https://www.realworldtech.com/

In general, google will be your best friend. If you run into a word or program or concept you don't know, google it. If the explanations have more words you don't know, google them. It takes time, but it's the best way to learn on your own.

u/GilgamEnkidu · 1 pointr/explainlikeimfive

I HIGHLY recommend the book ["CODE" by Charles Petzold](http://www.amazon.com/Code-Developer-Practices-Charles-Petzold-ebook/dp/B00JDMPOK2/ref=sr_1_1?ie=UTF8&qid=1412051282&sr=8-1&keywords=code). It explains how computers and programming languages are built starting with the simplest pieces (circuits, telegraph relays, transistors, binary, assembly, functional languages) up through almost modern day. He also puts it all into historical context. I'm in the middle of it now and it is thoroughly interesting and elucidating.

u/sacredsnowhawk · 1 pointr/explainlikeimfive

If anyone is interested in learning about this stuff in-depth, I really recommend the book 'Code' by Charles Petzold. You won't feel like a computer moron again.

http://www.amazon.com/Code-Developer-Practices-Charles-Petzold-ebook/dp/B00JDMPOK2/ref=sr_1_1

u/toddspotters · 1 pointr/askscience

I strongly recommend that you read Petzold's book, Code: The Hidden Language of Computer Hardware and Software. It walks through the process of building circuits, using those circuits to represent information, and then gradually building those up into more complex systems, including programming languages.

u/maholeycow · 1 pointr/SoftwareEngineering

Alright man, let's do this. Sorry, had a bit of a distraction last night so didn't get around to this. By the way, if you look hard enough, you can find PDF versions of a lot of these books for free.

Classic computer science principle books that are actually fun and a great read (This is the kind of fundamental teachings you would learn in school, but I think these books teach it better):

  1. https://www.amazon.com/Code-Language-Computer-Developer-Practices-ebook/dp/B00JDMPOK2 - this one will teach you at a low level about 1's and 0's and logic and all sorts of good stuff - the interoperation of hardware and software. This was a fun book to read.
  2. https://www.nand2tetris.org/book - This book is a must in my opinion. It touches on so many things such as boolean logic, Machine language, architecture, compiling code, etc. And it is f*cking fun to work through.

    Then, if you want to get into frontend web development for example, I would suggest the following two books for the fundamentals of HTML, CSS, and JavaScript. What I like about these books is they have little challenges in them:

  3. https://www.amazon.com/Murachs-HTML5-CSS3-Boehm-Ruvalcaba/dp/1943872260/ref=sr_1_2_sspa?keywords=murach%27s+html5+and+css3&qid=1557323871&s=books&sr=1-2-spons&psc=1
  4. https://www.amazon.com/Murachs-JavaScript-jQuery-3rd-Ruvalcaba/dp/1943872058/ref=sr_1_1_sspa?keywords=murach%27s+javascript&qid=1557323886&s=books&sr=1-1-spons&psc=1

    Another great book that will teach you the fundamentals of coding, using an extremely flexible programming language (Python), and how to think like a programmer is this one (disclaimer: I haven't read it myself, but I have read other Head First books, and they rock. My roommate read this one and loved it though):

  5. https://www.amazon.com/Head-First-Learn-Code-Computational/dp/1491958863

    Let me know if you want any other recommendations when it comes to books on certain areas of software development. I do full stack web app development using .NET technology on the backend (C# and T-SQL) and React in the frontend. For my personal blog, I use vanilla HTML, CSS, and Javascript in the frontend and power backend content management with Piranha CMS (.NET Core based). I oftentimes do things like pick up a shorter course or book on mobile development, IoT, etc. (basically areas other than what I get paid to do at work that interest me).

    If I had to recommend the very first book to read on this list, it would be the Head First book. Then I would move over to the first book in the classic computer science list if you wanted to go towards understanding low-level details, but if that's not the case, move towards implementing something with Python, or taking a Python web dev course on Udemy.

    Other really cool languages IMO: Go, C#, Ruby, Javascript, amongst many more

    P.S. Another book from someone that was in a similar situation to you: https://www.amazon.com/Self-Taught-Programmer-Definitive-Programming-Professionally-ebook/dp/B01M01YDQA/ref=sr_1_2?keywords=self+taught+programmer&qid=1557324500&s=books&sr=1-2
u/terraneng · 1 pointr/learnprogramming

Code by Charles Petzold

Pretty low level but a nice read and very informative. I read it on a plane last year.

u/ceciltech · 1 pointr/AskElectronics

If you really want to understand how a processor works I recommend Code: The Hidden Language of Computer Hardware and Software. This is a great book that works its way through the history of the development of processors and the code that runs them - an easy read and so well written!

u/di0spyr0s · 1 pointr/resumes

Thanks so much!

Where do hobbies and interests go? Below Education somewhere? Sample stuff I could add:

  • I started sewing this year and have achieved my goal to knit and sew all my own clothes for 2015.
  • I play guitar, drums, and piano, and I'm learning to play bass. A friend and I started a band called OCDC, because we're n00bs and play the same thing over and over a lot.
  • I read insatiably. Most recently Code: The Hidden Language of Computer Hardware And Software and A Guide to the Good Life: The Ancient Art of Stoic Joy, but also the backs of cereal packages and the "In case of fire" escape instructions on doors if there's nothing else.
  • I'm from New Zealand and can, if necessary, butcher a sheep/pig/deer/rabbit, build a fence, milk a cow by hand (or milk several hundred, given a decent sized milking shed), TB test deer, fell trees, and use the word "munted" in a sentence.
  • I've ridden horses all my life and still volunteer occasionally as an equine masseuse for some of the carriage horses in Central Park.
  • I love automating stuff and am working on fully automating my home aquaponics set up: a combination of an aquarium and a grow bed which currently produces great quantities of grass for my cats to puke up.

    I had sort of planned to put all this stuff in my personal website - write ups of personal projects, a good reads feed, an "About me" section, and maybe a page of my sewing/knitting creations.

    I'll certainly look into adding some more personality into the resume design, it is currently the result of a google template, which is pretty blah.

    Again, Thanks so much for your feedback! It's been really helpful!
u/NothingWasDelivered · 1 pointr/computerscience

If you want a good, understandable explanation of this, read [Code](http://www.amazon.com/dp/B00JDMPOK2/ref=cm_sw_r_udp_awd_7uYQub1VZ80TX) by Charles Petzold. He basically walks you through building a CPU from the ground up.

It's an excellent laypersons explanation of how computers work at a very fundamental level. How you can use relays (and transistors, their analog) to read 1's and 0's and make decisions based on that. From there you get to machine language (physically encoded into the chip), and everything above that is basically abstraction.

u/kecupochren · 1 pointr/computerscience

I’m self-taught so I have lot of gaps. This book really helped with the hardware part though, it goes from morse code all the way up to full blown PC - https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/yaymountainbiking · 1 pointr/learnprogramming

This, like all things, contains a varying amount of truthiness. I agree that beginners shouldn't use an IDE, simply because they need to learn how to build an application from the ground up. However, professionally my work definitely benefits from an IDE, especially with larger code bases or when I get stuck with something stupid like Java or Groovy. And sooner or later you WILL get stuck with something stupid like Java or Groovy.

Perhaps the best advice is to learn how a computer actually works. C, for example, is more representative of how a computer works than something like Javascript. Assembly, x86 or otherwise, is even better. Code by Charles Petzold is a good primer.

u/Bozo_The_Brown · 1 pointr/computerscience

Let's convert 65 to A:

01000001 // 65

00110000 // "A" bitmap 8x8 pixels
01111000
11001100
11001100
11111100
11001100
11001100
00000000

This can be accomplished in hardware with an EEPROM, similar to the way you implement the 7-segment display in Ben's series. But instead of 4 bits representing a number decoded to 7 bits for the display (4 to 7), this is 7/8 bits representing the char code decoded to 8x8 bits for the display (7/8 to 64).

You could have 8x EEPROMS that each represent a row of output. Feed the same 7/8-bit address (char code) into all of them, and connect the outputs to a grid of lights. This is just for one character.

For a more practical idea, maybe connect the char code output to an arduino that does the decoding and renders on your computer monitor.

Conversion in software follows the same principle, just mapping the char codes to bitmaps. Not something that would be fun at the assembly level XD
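
For what it's worth, the software version of that mapping is just a table lookup. Here is a minimal sketch (the 'A' glyph is the bitmap from above; the array and function names are made up for illustration):

    #include <iostream>

    // Software "character ROM": index by char code, get an 8x8 glyph,
    // one byte per row, most significant bit = leftmost pixel.
    const unsigned char GLYPH_A[8] = {
        0b00110000, 0b01111000, 0b11001100, 0b11001100,
        0b11111100, 0b11001100, 0b11001100, 0b00000000,
    };

    void drawGlyph(const unsigned char rows[8]) {
        for (int row = 0; row < 8; ++row) {
            for (int bit = 7; bit >= 0; --bit)
                std::cout << (((rows[row] >> bit) & 1) ? '#' : ' ');
            std::cout << '\n';
        }
    }

    int main() {
        // A full font table would be glyphs[charCode]; 65 ('A') shown here.
        drawGlyph(GLYPH_A);
    }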

Code by Petzold
and N2T for more fun.

u/ChickeNES · 1 pointr/programming

This is a very good book on the subject: http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/

Myself, I wanted to know how things work so I wrote a NES emulator in C, starting with just the hardware docs and not even really knowing C yet. A bit of a crazy way to start out, but then again now I'm writing an operating system from scratch to learn how they work, so I guess the craziness never really left.

u/KaiserTom · 1 pointr/factorio

If you want to understand it, read through Code: The Hidden Language of Computer Hardware and Software. It's not a giant textbook, just a regular book (with figures to help explain subjects too) which means it is concise in its material and not overly verbose. It will give you a complete rundown of what a computer is and how it works at its most base level, starting from the concepts of morse code and electricity up to binary and mechanical relays and then the instruction codes and massive banks of transistors which end up forming what we can definitely call "a computer". It's a great book for anyone trying to delve into computer science or even just trying to understand it as a hobby.

u/JimWibble · 1 pointr/Gifts

He sounds like a younger version of myself! Technical and adventurous in equal measure. My girlfriend and I tend to organise surprise activities or adventures we can do together as gifts, which I love - it doesn't have to be in any way extravagant, but having someone put time and thought into something like that is amazing.

You could get something to do with nature and organise a trip or local walk that would suit his nature photography hobby. I love to learn about new things and how stuff works, so if he's anything like me, something informative that fits his photography style, like a guide to local wildflowers or bugs, would go down well. I don't know much about parkour, but I do rock climb, and a beginner's bouldering or climbing session might also be fun and something you can do together.

For a more traditional gift Randall Munroe from the web comic XKCD has a couple of cool books that might be of interest - Thing Explainer and What If. Also the book CODE is a pretty good book for an inquisitive programmer and it isn't tied to any particular language, skillset or programming level.

u/EibeMandel · 1 pointr/ProgrammerHumor

I can highly recommend the book Code: The Hidden Language of Computer Hardware and Software, it helped me overcome my fear of hardware.

u/dm0x48 · 1 pointr/Unity3D

Unity has already been well covered here by other redditors.

Nevertheless, if you are also in the process of learning how to program, I would like to contribute with two pointers.

Code: The Hidden Language of Computer Hardware and Software, by Charles Petzold

For the very basics of computer science

https://www.amazon.com/dp/0735611319

Inside C#, by Tom Archer and Andrew Whitechapel

This is quite old now but very easy to read and good for understanding the language

https://www.amazon.com/dp/0735616485

If you are not in the US, just google around for other sources.

Happy hacking

u/SlinkyBlue · 1 pointr/ITCareerQuestions

Thank you for the reply, that's very helpful. Obviously I have been able to benefit from the book I already have, so it makes sense to continue along that avenue. [This](https://www.amazon.com/dp/0735611319/ref=cm_sw_r_cp_awdb_sq.azbES12DCH) is the book that has helped me out.

I live in Sacramento. There is a massive health care and education based economy here that has an expanding infrastructure that I want to get in on ASAP.

u/MelAlton · 1 pointr/cscareerquestions

Ah those are both excellent suggestions!

Code is a great book - if a newcomer to programming reads Code and doesn't find it interesting, that's not a good sign, as all the concepts in the book are vital and presented very well. Link: Code by Charles Petzold

And a medical science background is a great lead-in to bioinformatics - in that area OP would have an edge over straight-CS programmers. And bioinformatics uses a lot of Python, which is pretty easy to learn.

u/perrylaj · 1 pointr/learnpython

This isn't a programming book, or even a strictly computer science book. It's about Code, the history, the mindset, the means of communicating in binary/encoded processes, the evolution of signals. It touches on electronics, computers, coding, communication and so many other things. It's great for getting some history in an enjoyable format and lays some groundwork in understanding the roots of modern computing. I read it after having some years of experience in CS and still learned a lot of neat ways to look at the field from different perspectives.

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1418450151&sr=8-1&keywords=code+the+language+of&pebp=1418450154033

Highly recommend it.

u/hamishtarah · 1 pointr/AskComputerScience

Charles Petzold's book "Code" gets recommended a lot for understanding how computers work, from a very basic level.

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

There are even several people that have built relay computers similar to those discussed by Petzold.

http://nablaman.com/relay/

http://web.cecs.pdx.edu/~harry/Relay/

u/eolith · 1 pointr/hardware

The book that fits here is Code, by Charles Petzold. Not a textbook, more a general popular-science CS book, but it's gold. The author explains the very basics of how a computer works in a very gradual way. You end up thinking, as you said, it's amazing what we have. Complex async graphical UIs, distributed systems, virtualization... I've worked in IT for more than 10 years and still can't believe we have it.

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/ibrokedown · 1 pointr/learnprogramming

A good, if somewhat dated (and dense) traversal through this is the book Code by Charles Petzold.

u/Zeroe · 1 pointr/learnprogramming

I found the Build a Modern Computer: From NAND to Tetris course to be very informative regarding the basics of computer architecture. They also have a book in print that covers all the same material, and the first half of the book and course are available on their website for free.

If you'd like something that is less detailed and just covers the ideas, I recommend the book Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. It starts from the basics of encoding information using systems like Morse code and binary numerals and shows how combining this with Boolean operations represented in hardware, we can construct machines that carry out general, programmable computations. He shows simple diagrams of relays and how those are assembled into gates, memory, and more complicated structures.
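
For a taste of what both resources build up from, here is a minimal sketch (my own toy example, not code from either book) of deriving the other gates and a half adder from NAND alone:

    #include <iostream>

    // Everything below is built from NAND alone, NAND-to-Tetris style.
    bool NAND(bool a, bool b) { return !(a && b); }
    bool NOT(bool a)          { return NAND(a, a); }
    bool AND(bool a, bool b)  { return NOT(NAND(a, b)); }
    bool OR(bool a, bool b)   { return NAND(NOT(a), NOT(b)); }
    bool XOR(bool a, bool b)  { return AND(OR(a, b), NAND(a, b)); }

    // Half adder: sum = a XOR b, carry = a AND b.
    void halfAdder(bool a, bool b, bool& sum, bool& carry) {
        sum   = XOR(a, b);
        carry = AND(a, b);
    }

    int main() {
        for (int a = 0; a <= 1; ++a)
            for (int b = 0; b <= 1; ++b) {
                bool s, c;
                halfAdder(a, b, s, c);
                std::cout << a << "+" << b << " = carry " << c
                          << ", sum " << s << "\n";
            }
    }

Chain two half adders plus an OR and you get a full adder; chain full adders and you get the multi-bit adder both books walk you through.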

I highly recommend both resources.

u/juniordevradio · 1 pointr/programming

Tl;dr: Code: The Hidden Language of Computer Hardware and Software is a fantastic primer for this course.

u/tlazolteotl · 1 pointr/learnprogramming

If you are interested in theory, a great book is Code by Charles Petzold.

u/skamansam · 1 pointr/explainlikeimfive

Software on your computer tells the CPU how to send instructions to a given piece of hardware. That software is called a "device driver." If you want to go really deep into the answer, check out the book Code, by Charles Petzold (http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319). It is probably the best book on the subject, and is written in an ELI5 manner.

u/Headpuncher · 1 pointr/ProgrammerHumor

I had PTSD from your class and I wasn't even in attendance.

Obligatory mention of Code book that explains logic gates and some other stuff.

u/lazyAgnostic · 1 pointr/santashelpers

For programming, what kind of programming is he into? Here are some cool programming books and things:

  • Automate the Boring Stuff with Python This book has a lot of beginner projects that are actually useful.

  • Arduino A little microcontroller board that he can use to make cool projects. I'm a software engineer and I had fun playing around with this. Plus, you can use it for actual useful things (I'm planning on making an automatic plant waterer, but you can look online for all the awesome stuff people have made).

  • Raspberry Pi Similar to the arduino but it's a full computer. For more software-heavy projects than the arduino. I'd probably recommend starting with the arduino.

  • Great book about how code and computers actually work that's geared towards the "intelligent layperson" link.

  • If he's already programming and wants to create games I can recommend this one. Not good for beginners though.

  • If you want to give him a well written tome about game programming here it is. Again, not really for beginners but really good for someone wanting to learn about game programming
u/chalcidfly · 1 pointr/programmer

As always, read the docs first: [Code by Charles Petzold](https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319). At least give it a skim. Great book, worth the $20 for sure. That'll give you a really, really good intellectual basis for how computer hardware works.

Then, you could try getting a board like the Raspberry Pi or an Arduino or one of the millions of other options. Don't just get the board though; if you can afford it, get as many of the add-ons and books as you can find and try to build something simple, like a computer. (Not too difficult with a Raspberry Pi.)

Or, if you want to just manipulate some LED stuff, you can buy some little LED lights and plug them straight into your microcontroller.

Whatever you do, it helps to document it through a blog or Facebook or Twitter or whatever; it'll keep you accountable.

u/VonAcht · 1 pointr/AskElectronics

I always suggest this book. It's pretty good for an introduction to electronics/basic computers. Read it and see if you like it. There's also a list of resources for beginners in this sub's FAQ, but most of them are electronics only.

u/FetusFeast · 1 pointr/books

Let's see...

u/Foryourconsideration · 1 pointr/programming

I found a great book at Chapters in Canada called Code. It's certainly very interesting as it deals with all sorts of "code" from Morse to C# and it looks at how they are related. A surprising break from tutorial books, which I also love.

u/baldhippy · 1 pointr/AskReddit

This book does a good job of explaining it. In it the author starts off with a simple relay which is either on(1) or off(0) and slowly builds it up in complexity until he builds a CPU. Really interesting read.

u/topherker · 1 pointr/funny

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

You would love this book if you haven't read it already.

u/t3h2mas · 1 pointr/learnprogramming

I think your plan is just fine. You may be distracted by the large number of choices you find along your way - this thread is clear proof of that. Stay focused.

If you intend on starting with C I suggest a reading list kinda like this.

  • Code (good to read before you start to code)

  • K & R - The C Programming Language

  • LCTHW - Learn C the Hard Way (free to read from the site.)

    And then begin Java study. The first two may be a bit heavy at times... Work your way through and do your best, but don't expect to get everything.

    You could start with Java. There's a fine community and great resources. I say go for the C -> Java route and get a taste of some low level fundamentals.
u/redbuurd · 1 pointr/learnprogramming

I'm going to be that guy...I know you asked for documentaries, but if you haven't read this book already, you should. One of the best explanations of low level computing that I've seen.

u/beefcheese · 1 pointr/techsupport

If you're looking for literature, check out nand2tetris and Code: The Hidden Language of Computer Hardware and Software

u/mech_eng_lewis · 1 pointr/computerscience

Read this book: https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

It goes from learning to count in binary to building flip-flops, adder circuits, muxes, comparators, shift registers, etc. There's hardly any maths; you just need to know basic algebra and how to add and multiply. The book teaches you the rest.

If you like hardware you'll love this book.

I'm an Electronic Engineering student AMA.

u/LordPachelbel · 1 pointr/explainlikeimfive

Charles Petzold's Code: The Hidden Language of Computer Hardware and Software is a great book with several chapters about this topic and topics that are closely related to it, such as Morse code, Boolean algebra, the telegraph system, electromechanical telephone relays, etc. Those chapters explain really well how everything we do with computers is ultimately handled by tiny electrically controlled switches that are connected together in special ways.

u/crabbone · 1 pointr/learnprogramming

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319 this one is a nice book. It's not exactly a way to teach oneself to code in a particular language for a particular job position, but, if you have time to spare, it gives you a lot of background knowledge / motivation behind some of the basic concepts of programming.

u/random012345 · 1 pointr/learnprogramming

Books on project management, software development lifecycle, history of computing/programming, and other books on management/theory. It's hard to read about actual programming if you can't practice it.

Some of my favorites:

  • Code: The Hidden Language of Computer Hardware and Software - GREAT choice, and I notice you already have it listed. Possibly one of my favorites, and it should be on the reading list of everyone who is involved in IT somehow. It covers how computers and programming evolved and gets you into a great way of thinking.

  • The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography - Another great history book on code and how things came to be. It's more about crypto, but realistically computing's history is deeply rooted in security, crypto, and ways to pass hidden messages.

  • Software Project Survival Guide - It's a project management book that specifically explains it in terms of software development.

  • The Art of Intrusion: The Real Stories Behind the Exploits of Hackers, Intruders and Deceivers - A fun collection of short hacking stories compiled and narrated by Kevin Mitnick, one of the most infamous hackers. Actually, any of Mitnick's books are great. There's a story in there about a guy who was in jail, learned to hack while inside, and got all kinds of special privileges with his skills.

  • Beautiful Data: The Stories Behind Elegant Data Solutions - Most of the books in the "Beautiful" series are great and insightful. This is one of my more favorite ones.

  • A Guide to the Project Management Body of Knowledge: PMBOK(R) Guide - THE guide to project management from the group that certifies PMP... boring, dry, and great to help you get to sleep. But if you're committed enough, reading it inside and out can help you get a grasp of project management and potentially line you up to get certified (if you can get the sponsors and some experience to sit for the test). This is one of the only real certifications worth a damn, and it actually can be very valuable.

    You can't exactly learn to program without doing, but hopefully these books will give you good ideas on the theory and management side and the best understanding for when you get out. They should give you an approach many here don't have: realizing that programming is just a tool to get to the end, and that you can know, before you even touch any code, how best to organize things.

    IF you have access to a computer and the internet, look into taking courses on Udacity, Coursera, and edX. Don't go to or pay for any for-profit technical school, no matter how enticingly their marketing tells you you'll be a CEO out of their program.
u/gfawke5 · 1 pointr/computerscience

I recommend you take a look at this book. It builds a computer from scratch, and the author has made it extremely easy to follow. You could even apply what you learn there in a physics sandbox and make your own ALU/CPU.
Hope this helps.

u/extra_specticles · 1 pointr/AskMenOver30

Before you commit to it, read Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. If it fires your imagination then computer programming may be for you.

Another one to read is The Soul of a New Machine by Tracy Kidder, which is much older, but easily readable by non-coders. Again, if it fires your imagination then coding might be for you.

CS can lead to many many careers - many more than when I did my degree (80s), but you need to understand where the world of computers is moving to and where you want to be in that space.

If you're just looking for more money, then perhaps you shouldn't be looking at coding as a panacea. Don't get me wrong, coding is a fantastic thing to do - if it floats your boat. However, its main problem is that you constantly have to keep yourself up to date with new technologies and techniques. This requires you to have the passion and self-motivation to do that training.

I've been coding since I was 11 (1978) and have seen many, many aspects of the industry and the trade. I will concur with some of the comments here that indicate that the degree itself isn't the answer, but could be part of it.

Whatever you decide - good luck!


u/mcandre · 1 pointr/cscareerquestions

I wouldn't try to read programming tutorials during commutes, as programming is best learned by trying out exercises in a real text editor / terminal as you follow along in the book. Only something like Microsoft Press's Code would make for light commuter reading.

As for podcasts, The Verge is a fun tech podcast.

I've started listening to Welcome to Night Vale, a fun, nontech podcast.

u/coned88 · 1 pointr/linux

While being a self-taught sysadmin is great, learning the internals of how things work can really extend your knowledge beyond what you may have considered possible. This starts to get more into the CS portion of things, but who cares - it's still great stuff to know, and if you know this you will really be set apart. I'm not sure if it will help you directly as a sysadmin, but it may quench your thirst. I'm both a programmer and unix admin, so I tend to like both. I own or have owned most of these and enjoy them greatly. You may also consider renting them or just downloading them. I can say that knowing how things operate internally is great; it fills in a lot of holes.

OS Internals

While you obviously are successful at running and maintaining unix-like systems, how much do you know about their internal functions? While reading source code is the best method, some great books will save you many hours of time and will be a bit more enjoyable. These books are amazing:
The Design and Implementation of the FreeBSD Operating System

Linux Kernel Development
Advanced Programming in the UNIX Environment

Networking

Learning how networking actually functions at the code level is really interesting. There's a whole other world below the implementation. You likely know a lot of this.
Computer Networks

TCP/IP Illustrated, Vol. 1: The Protocols

Unix Network Programming, Volume 1: The Sockets Networking API

Compilers/Low-Level Computer Function

Knowing how a computer actually works - from electricity and EE principles, through assembly, to compilers - may also interest you.
Code: The Hidden Language of Computer Hardware and Software

Computer Systems: A Programmer's Perspective

Compilers: Principles, Techniques, and Tools

u/n06 · 1 pointr/ECE

This isn't a textbook, but it is one of my favorite down-to-earth books about computing and electronics I have ever read. My physical computing teacher gave it to me as a gift. It is called Code and it's very cool, and pretty cheap. It's well written too.

u/Birdrun · 1 pointr/AskReddit

Try this book: "Code" by Charles Petzold. It's a wonderful summary of computer workings, starting with the most basic electronic principles, and building up to everything needed to build a usable (if simple) computer.

It's all very well written and quite enjoyable to read.

u/Kal5 · 1 pointr/ireland

If you're unemployed for over a year, there are two state provided education schemes called Springboard and Momentum whose courses you can apply for. By my understanding, Springboard courses are put on by colleges/universities while Momentum courses are put on by private training companies (Not sure about that though as I think some colleges are putting out courses under Momentum). Most last about 9 months while some are 6 months.

If you are interested in computer science, I recommend you take a course which teaches solely a programming language before you take any courses that promise the moon, sun and stars. These usually try to cram too much in, so you end up with a shite knowledge of everything.

Get learning a programming language out of the way and then everything becomes easier. Your choices are usually either c# or java, two object oriented languages. c# is microsoft's copy of java and is used to make stuff that runs on their architecture. Java is used in lots of places as it is fairly free to use. Android apps are made with it. If you know one, you know the other. There are minor differences. Basically if you know the concepts of any object oriented language, you can learn any of them very easily.

CCT used to put out a really excellent java only course but I don't see it there now. I only see this one which seems like an all encompassing one: http://www.springboardcourses.ie/details/3478

I really would suggest you write to them and ask if they might put one on next September, as there are two start times for programs throughout the year, so you'll see different programs on offer depending on when you search.

Currently I see two java courses on the Momentum scheme. Both outside Dublin. I can't say how good they might be.

Searching for java on Springboard only brings up these two courses; it looks like you would need to know java before attempting either of those. But just have a look around those sites and search for computer-related terms, "programming" etc.

If you learned java, you would be able to take this Android course on momentum:
www.devstream.io/android-developer
(can't see that listed on the momentum site for some reason but it is a momentum course)

You'd still have to pick up some database knowledge somewhere, but a book on mysql would be easy enough to understand and work through yourself. If you learned java, and then databases in your own time, and then did the android course, you could become an Android dev in about 2 years. Of course you'd be missing a lot of knowledge about how computers work, which degree courses don't teach well anyway, so to that end, just read this book:

http://www.amazon.co.uk/Code-Language-Computer-Hardware-Software/dp/0735611319/

u/DudeManFoo · 1 pointr/lostgeneration

I am an old linux guy (since '92-93-ish) and this reminds me of... me...

If you are still that guy that likes to just know, I (the unix guy) would recommend an awesome book for anyone wanting to emulate your / my upbringing; it is actually from Microsoft Press, called Code: The Hidden Language of Computer Hardware and Software

u/cdubose · 1 pointr/IWantToLearn

There is actually a system to Braille. I read the first part of the book Code, and it does a great job explaining how someone might have first conceived of a system like Braille. For instance, notice the letters A through J. Then notice the letters K through T. The Braille patterns for K through T are the exact same as those for A through J but with the lower-left dot filled in. Then notice the letters U through Z. With the exception of W, the last few letters of the Braille alphabet are like the first few letters, but with the bottom two dots filled in. (W doesn't match the pattern because W isn't part of nineteenth-century French, the native language of Louis Braille.)

Knowing some contextual information like this will help you memorize and understand the Braille alphabet better. I would start by learning the numbers associated with the different dot positions and go from there. This page is a good introduction I think.
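
To make the pattern concrete, here is a minimal sketch using the standard dot numbering (dots 1-3 down the left column, 4-6 down the right; the bitmask encoding and names are my own illustration):

    #include <cstdio>

    // Standard Braille cell: dots 1-3 down the left column, 4-6 down the
    // right. Encode dot n as bit (n - 1), so a cell fits in six bits.
    const unsigned AtoJ[10] = {
        0x01, 0x03, 0x09, 0x19, 0x11,  // A B C D E
        0x0B, 0x1B, 0x13, 0x0A, 0x1A,  // F G H I J
    };
    const unsigned DOT3 = 0x04, DOT6 = 0x20;

    int main() {
        const char* letters = "ABCDEFGHIJ";
        for (int i = 0; i < 10; ++i) {
            unsigned base = AtoJ[i];
            // K-T: the A-J shape with dot 3 added.
            std::printf("%c=%02X  %c=%02X", letters[i], base, 'K' + i, base | DOT3);
            // U, V, X, Y, Z: dots 3 and 6 added (W breaks the pattern).
            if (i < 5) std::printf("  %c=%02X", "UVXYZ"[i], base | DOT3 | DOT6);
            std::printf("\n");
        }
    }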

u/jonc211 · 1 pointr/learnprogramming

This is another book to have read if you're interested in this stuff.
It takes you from 1s and 0s to computers actually doing things.

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/_rs_ss · 1 pointr/romania_ss

But if you've already read what I said: no one thinks you're joking, and it's a good one. http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319.

u/KnipSter · 1 pointr/MechanicalKeyboards

There's a poetic coincidence with this keyboard as well.

Jeff Atwood, in his blog post introducing this keyboard, described the name as "a homage to one of his favorite books": Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.

I am also a fan of this particular book. I'm proud to be a contributor to its errata.

u/mlevin · 1 pointr/learnprogramming

I highly highly highly recommend the book Code by Charles Petzold. It seriously changed my life.

u/ienjoybuckyballs · 1 pointr/lostgeneration

Programming is not for everyone. People like to talk about future saturation of programmers, but I don't believe it will happen, simply because the majority of people will never cut it as programmers, and of those that do, most will be terrible at it. It's not a slight on your brain power or intelligence to say you won't be a good programmer; it is simply an observation of the way your brain and mind work and the way you think. Most people just flat out do not think the way a programmer needs to think. It doesn't mean you can't give it a go and it doesn't mean you'll fail, but it might mean you'll struggle to understand advanced concepts and may find yourself over your head and slow to adapt.

It's also important to make a distinction between a 'programmer' and a 'designer.' While there is plenty of work in web design, web design is not programming. Neither HTML nor CSS is a programming language. There is such a thing as web application development that involves HTML, CSS, and back-end programming, but this requires knowledge of one or more programming languages such as PHP, Ruby, or C#.

For anyone interested in programming I would recommend you read this book before you start trying to learn programming. http://www.amazon.com/gp/product/0735611319/ref=oh_details_o00_s00_i00?ie=UTF8&psc=1 If you find yourself lost or confused and cannot finish, programming is not for you. Again, that doesn't mean you're stupid or less intelligent than anyone else it is simply an observation that you think differently. It is not an observation of inferiority, I can't stress that enough. If you can finish this book you will likely find success learning advanced programming concepts. Pick a language, any language, and start learning.

u/snowe2010 · 1 pointr/explainlikeimfive

Along with Kngjon's comment I would suggest reading the book ["Code"](http://www.amazon.com/gp/aw/d/0735611319?pc_redir=1411971211&robot_redir=1). It's a very easy read and super fun also. You'll learn everything you ever need to know about computers. (Mostly)

u/cupkeyboardpaper · 1 pointr/0x10c

I consider myself C competent but I've never done any assembly programming. I ordered these two books today to supplement any internet resources I might come across:

Code: The Hidden Language of Computer Hardware and Software

The Elements of Computing Systems: Building a Modern Computer from First Principles

One of those (or maybe both) were mentioned in some forum (or maybe on Reddit) in reference to preparing to learn the language.

u/Amablue · 1 pointr/askscience

If you want a very good overview of how computers work, you should try reading the book Code. It starts off talking about codes, like Morse Code and Binary. Then it moves on to light switches and batteries, and other neat constructions you can make with switches and relays, then it shows you how to build a simple adder. By the end of the book the author has basically given you an overview of how computers work from the logic gates all the way up to the processors and operating systems. It's a really good book, and each chapter flows pretty well to the next and it explains things in ways that are easy to understand.

u/stucky602 · 1 pointr/explainlikeimfive

If you want to know more I would recommend checking out Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. It's not even really a coding book (well, it is but it isn't). He goes from the bare basics of the telegraph and works all the way up to computers. It explains everything in a way that makes so much sense.

I'm not a programmer. I only really know how to make code spit out Hello World. I love this book as it was so freaking interesting.

https://www.amazon.com/gp/product/0735611319/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1

u/JonnyRocks · 1 pointr/computing

As others have noted, your question needs to be fleshed out but this book will answer your question regardless:

http://amzn.com/0735611319

u/BeanerSA · 1 pointr/learnprogramming

Code, by Charles Petzold might be useful

http://amzn.com/0735611319

u/fuckjeah · 1 pointr/todayilearned

Yes I know, that is why I mentioned general purpose computation. See, Turing wrote a paper about making such a machine, but British intelligence, which funded him during the war, needed a machine to crack codes through brute force, so it didn't need general computation (his invention); still, the machine used fundamental parts of computation invented by Turing.

The ENIAC is a marvel, but it is an implementation of his work; he invented it. Even Grace Hopper mentions this.

What the Americans did invent there, though, was the higher-level language and the compiler. That was a brilliant bit of work, but the credit for computation goes to Turing, and for general purpose computation (this is why the award in my field of comp. sci. is the Turing award, why a machine with all 8 operations needed to become a general computer is called Turing complete, and why Turing, along with Babbage, is called a father of computation). This conversation is a bit like crediting Edison for the lightbulb. He certainly did not invent the lightbulb; what he did was make the lightbulb a practical utility by creating a longer-lasting one (the lightbulb's first patent was filed 40 years earlier).

I didn't use a reference to a film as a historical reference; I used it because it is in popular culture, which I imagine you are more familiar with than the history of computation, as shown by your not mentioning Babbage once, even though the original assertion was about the invention of "computation" and not the first implementation of the general-purpose computer.

> The Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.

Here is what von Neumann (American creator of the von Neumann architecture we use to this day) had to say:

> The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper, On Computable Numbers. Turing proposed a simple device that he called the "Universal Computing machine", later known as a Universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (a program) stored on tape, allowing the machine to be programmable.

> The fundamental concept of Turing's design is the stored program, where all instructions for computing are stored in the memory.

> Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.

TLDR: History is not on your side, I'm afraid. Babbage invented computation; Turing invented the programmable computer. Americans invented the memory pipelines, transistor, compiler, and first compilable programming language. Here is an American book by a famous Nobel prize winning physicist (Richard Feynman) where the roots of computation are discussed and the invention credit awarded to Alan Turing. It's called Feynman's Lectures on Computation; you should read it (or perhaps the silly movie is more your speed).

u/dnabre · 1 pointr/compsci

Feynman's Lectures on Computation

Definitely light reading. Some of the stuff seems a bit dated and some a bit basic, but Feynman has a way of looking at things and explaining them that is totally unique. (You might want to skip the chapter on quantum computing if you don't have the background.)

u/CypripediumCalceolus · 1 pointr/askscience

Feynman Lectures On Computation gives a lot of practical examples of how the laws of thermodynamics, engineering developments, and information theory limit information storage density in such systems. Yes, there is a limit, but it is very big and far away.

u/animesh1977 · 1 pointr/programming

As gsyme said in the comment, he covers bits from Feynman's book on computation (http://www.amazon.com/Feynman-Lectures-Computation-Richard-P/dp/0738202967). Basically the lecturer is trying to look at the electronic and thermodynamic aspects of computation. He refers to a review from Bennett (http://www.research.ibm.com/people/b/bennetc/bennettc1982666c3d53.pdf) @ 1:27. Apart from this, some interesting things like the constant 'k' @ 1:02 and reversible computing @ 1:26 are touched upon :)

u/IamABot_v01 · 1 pointr/AMAAggregator


Autogenerated.

Science AMA Series: I’m Tony Hey, chief data scientist at the UK STFC. I worked with Richard Feynman and edited a book about Feynman and computing. Let’s talk about Feynman on what would have been his 100th birthday. AMA!

Hi! I’m Tony Hey, the chief data scientist at the Science and Technology Facilities Council in the UK and a former vice president at Microsoft. I received a doctorate in particle physics from the University of Oxford before moving into computer science, where I studied parallel computing and Big Data for science. The folks at Physics Today magazine asked me to come chat about Richard Feynman, who would have turned 100 years old today. Feynman earned a share of the 1965 Nobel Prize in Physics for his work in quantum electrodynamics and was famous for his accessible lectures and insatiable curiosity. I first met Feynman in 1970 when I began a postdoctoral research job in theoretical particle physics at Caltech. Years later I edited a book about Feynman’s lectures on computation; check out my TEDx talk on Feynman’s contributions to computing.



I’m excited to talk about Feynman’s many accomplishments in particle physics and computing and to share stories about Feynman and the exciting atmosphere at Caltech in the early 1970s. Also feel free to ask me about my career path and computer science work! I’ll be online today at 1pm EDT to answer your questions.



u/n00bj00b · 1 pointr/askscience

I haven't heard of that book, I may have to check it out. I was going to recommend Lectures on Computation by Richard Feynman. It's one of the best books I've read on the subject; it starts out with just simple logic, excluding circuits and transistors, but eventually goes all the way to quantum computing.

u/BertRenolds · 1 pointr/textbooks

that's kinda ironic... I was about to start my search

Discrete Mathematics by Norman L. Biggs, 2nd edition.


https://www.amazon.ca/Discrete-Mathematics-Norman-L-Biggs/dp/0198507178

u/stewartr · 1 pointr/programming

In my MSCS program we spent a semester going through this. It was a hard class! As we were hurling question after confused question about our text

http://www.amazon.com/Introduction-Automata-Theory-Languages-Computation/dp/0201441241

(first edition), the instructor exclaimed, "no, yes, this is good!"

u/o0o · 1 pointr/programming

They're called ε transitions in the book from which I first learned (HMU):

http://www.amazon.com/Introduction-Automata-Theory-Languages-Computation/dp/0201441241

But, all subsequent classes I've been in have called them lambda.

I continue to call them ε transitions because before I had heard otherwise, I read a paper discussing the sorts of (equivalent) automata that are created by allowing multiple NFAs to run concurrently.

http://citeseer.ist.psu.edu/297888.html

In the paper, the sort of transition that is created due to the shuffle operator is called a "lambda transition," so that is what I think of when I hear "lambda".

I recommend the read. It introduces the shuffle operator, adds a new construct to the "Thompson Construction" method of RE->NFA, and discusses their equivalence to binary Petri Nets.

It should be noted that a review problem in HMU discusses the closure properties of shuffling two regular languages (i.e., the shuffle of 2 regular languages is still regular). Kozen also discusses the shuffle in his intro to automata book.
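
Since ε transitions keep coming up, here is a minimal sketch of the ε-closure computation, the step that lets the subset construction treat an NFA with ε (or λ) transitions like an ordinary one (the toy transition table is invented for illustration):

    #include <iostream>
    #include <set>
    #include <stack>
    #include <vector>

    // epsilonClosure: all states reachable from `start` using only
    // ε transitions, computed as a simple DFS over the ε edges.
    std::set<int> epsilonClosure(const std::vector<std::vector<int>>& epsEdges,
                                 const std::set<int>& start) {
        std::set<int> closure(start);
        std::stack<int> work;
        for (int s : start) work.push(s);
        while (!work.empty()) {
            int s = work.top();
            work.pop();
            for (int t : epsEdges[s])
                if (closure.insert(t).second)  // newly reached state
                    work.push(t);
        }
        return closure;
    }

    int main() {
        // ε edges of a made-up toy NFA: 0 -ε-> 1, 1 -ε-> 2, 3 -ε-> 1
        std::vector<std::vector<int>> eps = {{1}, {2}, {}, {1}};
        for (int s : epsilonClosure(eps, {0}))
            std::cout << s << " ";  // prints: 0 1 2
        std::cout << "\n";
    }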

u/jhill515 · 1 pointr/MachineLearning

Personally, I like "Introduction to Machine Learning" by Alpaydin.


I also strongly recommend reading "The Computational Complexity of Machine Learning" by Michael Kearns.

I agree with @machinedunlearned on the point that ML is a multidisciplinary field. I've been doing work in this field for several years, and I don't consider myself a subject matter expert on much outside of what I call Intelligent Systems. As such, I tend to get "tied at the hip" to a field expert when I'm applying ML to various problems. That said, note that ML has a wide range of techniques that are ever expanding since it's a hot area of research. First gain a broad understanding of what constitutes supervised, unsupervised, and reinforcement learning, then lean when each type of learning is best applied to various problems. That skill will prove invaluable. The references will touch on this some, but don't be afraid to try something different to learn something new!

u/Plutarch_Rime · 1 pointr/newjersey

The ultimate bathroom book is the New Hacker's Dictionary, based on the famous jargon file. It is a list of old computing slang and terminology up to, mostly, the 1980s.

Where known, the origin of the term is listed (this is edited by Eric Raymond), and you'll see a lot of schools you expect - MIT, Stanford (SAIL), etc.

You'll also see Rutgers pop up a surprising amount of times.

I used to study over on Busch in a basement in...I forget the building. This was the early 90s when they still had PLATO terminals set up and working. The whole building seemed to hum with technology.

I wasn't majoring in anything technical but even then Rutgers had a surprisingly high-tech groove. There are Usenet posts out there that I made in 1993 from the Livingston library. Thanks to Rutgers, I got on the Internet via shell (for free) in 1991. Maybe a lot of Universities offered those accounts universally to students but I remember actually staying up all night to get into the library when it opened in the morning, just to screw around on FTP sites.

u/ReactsWithWords · 1 pointr/explainlikeimfive

OK, in terms of Authority Figures, how about the Internet Engineering Task Force?

Or, if you want historical documents, you can't beat The Jargon File (note: if you insist on a dead-tree version of that, we've got you covered there, too - note how A) it's the 3rd edition, and B) it's copyrighted 1996, which means the first and second editions are even older).

I've yet to see a single source that's older to say otherwise. If you know of one, I'd love to see it.

u/Sawta · 1 pointr/AskReddit

If anyone is interested, you can also buy The New Hacker's Dictionary off of amazon, here

Note: I used my Amazon Associates reference code in that link, I hope people don't mind if I try to make $1 or so off of potential sales.

u/macemoth · 1 pointr/computerscience

Here's a PDF which I used as reference and which covers most concepts:

https://cglab.ca/~michiel/TheoryOfComputation/TheoryOfComputation.pdf


For more depth, this book is often seen as "the bible" of this topic:

https://www.amazon.com/Introduction-Automata-Theory-Languages-Computation/dp/0321455363


If you're looking for exercises, this could be a good resource (designing Turing machines especially is a matter of practice):

https://www.cl.cam.ac.uk/teaching/exams/pastpapers/t-ComputationTheory.html


If primitive recursive functions are also relevant for you, I can strongly recommend this video:

https://www.youtube.com/watch?v=cjq0X-vfvYY

u/techgeek6061 · 1 pointr/engineering

Check out "The Definitive Guide to How Computers Do Math : Featuring the Virtual DIY Calculator." It's not a textbook, but it is a pretty good guide to computer architecture with a lot of hands-on labs to work through.

https://www.amazon.com/Definitive-Guide-How-Computers-Math/dp/0471732788

u/picado · 1 pointr/learnmath

Sipser on the theory of computation is the can't-go-wrong starting point. You can get an older edition cheap and it will be just as good.

Do the problems. Come back with questions.

u/CorruptLegalAlien · 1 pointr/AskReddit

College books are also much more expensive in the USA than in Europe.

For example:

$152.71 vs £43.62 ($68.03)

$146.26 vs £44.34 ($69.16)

u/leoc · 1 pointr/compsci

It's not free (in fact it's sickeningly expensive) but Sipser [amazon.com] is a very self-teachable (self-learn-from-able? :) ) text covering automata theory, computability theory, and complexity theory.

u/icelandica · 1 pointr/math

Work hard and you'll get there. I preferred the applied side of things, but if I had just stuck with pure math I think I would have eventually gotten a tenure-track position in mathematics.

My favorite book to this day for a beginner's course in computational complexity is still Michael Sipser's Introduction to the Theory of Computation; I highly recommend it. It might be a little too easy for you if you already have a base - let me know and I'll recommend more advanced books.

Here is a link to the book on Amazon, although any big college library should have it; if not, just have them order it for you. I've gotten my college's library to buy so many books that I wanted to read but not spend money on - you'd be surprised at how responsive they are to purchase requests from PhD candidates.

u/3rw4n · 1 pointr/compsci

Depending on the amount of energy you want to put into this: "Introduction to Lambda Calculus" by Henk Barendregt et al. is great (http://www.cse.chalmers.se/research/group/logic/TypesSS05/Extra/geuvers.pdf).

Study the proofs and do the exercises and you will learn a ton, quickly. You can also read "Propositions as Types" by Philip Wadler (http://homepages.inf.ed.ac.uk/wadler/papers/propositions-as-types/propositions-as-types.pdf) and pick up the "Introduction to the Theory of Computation" book (https://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/0534950973/)

Of course you don't need to read all of this to get a basic understanding of lambda calculus, but it's worth it if you want to understand for "real" so it sticks.

u/seepeeyou · 1 pointr/compsci

My local used book store has a copy of Sipser for $15 that I've been meaning to pick up. Considering the $143 price tag on Amazon, it's a pretty good bargain. I just don't know whether it's 1st or 2nd edition. Anyone have any idea if there are major differences?

u/Nerdlinger · 1 pointr/geek

Oi. Disclaimer: I haven't bought a book in the field in a while, so there might be some new greats that I'm not familiar with. Also, I'm old and have no memory, so I may very well have forgotten some greats. But here is what I can recommend.

I got my start with Koblitz's Course in Number Theory and Cryptography and Schneier's Applied Cryptography. Schneier's is a bit basic, outdated, and erroneous in spots, and the guy is annoying as fuck, but it's still a pretty darned good intro to the field.

If you're strong at math (and computation and complexity theory) then Oded Goldreich's Foundations of Cryptography Volume 1 and Volume 2 are outstanding. If you're not so strong in those areas, you may want to come up to speed with the help of Sipser and Moret first.

Also, if you need to shore up your number theory and algebra, Victor Shoup is the man.

At this point, you ought to have a pretty good base for building on by reading research papers.

One other note: two books that I've not looked at, but that are written by people I really respect, are Introduction to Modern Cryptography by Katz and Lindell and Computational Complexity: A Modern Approach by Arora and Barak.

Hope that helps.

u/propaglandist · 1 pointr/gaming

That's not an algorithms class. No, sirree, what you've got there is a theoretical computer science class. This is the Sipser on the board.

u/kaylee-anderson · 1 pointr/Documentaries

Get the book CODE by Charles Petzold. It's a super easy read, and by the time you're done with it you'll understand how CPUs work.

u/3burk · 1 pointr/learnprogramming

If you want to start at the ones and zeros and really understand how a computer makes calculations, there is no better book than:

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/HarambeBerlusconi · 1 pointr/italy

Since it sounds like you're just starting out and don't have any background, I'd advise against all the books that have been suggested to you, because even though some of them are good (Tanenbaum), if you have no idea what any of this is about they won't be very approachable, and may be too tedious.

Unfortunately, the book I'd recommend no longer seems to be available in Italian (it's not very hard to read in English), but you might still be able to find it: https://www.amazon.it/Code-Language-Computer-Hardware-Software/dp/0735611319/

It's a high-level introduction to the most important concepts in computing; it doesn't teach anything practical, but I think it's the best introductory text ever written. If you know nothing about computing, I would avoid reading anything else, even something more practical, before this one. It will give you the theoretical grounding you need to fully appreciate everything else.

u/Nasty_Nate93 · 1 pointr/learnprogramming

Computer Science as a whole is going to be hard to digest. I have heard good things about this book as a starting point, to give you a basic feel for it all (and to determine if it's the right career path for you). You will need to choose your first language to start studying as well. Python or Java are not bad options.

u/Cyphierre · 1 pointr/math

I highly recommend this for that purpose.

u/RadioRoscoe · 1 pointr/funny

I have played on that learning circuit board thing, and it is truly awesome. The books that come with it are top notch. A nice complement to it would be Petzold's book, Code.

u/dnew · 1 pointr/Unity3D

There was a book I read 40 years ago that covered basically everything from vacuum tubes and semiconductors up to chips. It was in the library, and it was like 800 pages long. I asked on reddit if anyone knew what it was, and someone pointed me at the newest edition. I don't really have time to go through all my comment history looking for "electronics book" or to write a program to do the same, but you should feel free to do so. :-) Then I got into assembly for the 8-bit CPUs, picked up the 16-bit and 32-bit CPUs of the day, and the mainframe stuff. Then I went back to school. :-)

However, all that said, this looks like what I read, and the intro sounds like he's describing the first edition I remember: https://smile.amazon.com/Electronic-Devices-Circuit-Theory-11e-ebook/dp/B01LY6238B/ref=mt_kindle

If you want more about assembler, flipping through this suggests it starts with the very fundamentals and goes through a fair amount: https://smile.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_5 If you already know how to program, and you understand the basics of how (for example) basic assembler language works, how the chip accesses memory, what an interrupt does, etc., then learning new assembler languages is pretty straightforward. Sort of like "I know Java, now I need to learn C#."

But honestly, at this point, I'd look online. When I learned all this stuff, textbooks were the way to go. Nowadays, everything moves so fast that you're probably better off finding a decent description online, or looking up an online class or something and seeing what texts they use.

If you don't want to learn assembler or hardware, but you still want to challenge yourself, the other thing to look into is unusual programming languages and operating systems. Things that are unlike what people now use for doing business programming. Languages like APL (or "J"), or Hermes, or Rust, or Erlang, or Smalltalk, or even Lisp or Forth if you've been steeped in OOP for too long. Operating systems like Eros or Amoeba or Singularity. Everything stretches your mind, everything gives you tools you can use in even the most mundane situations, and everything wonderful and wild helps you accept that what you're doing now is tedious and mundane but that's where you're at for the moment. :-) (Or, as I often exclaim at work, "My kingdom for a Java list comprehension!")

u/rocknerd32 · 1 pointr/learnprogramming

This book really helped me understand the hardware and how it functions. It may help you too.

Code: The Hidden Language of Computer Hardware and Software https://www.amazon.co.uk/dp/0735611319/ref=cm_sw_r_cp_api_KxpLxbD41R90A

Sorry for the spelling - I'm German.

u/XUtYwYzz · 1 pointr/educationalgifs

Code: The Hidden Language of Computer Hardware and Software is one of my favorite books and covers this topic. It's amazing.

u/stepstep · 1 pointr/compsci

It doesn't take much math to understand the basic principles of how a physical computer works (starting at the level of transistors, and not including all the software that runs on it). OP said he/she wants to work from the level of transistors upward, so he/she doesn't really need to know any chemistry or physics. In fact, OP doesn't even need much electronics knowledge (e.g. inductance, impedance, etc.), since most of the construction of a computer will be at the level of digital logic gates. "Engineering" is a broad term that encompasses all of these topics, so I won't consider it separately. Software development and computer science might be relevant once OP has conquered his/her quest of understanding how a physical computer works and wants to study higher abstractions, but this is not prerequisite knowledge. Industrial engineering will be useful only if OP actually wants to cheaply build practical computers out of silicon, but this is probably not the goal.

By the way, I second /u/azimuth's suggestion: Code: The Hidden Language of Computer Hardware and Software.
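
To make that gate-level starting point concrete, here is a small Python sketch (purely illustrative - the function names are mine, and real hardware has many details this ignores): treat NAND as the only primitive, derive the other gates from it, and wire up a one-bit half adder.

```python
# NAND as the sole primitive; every other gate is built from it.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """Add two bits: returns (sum_bit, carry_bit)."""
    return xor_(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {c}")
```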

u/pat_trick · 1 pointr/learnprogramming

She may be interested in these books (though they are a bit higher level):

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/

https://www.amazon.com/Break-Code-Cryptography-Beginners-Childrens/dp/0486291464/

I would ask what programming language they are learning in class.

I would NOT recommend Scratch as a programming language, though that's my personal opinion. Better to learn an actual language.

u/Teknull · 1 pointr/IWantToLearn

Is this it?

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

I am interested in building an 8-bit CPU as well.

u/batmannigan · 1 pointr/ECE

Have you read the good book? Joking aside, Code is an amazing book which really tied a lot of things together conceptually for me.

edit: My god I need to make a reddit bot, those are cool.


u/idoescompooters · 1 pointr/learnprogramming

Big Nerd Ranch's Objective-C is a great book. Take your time with it. Code: The Hidden Language of Computer Hardware and Software, here, is really good.

u/metawhimsy · 1 pointr/compsci

You may be interested in Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.

u/synt4xtician · 1 pointr/explainlikeimfive

Code is a great book that starts with this analogy and works its way up to advanced computer science and information technology concepts. Highly recommended if you're interested in this!

u/ctide · 1 pointr/ruby

Code is really low-level, but it's awesome for getting a good core understanding of how things work.

u/looeee · 1 pointr/math

some amazing books I would suggest to you are:

• Gödel, Escher, Bach

• Road to Reality by Roger Penrose

• Code by Charles Petzold

• Pi in the Sky by John Barrow

All of these I would love to read again, if I had the time, but none more so than Gödel, Escher, Bach, which is one of the most beautiful books I have ever come across.

Road to Reality is the most technical of these books, but gives a really clear outline of how mathematics is used to describe reality (in the sense of physics).

Code, basically, teaches you how you could build a computer (minus, you know, all the engineering. But that's trivial surely? :) ). The last chapter on operating systems is pretty dated now but the rest of it is great.

Pi in the Sky is more of a casual read about the philosophy of mathematics. But it's very well written - good night-time reading!

You have a really good opportunity to get an intuitive understanding of the heart of mathematics, which even at a college level is somewhat glossed over, in my experience. Use it!

u/marrick66 · 1 pointr/hacking

I own the first edition, and looking at the second, it does a much better job of giving some rudimentary programming basics. You might want to pair it with Code, which is great for getting an overall view of how computers work.

u/JumboJellybean · 1 pointr/learnprogramming

Here are my two big enthusiastic suggestions.

Sign up to Lynda. You get a 10-day free trial, and then it's $30 for a month. Watch Foundations of Programming: Fundamentals (4h 47m), Foundations of Programming: Data Structures (2h 29m), Programming Fundamentals in the Real World (3h 8m), and Python 3 Essential Training (6h 36m).

These are short crash courses, you obviously don't walk away a full-on programmer. But the main instructor Simon Allardice is excellent at explaining the basics and fundamentals in a very clear, natural way. I have taken university courses, I have watched MIT and Harvard courses, I have used a dozen tutorial sites and watched a bunch of lecturers and read three dozen books: the Lynda programs I linked are the best first-intro things I've seen. I strongly recommend that you watch at least the first two.

You might not understand it all, that's fine. Don't worry about what languages he uses for examples, 90% of stuff carries over between languages. If you can absorb a good chunk of that material it'll be a huge boost for you and a nice foundation to start on. You'll walk into your first real class in a better headspace, already knowing the gist of what you're going to flesh out and properly sink your teeth into. And if you find that the Lynda stuff really works well, look up their C and databases courses, since you'll wind up using that stuff too.

My second recommendation is that you buy and read Charles Petzold's wonderful book Code: The Hidden Language of Computer Hardware and Software. This book doesn't focus on a specific programming language, or how to implement programs, or the mathematics, or the theory. Instead it answers "So what is a computer actually doing under there when we program? What's a CPU, exactly, how does it understand words and numbers?" It does this in a very natural, accessible, for-the-layman way, starting with really simple analogies about signal flags and morse code, and before you know it, bam, you understand logic gates and binary arithmetic, and a lot of the mystery and magic of computers has dissolved for you.

u/a_redditor · 1 pointr/learnprogramming

Let me just say right off the bat that it sounds like you're well on your way to being a successful programmer.

One thing I can definitely suggest (which helped me a lot) is reading Code or some other book like it. It is effectively a ground-up guide to how computers and programming work in general. I had the fortune of reading most of it before I started my CS degree, and it really helped me breeze through my hardware courses.

As well, any books on data structures would probably be helpful, as this is one of the early topics covered in many CS programs. I can't suggest any specific books, but I'm sure others can.

Most of all I have to suggest just getting very comfortable with programming and learning several different languages. It looks like you're already well on your way with this, but the goal here is to have a strong passion for programming before college. That way, when you're up at 3 AM the night before an assignment is due, it's not because you procrastinated and you waited until the last minute to start because you loathe the thought of programming, but because you're so excited about making your code perfect and adding in additional functionality because you absolutely love programming.

u/pubgrub · 1 pointr/explainlikeimfive

You might want to look at this book

or this webpage

u/dybt · 1 pointr/Minecraft

While not specific to redstone/minecraft, this book is a great introduction to the workings of computers and the concepts apply to redstone computers as well. It takes you from why flicking a switch turns on a lightbulb, to a stage where you can make sense of and implement fairly complicated circuit diagrams such as this in the length of an average novel.

In terms of applying it to Minecraft, just read the redstone-related pages in the wiki. I think this relies on the fact that a redstone signal can pass through glass but not solid blocks, so you can store data using glass as a 1 and another block as a 0, and the pistons push them around in a circuit.
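
If it helps to see that storage trick outside the game, here is a toy Python model of it (my own sketch with made-up names; real piston memory in Minecraft has timing quirks this ignores): a ring of blocks is pushed around one cell per tick, and the read position reports a 1 exactly when the block sitting there is glass, since only glass lets the signal through.

```python
from collections import deque

def make_loop(bits):
    """Encode a bit string as a ring of blocks: glass = 1, stone = 0."""
    return deque("glass" if b == "1" else "stone" for b in bits)

def tick(loop):
    """One piston push: rotate the ring, then read the cell at the sensor."""
    loop.rotate(1)
    return 1 if loop[0] == "glass" else 0

memory = make_loop("1011")
print([tick(memory) for _ in range(8)])  # the stored 4-bit pattern cycles forever
```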

u/Junaos · 1 pointr/learnpython

No problem :) If you want to read a little further on why this happens, take a look at Two's Complement, which explains a bit more about how this works, and why it works, at the binary level.

Also, this book does a much better job explaining these concepts than I ever could.
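
For a quick feel of what two's complement actually does, here is a short Python sketch (my own example, not from the book): the encoder masks a signed value down to a fixed width, and the decoder subtracts 2^bits whenever the sign bit is set.

```python
def to_twos_complement(value, bits=8):
    """Encode a signed integer as a two's-complement bit string."""
    return format(value & ((1 << bits) - 1), f"0{bits}b")

def from_twos_complement(bit_string):
    """Decode a two's-complement bit string back to a signed integer."""
    n = int(bit_string, 2)
    if bit_string[0] == "1":      # sign bit set: value is n - 2**bits
        n -= 1 << len(bit_string)
    return n

print(to_twos_complement(-1))            # 11111111
print(to_twos_complement(-128))          # 10000000
print(from_twos_complement("11111011"))  # -5
```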

u/lucasmez · 1 pointr/NoStupidQuestions

I would recommend this book to start out.

u/clever-clever · 1 pointr/explainlikeimfive

If you're interested, this is an amazing read.

u/zeitistjetzt · 1 pointr/explainlikeimfive

For further reading
http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

It starts from the beginning with binary, Morse code, light switches, etc., gradually building up to motherboards and operating systems. It made me feel like I could build a computer from scratch.

u/Blufalcon94 · 1 pointr/AskNetsec

Check out Code: The Hidden Language of Computer Hardware and Software https://www.amazon.com/dp/0735611319/ref=cm_sw_r_awd_Rdrfub17VD49B

u/maratc · 1 pointr/buildapc

Petzold's Code is a very nice book.

u/reeecheee · 1 pointr/compsci

This book was absolutely awesome and will take you logically from a flashlight to logic gates to an entire computer with no EE background:

Code: The Hidden Language of Computer Hardware and Software by Charles Petzold http://www.amazon.com/dp/0735611319/ref=cm_sw_r_udp_awd_X-7-tb14JMMZV

u/Wittgenstienwasright · 1 pointr/computerscience

I am sorry, but that is simply not true. The admins on this site are quite careful about such things. Without the age of the child it is impossible to provide a suitable answer. At eleven, possibly, as I recommended in my DM, perhaps an RPi and surrounding projects as shown on the site; at thirteen, then Python, and later on as experience dictates. Please do not berate a fellow father trying to educate people, especially for free. Good luck, and before your teenager tries anything new, perhaps you can find this physical tome.

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319



The first chapter is about communication.

u/scrabbles · 1 pointr/explainlikeimfive

Further to the excellent comments already left, if you want to investigate things later on in your own time, you might enjoy this book: Code: The Hidden Language of Computer Hardware and Software. It explains programming and computer hardware fundamentals using excellent (and real, historically based) examples. I think even a 10-year-old would get a lot out of it; I would go so far as to recommend it to tech-inclined parents for themselves and their children.

u/bladekill97 · 1 pointr/learnprogramming

I put C++ in a similar boat as C; both are good languages to start with.

Also, HTML is not a programming language; rather, it is a markup language used for structuring websites.

If you are going with C++ I would suggest:

u/TomCoughlinHotSeat · 0 pointsr/learnprogramming

Sipser's book is basically free on Amazon if you buy used old editions: http://www.amazon.com/gp/aw/ol/053494728X/ref=olp_tab_used?ie=UTF8&condition=used

It basically just asks what restricted types of computers can do: what happens if you have a program but only a finite amount of memory, or if you have infinite memory but it's all stored in a stack, or if you have infinite memory with random access.

Turns out lots of models are equal and lots are different, and you can prove this. Also, these models inspire and capture lots of your favorite programming tools, like regexes (= DFAs) and parser generators (= restricted PDAs) for your favorite programming languages.
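
As a concrete taste of the "regex = DFA" point, here is a toy Python sketch (the machine is my own arbitrary example): a two-state DFA whose finite memory is just the parity of the 1s seen so far, accepting the same strings as the regex 0*(10*10*)*.

```python
# A DFA: finite memory only, one transition per (state, symbol) pair.
DELTA = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd",  "0"): "odd",  ("odd",  "1"): "even",
}

def accepts(s, state="even"):
    """Accept binary strings containing an even number of 1s."""
    for ch in s:
        state = DELTA[(state, ch)]
    return state == "even"  # "even" is the sole accepting state

print(accepts("0110"))  # True: two 1s
print(accepts("1101"))  # False: three 1s
```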

u/kasbah · 0 pointsr/ECE

I have just started reading "Code" by Charles Petzold. I think this book would have been a godsend when I was just starting out.

http://www.amazon.com/Code-Dv-Undefined-Charles-Petzold/dp/0735611319/ref=pd_bbs_1?ie=UTF8&s=books&qid=1219981358&sr=8-1

u/Flofinator · 0 pointsr/learnprogramming

Yikes! Well it's going to be pretty hard for you to really understand how to do Python without actually coding in it.

The one thing you could do though is get a book with examples and write them down and try to modify the examples to do something a little extra while at work.

I find the http://www.headfirstlabs.com/books/hfpython/ books the absolute best books for almost anything if you are just starting out. The Java book is especially fun!

I know this isn't exactly what you are asking but it might be a good resource for you to start using.

Another great book that will teach you parts of the theory, and has really good examples on how computers work is http://www.amazon.com/Code-Language-Computer-Developer-Practices-ebook/dp/B00JDMPOK2/ref=sr_1_1?s=digital-text&ie=UTF8&qid=1457746705&sr=1-1&keywords=code+charles+petzold .

That really helped me think about computers in a more intuitive way when I was first starting. It goes through the history, on to what an adder is, and more. I highly recommend that book if you want to understand how computers work.

u/jackmott · -2 pointsr/programming

Well, I haven't suggested that you know everything down to subatomic physics; I've suggested you understand assembler. There are good reasons for this even if you are a Java programmer, as the JVM has its own bytecode, which you might want to inspect at times, and that turns into real machine instructions, which you might also want to inspect to understand whether your abstraction is turning out the way you want.

Example: Yesterday another programmer and I were not sure if the JVM would produce SIMD instructions for a simple loop doing math on doubles. He thought it should, from the documentation, but the speed of the function suggested it wasn't.

By looking at the assembler, we were able to figure it out. While it was using SSE floating-point instructions, it was only using a single lane of the SSE registers (64-bit compilers do this for obscure reasons I can get into, if you change your mind about willful ignorance one day).

So in short, no, the JVM wasn't vectorizing the loop, probably because it was floating point, which is not associative for addition. It may do it with integers. To find out I'll go check the disassembly.
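
The non-associativity point is easy to demonstrate; here is a quick Python sketch (Python floats are the same IEEE 754 doubles the JVM uses), showing that regrouping a sum - which is exactly what SIMD vectorization does - can change the answer:

```python
# IEEE 754 addition is not associative, so a compiler may not
# reorder a floating-point reduction without explicit permission.
x, y, z = 0.1, 0.2, 0.3
print((x + y) + z)  # 0.6000000000000001  (strict left-to-right order)
print(x + (y + z))  # 0.6                 (the regrouped, "vectorized" order)
```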

Anyway, the more you know about how the hardware works, the better you can use high level languages and abstractions, because you will understand how to structure them so the assembly that the JVM or compiler produces is efficient.

It is a bit old but still relevant; a good book on this subject:
https://www.amazon.com/Write-Great-Code-Low-Level-High-Level/dp/1593270658

u/Sonoff · -15 pointsr/speedrun

If you did not understand what was happening but thought "wow" and wish you had, reading this book is a great way to start (maybe the best): https://www.amazon.com/gp/aw/d/0735611319/ref=mp_s_a_1_1