Reddit reviews Feynman Lectures On Computation (Frontiers in Physics)

We found 13 Reddit comments about Feynman Lectures On Computation (Frontiers in Physics). Here are the top ones, ranked by their Reddit score.


13 Reddit comments about Feynman Lectures On Computation (Frontiers in Physics):

u/twopoint718 · 14 points · r/programming

My favorite example of a mind-bogglingly well-staffed company was "Thinking Machines Corporation":

(Taken from Wikipedia, not exhaustive!):

  • Greg Papadopoulos (Sun CTO)
  • Guy L. Steele Jr. (Scheme co-designer)
  • Brewster Kahle (Internet Archive, Founder)
  • Marvin Minsky (AI pioneer)
  • Doug Lenat (AI pioneer, Cyc project)
  • Stephen Wolfram (Mathematica creator)
  • Eric Lander (Human Genome Project; co-chair of President Obama's Council of Advisors on Science and Technology)
  • Richard Feynman (Nobel Prize, Physics, Manhattan Project)
  • Alan Harshman (High-performance computing, AI)
  • Tsutomu Shimomura (security expert, notable for his involvement in the arrest of Kevin Mitnick)

I found this when I was reading about Feynman one time. This isn't meant to disparage Google at all; it's just an amazing list.

EDIT: I forgot to mention what I started out writing. Feynman produced an excellent book, The Feynman Lectures on Computation, which, if you're familiar with the physics version, is just as lucid, short, and informative. I think it would make an excellent textbook for a course in computer architecture.

u/sleepingsquirrel · 9 points · r/ECE
u/allforumer · 6 points · r/programming

You might like this book -

Feynman Lectures on Computation

u/[deleted] · 4 points · r/Physics

As a computational physicist, I'd say Feynman's Lectures on Computation is my favorite book.

You'd be surprised how much of it is still relevant. Feynman also poses challenges that can be worked on at any skill level (including problems he hadn't solved himself).

u/Augur137 · 3 points · r/compsci

Feynman gave a few lectures about computation. He talked about things like reversible computation and thermodynamics, quantum computing (before it was a thing), and information theory. They were pretty interesting. https://www.amazon.com/Feynman-Lectures-Computation-Frontiers-Physics/dp/0738202967

u/TezlaKoil · 2 points · r/compsci

I think he may have meant the other Feynman lectures.

u/n00bj00b · 1 point · r/askscience

I haven't heard of that book; I may have to check it out. I was going to recommend Lectures on Computation by Richard Feynman. It's one of the best books I've read on the subject; it starts out with just simple logic, before even getting to circuits and transistors, but eventually goes all the way to quantum computing.
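In the spirit of that opening (building everything up from simple logic), here is a minimal sketch, mine rather than the book's, of the classic fact that a single gate such as NAND suffices to construct every other Boolean function:

```python
def nand(a, b):
    """The one primitive gate we allow ourselves."""
    return 1 - (a & b)

# Everything else can be assembled from NAND alone.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

# Truth-table check against Python's own bitwise operators.
for a in (0, 1):
    for b in (0, 1):
        assert not_(a) == 1 - a
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
```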

u/IamABot_v01 · 1 point · r/AMAAggregator



Science AMA Series: I’m Tony Hey, chief data scientist at the UK STFC. I worked with Richard Feynman and edited a book about Feynman and computing. Let’s talk about Feynman on what would have been his 100th birthday. AMA!

Hi! I’m Tony Hey, the chief data scientist at the Science and Technology Facilities Council in the UK and a former vice president at Microsoft. I received a doctorate in particle physics from the University of Oxford before moving into computer science, where I studied parallel computing and Big Data for science. The folks at Physics Today magazine asked me to come chat about Richard Feynman, who would have turned 100 years old today. Feynman earned a share of the 1965 Nobel Prize in Physics for his work in quantum electrodynamics and was famous for his accessible lectures and insatiable curiosity. I first met Feynman in 1970 when I began a postdoctoral research job in theoretical particle physics at Caltech. Years later I edited a book about Feynman’s lectures on computation; check out my TEDx talk on Feynman’s contributions to computing.



I’m excited to talk about Feynman’s many accomplishments in particle physics and computing and to share stories about Feynman and the exciting atmosphere at Caltech in the early 1970s. Also feel free to ask me about my career path and computer science work! I’ll be online today at 1pm EDT to answer your questions.



u/animesh1977 · 1 point · r/programming

As gsyme said in the comment, he covers bits from Feynman's book on computation ( http://www.amazon.com/Feynman-Lectures-Computation-Richard-P/dp/0738202967 ). Basically, the lecturer is trying to look at the electronic and thermodynamic aspects of computation. He refers to a review by Bennett ( http://www.research.ibm.com/people/b/bennetc/bennettc1982666c3d53.pdf ) at 1:27. Apart from this, some interesting things like the constant k at 1:02 and reversible computing at 1:26 are touched upon :)
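To illustrate the reversible-computing idea touched on at 1:26 (my sketch, not the lecturer's): the Toffoli gate is universal for classical logic yet loses no information, because it is its own inverse, so by the Bennett/Landauer argument it has no minimum heat cost.

```python
def toffoli(a, b, c):
    """Toffoli (CCNOT) gate: flip c iff a and b are both 1."""
    return a, b, c ^ (a & b)

# The gate is self-inverse: applying it twice recovers the input,
# so no information is ever erased.
for bits in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits
```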

u/CypripediumCalceolus · 1 point · r/askscience

Feynman Lectures On Computation gives a lot of practical examples of how the laws of thermodynamics, engineering developments, and information theory limit information storage density in such systems. Yes, there is a limit, but it is very big and far away.
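For a rough sense of how far away (a back-of-the-envelope sketch with my numbers, not the commenter's): Landauer's principle, one of the thermodynamic limits the book works through, puts the minimum energy cost of erasing a single bit at kT ln 2.

```python
import math

k_B = 1.380649e-23                 # Boltzmann constant, J/K (exact SI value)
T = 300.0                          # roughly room temperature, K
e_min = k_B * T * math.log(2)      # Landauer bound per erased bit

print(f"kT ln 2 at {T:.0f} K = {e_min:.2e} J per bit")   # about 2.87e-21 J
```

Practical devices dissipate many orders of magnitude more than this per operation, which is the sense in which the limit is very big and far away.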

u/dnabre · 1 point · r/compsci

Feynman's Lectures on Computation

Definitely light reading. Some of the stuff seems a bit dated and some a bit basic, but Feynman has a way of looking at things and explaining them that is totally unique. (You might want to skip the chapter on quantum computing if you don't have the background.)

u/fuckjeah · 1 point · r/todayilearned

Yes, I know; that is why I mentioned general-purpose computation. See, Turing wrote a paper about making such a machine, but the British intelligence services that funded him during the war needed a machine to crack codes through brute force, so it didn't need general computation (his invention); still, the machine used fundamental parts of computation invented by Turing.

The ENIAC is a marvel, but it is an implementation of his work; Turing invented the concepts behind it. Even Grace Hopper mentions this.

What the Americans did invent there, though, was the higher-level language and the compiler. That was a brilliant bit of work, but the credit for computation, and for general-purpose computation, goes to Turing (this is why the award in my field of comp. sci. is the Turing Award, why a machine with the operations needed to act as a general computer is called Turing complete, and why Turing, along with Babbage, is called a father of computation). This conversation is a bit like crediting Edison with the lightbulb. He certainly did not invent the lightbulb; what he did was make it a practical utility by creating a longer-lasting one (the first lightbulb patent was filed 40 years earlier).

I didn't use a reference to a film as a historical reference; I used it because it is in popular culture, which I imagine you are more familiar with than the history of computation, as shown by your not mentioning Babbage once, even though the original assertion concerned the invention of "computation" and not the first implementation of the general-purpose computer.

> The Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.

Here is a bit of what von Neumann (the American creator of the von Neumann architecture we use to this day) had to say:

> The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper, On Computable Numbers. Turing proposed a simple device that he called the "Universal Computing Machine", later known as the universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (a program) stored on tape, allowing the machine to be programmable.

> The fundamental concept of Turing's design is the stored program, where all the instructions for computing are stored in memory.

> Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in the theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm-execution capability equivalent to a universal Turing machine.
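To make "executing instructions (a program) stored on tape" concrete, here is a minimal sketch (mine, not Turing's construction) of a Turing machine whose entire behaviour is one small transition table; this one increments a binary number.

```python
# Transition table: (state, symbol) -> (next state, symbol to write, head move).
# '_' is the blank symbol; the head starts on the least-significant digit.
delta = {
    ('carry', '1'): ('carry', '0', -1),  # 1 plus carry: write 0, keep carrying left
    ('carry', '0'): ('done',  '1',  0),  # 0 plus carry: write 1, halt
    ('carry', '_'): ('done',  '1',  0),  # ran off the left edge: new leading 1
}

def run(tape, pos, state='carry'):
    cells = dict(enumerate(tape))        # sparse tape: position -> symbol
    while state != 'done':
        symbol = cells.get(pos, '_')
        state, write, move = delta[(state, symbol)]
        cells[pos] = write
        pos += move
    lo, hi = min(cells), max(cells)
    return ''.join(cells.get(i, '_') for i in range(lo, hi + 1))

print(run('1011', 3))   # prints '1100' (binary 11 incremented to 12)
```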

TLDR: History is not on your side, I'm afraid. Babbage invented computation; Turing invented the programmable computer. Americans invented memory pipelines, the transistor, the compiler, and the first compilable programming language. Here is an American book by a famous Nobel-Prize-winning physicist (Richard Feynman) in which the roots of computation are discussed and the credit for the invention is given to Alan Turing. It's called Feynman's Lectures on Computation; you should read it (or perhaps the silly movie is more your speed).