Reddit reviews Numerical Recipes 3rd Edition: The Art of Scientific Computing

We found 10 Reddit comments about Numerical Recipes 3rd Edition: The Art of Scientific Computing. Here are the top ones, ranked by their Reddit score.


10 Reddit comments about Numerical Recipes 3rd Edition: The Art of Scientific Computing:

u/w3woody · 12 points · r/computerscience

Read about the topic.

Practice.

Set yourself little challenges that you work on, or learn a new language/platform/environment. If you're just starting, try different easy programming challenges you find on the 'net. If you've been doing this a while, do something more sophisticated.

The challenges I've set for myself in the recent past include writing a LISP interpreter in C, building a recursive descent parser for a simple language, and implementing different algorithms I've encountered in books like Numerical Recipes and Introduction to Algorithms.

(Yes, I know; you can download libraries that do these things. But there is something to be gained by implementing quicksort in code from the description of the algorithm.)
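
For a sense of what that exercise can look like, here is a minimal quicksort sketch in C++ (my own illustration, using Lomuto partitioning on a vector of ints; the function names are mine, and this is not code from either book):

```cpp
#include <cstddef>
#include <iostream>
#include <utility>
#include <vector>

// Lomuto partition: the last element is the pivot; returns its final index.
static std::size_t partition(std::vector<int>& a, std::size_t lo, std::size_t hi) {
    int pivot = a[hi];
    std::size_t i = lo;
    for (std::size_t j = lo; j < hi; ++j) {
        if (a[j] < pivot) std::swap(a[i++], a[j]);
    }
    std::swap(a[i], a[hi]);
    return i;
}

static void quicksort(std::vector<int>& a, std::size_t lo, std::size_t hi) {
    if (lo >= hi) return;                   // zero or one element: already sorted
    std::size_t p = partition(a, lo, hi);
    if (p > lo) quicksort(a, lo, p - 1);    // guard against unsigned underflow
    quicksort(a, p + 1, hi);
}

int main() {
    std::vector<int> v{5, 2, 9, 1, 7, 3};
    if (!v.empty()) quicksort(v, 0, v.size() - 1);
    for (int x : v) std::cout << x << ' ';
    std::cout << '\n';                      // prints: 1 2 3 5 7 9
}
```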

The trick is to find interesting things and write code which implements them. Generally you won't become a great programmer just by working on the problems you find at work--most programming jobs nowadays consist of fixing code (a different skill from writing code) and involve implementing the same design patterns for the same kind of code over and over again.

----

When I have free time I cast about for interesting new things to learn. The last big task I set for myself was to learn how to write code for the new iPhone when it came out back in 2008. I had no idea that this would change the course of my career for the next 9 years.

u/7thSigma · 4 points · r/Physics

Numerical Recipes is a veritable catalogue of different methods. Depending on what field you're interested in, though, there is surely a text with a title along the lines of 'Computational Methods for [insert field] Physics'.

u/Anarcho-Totalitarian · 3 points · r/math

There are numerical methods that make essential use of randomness, such as Monte Carlo methods or simulated annealing.

And numerical methods can be applied to statistics problems. A nonlinear model is probably going to require a numerical scheme to implement. The book Numerical Recipes, which is all about actually implementing numerical methods on a computer, has four chapters covering randomness and statistics.
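
As a toy illustration of the Monte Carlo idea (a sketch of my own, not code from the book): estimate π by drawing uniform points in the unit square and counting how many land inside the quarter circle.

```cpp
#include <cstdio>
#include <random>

// Monte Carlo estimate of pi: the fraction of uniform points in the unit
// square that fall inside the quarter circle approaches pi/4.
int main() {
    std::mt19937_64 rng(42);                          // fixed seed for reproducibility
    std::uniform_real_distribution<double> u(0.0, 1.0);

    const long n = 1000000;
    long inside = 0;
    for (long i = 0; i < n; ++i) {
        double x = u(rng), y = u(rng);
        if (x * x + y * y <= 1.0) ++inside;
    }
    std::printf("pi ~ %.5f\n", 4.0 * inside / n);     // error shrinks like 1/sqrt(n)
}
```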

> My plan at present is to do a PhD in numerical PDEs and then go into industry in scientific computing as a researcher or developer.

I'd make sure that whoever it is you want to work with is heavy on the computation side. Even better if they work with supercomputers. I say this because even a topic like numerical PDEs can go very deep into theory (consider this paper). Industry likes computer skills.

u/SoSweetAndTasty · 3 points · r/AskPhysics

To go with the other comments, I also recommend picking up a book like Numerical Recipes, which describes many well-tested algorithms in detail.

u/EorEquis · 2 points · r/astrophotography

> I have struggled with noise reduction in PixInsight and even when I used Photoshop.

I think most of us (I KNOW I do) have the same difficulty, and it boils down to...well, quite frankly, to wanting our amateur images to look like Hubble results.

Harsh though he can sometimes be, I think Juan Conejero of the PixInsight team said it best:

> A mistake that we see too often these days is trying to get an extremely smooth background without the required data support. This usually leads to "plastic looking" images, background "blobs", incongruent or unbelievable results, and similar artifacts. Paraphrasing one of our reference books, this is like trying to substantiate a questionable hypothesis with marginal data. Observational data are uncertain by nature—that is why they have noise, and why we need noise reduction—, so please don't try to sell the idea that your data are pristine. We can only reduce or dissimulate the noise up to certain limits, but trying to remove it completely is a conceptual error: If you want less noise in your images, then what you need is to gather more signal.

Admittedly...agreeing with Juan doesn't mean I'll stop trying to "prove him wrong" anyway. I'll still mash away at TGVDenoise until the background looks like lumpy oatmeal and call it "noise reduction"...but I'll feel 2 minutes of shame when /u/spastrophoto calls me out on it. ;)

Having said that, I think the article linked above and this comparison probably did more for my "understanding" of PI's NR tools than any others I've read....for whatever that's worth. :)
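
As an aside, the point in Juan's quote about needing more signal rather than more aggressive smoothing is easy to see numerically. Purely as an illustration (made-up numbers, my own sketch): stack N simulated frames of a faint constant signal buried in Gaussian read noise, and the SNR of the average grows like sqrt(N).

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>
#include <random>

// Stack N simulated frames of a constant signal plus Gaussian read noise and
// report the signal-to-noise ratio of the average. SNR grows like sqrt(N):
// the only way to a genuinely smoother background is more integration time.
int main() {
    std::mt19937_64 rng(1);
    std::normal_distribution<double> noise(0.0, 10.0);   // sigma = 10 ADU per frame
    const double signal = 5.0;                            // faint: below single-frame noise

    for (int n : {1, 4, 16, 64, 256}) {
        const int pixels = 100000;                        // average over many "pixels"
        double sum = 0.0, sumSq = 0.0;
        for (int p = 0; p < pixels; ++p) {
            double acc = 0.0;
            for (int i = 0; i < n; ++i) acc += signal + noise(rng);
            double mean = acc / n;                        // stacked pixel value
            sum += mean;
            sumSq += mean * mean;
        }
        double m = sum / pixels;
        double sd = std::sqrt(sumSq / pixels - m * m);
        std::printf("%3d frames: SNR ~ %.2f\n", n, m / sd);
    }
}
```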

> Glad to see a fellow hockey player on here...not many astrophotographer/hockey player hybrids out there!

Thought the username looked vaguely familiar. :) It IS an interesting combination, ain't it?

That's one more added to the list now... /u/themongoose85 is a hockey player too.

u/EmperorsNewClothes · 2 points · r/Physics

In addition, this book will save your life. With a good programming base, it's almost like cheating.

u/ObnoxiousFactczecher · 1 point · r/Amd
u/sneddo_trainer · 1 point · r/chemistry

Personally I make a distinction between scripting and programming that doesn't really exist but highlights the differences, I guess. I consider myself to be scripting if I am connecting programs together by manipulating input and output data. There is lots of regular expression pain and trial-and-error involved in this, and I have hated it since my first day of research, when I had to write a Perl script to extract the energies from thousands of Gaussian runs. I appreciate it, but I despise it in equal measure. Programming I love, and I consider this to be implementing a solution to a physical problem in a stricter language and trying to optimise the solution. I've done a lot of this in Fortran and Java (I much prefer Java after a steep learning curve from procedural to OOP). I love the initial math and understanding, the planning, the implementing and seeing the results. Debugging is as much of a pain as scripting, but I've found the more code I write, the fewer stupid mistakes I make and I know what to look for given certain error messages. If I could just do scientific programming I would, but sadly that's not realistic. When you get to do it, it's great though.

The maths for comp chem is very similar to the maths used by all the physical sciences and engineering. My go-to reference is Arfken, but there are others out there. The table of contents at least will give you a good idea of appropriate topics. Your university library will definitely have a selection of lower-level books with more detail that you can build from. I find for learning maths it's best to get every book available and decide which one suits you best. It can be very personal, and when you find a book by someone who thinks about the concepts similarly to you, it is so much easier.
For learning programming, there are usually tutorials online that will suffice. I have used O'Reilly books with good results. I'd recommend that you follow the tutorials as if you need all of the functionality, even when you know you won't. Otherwise you get holes in your knowledge that can be hard to close later on. It is a good supplementary exercise to find a method in a comp chem book, then try to implement it (using Google when you get stuck). My favourite algorithms book is Numerical Recipes - there are older Fortran versions out there too. It contains a huge amount of detailed practical information and is geared directly at computational science. It has good explanations of math concepts too.
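
To give a concrete flavour of that "find a method, then try to implement it" exercise, here is a sketch of the composite trapezoidal rule, the sort of quadrature routine covered early in Numerical Recipes (my own illustration with my own function names, not the book's code):

```cpp
#include <cmath>
#include <cstdio>
#include <functional>

// Composite trapezoidal rule on [a, b] with n uniform panels.
// The error falls off as O(h^2), where h = (b - a) / n.
double trapezoid(const std::function<double(double)>& f, double a, double b, int n) {
    double h = (b - a) / n;
    double sum = 0.5 * (f(a) + f(b));
    for (int i = 1; i < n; ++i) sum += f(a + i * h);
    return h * sum;
}

int main() {
    // Integrate exp(-x^2) on [0, 1]; exact value is sqrt(pi)/2 * erf(1) ~ 0.746824.
    double approx = trapezoid([](double x) { return std::exp(-x * x); }, 0.0, 1.0, 1000);
    std::printf("integral ~ %.6f\n", approx);
}
```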

For the actual chemistry, I learned a lot from Jensen's book and Leach's book. I have heard good things about this one too, but I think it's more advanced. For Quantum, there is always Szabo & Ostlund which has code you can refer to, as well as Levine. I am slightly divorced from the QM side of things so I don't have many other recommendations in that area. For statistical mechanics it starts and ends with McQuarrie for me. I have not had to understand much of it in my career so far though. I can also recommend the Oxford Primers series. They're cheap and make solid introductions/refreshers. I saw in another comment you are interested potentially in enzymology. If so, you could try Warshel's book which has more code and implementation exercises but is as difficult as the man himself.

Jensen comes closest to a detailed, general introduction from the books I've spent time with. Maybe focus on that first. I could go on for pages and pages about how I'd approach learning if I was back at undergrad so feel free to ask if you have any more questions.



Out of curiosity, is it DLPOLY that's irritating you so much?

u/fatangaboo · 1 point · r/AskEngineers

Yes, there is software.

The first thing I would suggest is to try the Microsoft Excel "Solver". It is actually a wonderful piece of highly polished numerical analysis code, buried inside a stinky, steaming turd called Excel Spreadsheets. You and Google, working together, can find hundreds of tutorials about this, including

(link 1)

(link 2)

(link 3)

(link 4)

If you prefer to code up the algorithm(s) yourself, so you can incorporate them in other bigger software you've got, I suggest purchasing the encyclopaedic textbook NUMERICAL RECIPES. This tour-de-force textbook/reference book has an entire chapter devoted to optimization, including source code for several different algorithms. I recommend Nelder-Mead "amoeba", but other people recommend other code.
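
If you do go the roll-your-own route, a bare-bones two-variable Nelder-Mead looks roughly like the sketch below. This is a simplified illustrative variant (inside contraction only, fixed iteration budget) with the usual coefficients (reflect 1, expand 2, contract 0.5, shrink 0.5); the names are mine and it is not the book's implementation.

```cpp
#include <algorithm>
#include <array>
#include <cstdio>
#include <functional>

using Point = std::array<double, 2>;

// Minimal 2-D Nelder-Mead ("downhill simplex") minimizer, simplified:
// reflection 1.0, expansion 2.0, contraction 0.5, shrink 0.5.
Point nelderMead(const std::function<double(const Point&)>& f, Point start,
                 double step = 0.5, int maxIter = 500) {
    std::array<Point, 3> s = {Point{start[0], start[1]},
                              Point{start[0] + step, start[1]},
                              Point{start[0], start[1] + step}};
    std::array<double, 3> fv;
    for (int i = 0; i < 3; ++i) fv[i] = f(s[i]);

    for (int iter = 0; iter < maxIter; ++iter) {
        // Sort vertices so that fv[0] <= fv[1] <= fv[2] (best ... worst).
        std::array<int, 3> idx = {0, 1, 2};
        std::sort(idx.begin(), idx.end(), [&](int a, int b) { return fv[a] < fv[b]; });
        std::array<Point, 3> ss = {s[idx[0]], s[idx[1]], s[idx[2]]};
        std::array<double, 3> ff = {fv[idx[0]], fv[idx[1]], fv[idx[2]]};
        s = ss;
        fv = ff;

        // Centroid of the two best vertices.
        Point c = {(s[0][0] + s[1][0]) / 2.0, (s[0][1] + s[1][1]) / 2.0};
        // Point on the line through the centroid and the worst vertex.
        auto along = [&](double t) {
            return Point{c[0] + t * (s[2][0] - c[0]), c[1] + t * (s[2][1] - c[1])};
        };

        Point refl = along(-1.0);            // reflect worst vertex through centroid
        double fr = f(refl);
        if (fr < fv[0]) {                    // better than the best: try expanding
            Point expd = along(-2.0);
            double fe = f(expd);
            if (fe < fr) { s[2] = expd; fv[2] = fe; }
            else         { s[2] = refl; fv[2] = fr; }
        } else if (fr < fv[1]) {             // better than second-best: accept reflection
            s[2] = refl; fv[2] = fr;
        } else {
            Point con = along(0.5);          // contract toward the centroid
            double fc = f(con);
            if (fc < fv[2]) { s[2] = con; fv[2] = fc; }
            else {                           // shrink the simplex toward the best vertex
                for (int i = 1; i < 3; ++i) {
                    s[i] = {s[0][0] + 0.5 * (s[i][0] - s[0][0]),
                            s[0][1] + 0.5 * (s[i][1] - s[0][1])};
                    fv[i] = f(s[i]);
                }
            }
        }
    }
    int best = 0;                            // return the best vertex found
    for (int i = 1; i < 3; ++i) if (fv[i] < fv[best]) best = i;
    return s[best];
}

int main() {
    // Minimize the Rosenbrock function; its true minimum is at (1, 1).
    auto rosen = [](const Point& p) {
        double t1 = p[1] - p[0] * p[0], t2 = 1.0 - p[0];
        return 100.0 * t1 * t1 + t2 * t2;
    };
    Point best = nelderMead(rosen, Point{-1.0, 1.0});
    std::printf("minimum found near (%.4f, %.4f)\n", best[0], best[1]);
}
```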

u/baialeph1 · 1 point · r/math

Not sure if this is exactly what you're looking for, as it was written by physicists, but this is considered the bible of numerical methods in my field (computational physics): http://www.amazon.com/Numerical-Recipes-3rd-Edition-Scientific/dp/0521880688