Top products from r/programming
We found 1,628 product mentions on r/programming. We ranked the 352 resulting products by the number of redditors who mentioned them. Here are the top 20.
1. Working Effectively with Legacy Code
Sentiment score: 18
Number of reviews: 29
Prentice Hall

2. Code: The Hidden Language of Computer Hardware and Software
Sentiment score: 17
Number of reviews: 27
Microsoft Press

3. C Programming Language, 2nd Edition
Sentiment score: 10
Number of reviews: 25
Prentice Hall

4. Code Complete: A Practical Handbook of Software Construction, Second Edition
Sentiment score: 8
Number of reviews: 24
Microsoft Press

5. Clean Code: A Handbook of Agile Software Craftsmanship
Sentiment score: 7
Number of reviews: 22
Prentice Hall

6. Introduction to Algorithms, 3rd Edition (The MIT Press)
Sentiment score: 9
Number of reviews: 16

7. Purely Functional Data Structures
Sentiment score: 12
Number of reviews: 16

8. The Elements of Computing Systems: Building a Modern Computer from First Principles
Sentiment score: 8
Number of reviews: 16
MIT Press

10. Design Patterns: Elements of Reusable Object-Oriented Software
Sentiment score: 3
Number of reviews: 15

11. Introduction to Algorithms, Second Edition
Sentiment score: 5
Number of reviews: 15

12. Refactoring: Improving the Design of Existing Code
Sentiment score: 7
Number of reviews: 14
Addison-Wesley Professional

13. The Pragmatic Programmer: From Journeyman to Master
Sentiment score: 6
Number of reviews: 13

14. JavaScript: The Good Parts
Sentiment score: 6
Number of reviews: 13
O'Reilly Media

15. Compilers: Principles, Techniques, and Tools (2nd Edition)
Sentiment score: 4
Number of reviews: 12

16. The Mythical Man-Month: Essays on Software Engineering, Anniversary Edition (2nd Edition)
Sentiment score: 6
Number of reviews: 10

17. Beautiful Code: Leading Programmers Explain How They Think (Theory In Practice (O'reilly))
Sentiment score: 4
Number of reviews: 10
O'Reilly Media

> The next question would be what are some key concepts I should learn before I start programming?
How to write very explicit instructions. I don't know, actually. I've taught programming (friends and TAing) but I don't know how to teach someone to be a programmer. It takes a certain attitude. You have a problem, and there are certain small steps you are allowed to perform. Programming requires you to translate your problem into a sequence of those small steps. Some people have a knack for it and will catch it immediately. Others will look at the problem, know the steps, but blank out completely and never make the connection between the steps they can perform and the problem they need to solve. I would recommend going slow and upping the difficulty enough to keep you interested but not so much as to discourage you. I don't know how hard it will be for you, but I think the attitude involved in proving a theorem in mathematics is what would help. It has the attention to tiny detail, the creativity, and the dealing with the frustration of not seeing the solution immediately that programming requires. I think the easiest way to learn how to program is to try.
> What do I need to install in my computer in order to practice?
I meant to answer this question in my first post. For C on Windows, your options also include installing Cygwin and then pretending you're on Linux. For Java, Eclipse, which is itself written in Java and so cross-platform. Perl and Python should again be available on all platforms on the command line; all you need is a text editor (I haven't seen any Perl IDEs). Oh, and in C, if you're using good old gcc, you should also choose a text editor. A good IDE with a non-intimidating example project should be decent, but you'll probably become confident a lot quicker using a simple text editor with just your compiler or interpreter.
> Is there a great difference in programming command line programs and GUI programs? What do I need to know to do GUI programs? Do I need special software / libraries / skills?
To be honest, I have barely done any GUI programming. I have modified a couple of GUI programs and have done some 3D stuff in OpenGL. My answer is that GUI programming is just annoying, not interesting. Most of the real work does not involve how you interact with the user. GUI programming is mostly calling a bunch of library functions with some voodoo magic. Of course, no matter how technically awesome your program is, it's not impressive to a non-programmer until they see a GUI.
> Could you recommend a good book for self-reference?
For Perl, the camel book (Larry Wall, the author of Perl, is a co-author) will serve you your entire life. For C, K&R is by the authors of the language and considered the definitive learn-C book (I haven't used it, though). As for C++, don't learn that as a first language, but I know there is a book by Stroustrup, and Deitel and Deitel is used in almost all universities. I found Thinking in C++ to be useful (used in my school alongside Deitel and Deitel), but please, please don't learn C++ first. I plan on reading ANSI Common Lisp by Paul Graham one of these days.
But seriously, a book won't get you very far. The quickest way of learning programming is doing it. Once you're confident programming, you'll find all languages making sense, and starting by editing an existing program in a language is a great way to learn. You have code examples right in front of you, and the fact that it already runs and does most of the work boosts your confidence until you get comfortable enough with the language that you can see yourself doing the whole thing.
> I tried teaching myself Python a while back, but then they upgraded Python to 3.0 and I couldn't do anything anymore and I was very confused as to why. It since has become clear to me that knowing a couple of codes in a particular programming language is not the same as "programming". So, I would like to know more. Could you please help?
That's why I said start with C. It's a very concise language that does exactly what you say, and there is almost nothing to remember for it. It should help you develop the programmer mentality. The other school of thought, though, is that a learner's language is something that helps you express algorithms, which makes C the worst choice because it is too close to what you want to tell the computer rather than the ideas you have. But I'm from the bottom-up rather than top-down school of thought.
I don't know of any one source that teaches "good testing principles". There are thousands of sources and Sturgeon's law is working against you. A few sources are predominantly good, most have bits (often the same bits) of genuinely good advice in-between chapters of bland, uninsightful repetition, many are appropriations of popular acronyms by closely or distantly related professions (no, you're not "testing" a requirement specification, you're just reviewing it), and some sources are just plain bad.
I had an opportunity to attend Dan North's Testing Faster course and would strongly recommend it. In my case it was more helpful for formalising my own experience than learning concrete new things but other attendees did absolutely "learn new things". He made a point that "TDD" and "BDD" are both inaccurate names and that something like "example-guided development" would have been far more honest; he recommended a book, I think Specification by Example, as a good resource to that end (and noted that that name, too, is technically inaccurate). He also confirmed that Cucumber is a solution looking for a problem.
Test Driven Development: By Example by Kent Beck is a classic, and as far as I can remember, decent. It's maybe a little old now, and it definitely misses some subtle points about maintainability of automated tests in general (or perhaps rather, doesn't really address that).
I've skimmed Code Complete 2. I don't remember it in detail but my overall impression of it was that the sooner it becomes irrelevant the better, because that would signify our profession maturing (if not quite reaching maturity). A lot of its contents would be considered basic by contemporary software development standards and that's a good thing. I don't remember what it says about testing. One thing in a very late chapter (33.8?) stuck with me, though: that seniority has little to do with age and your approach to software development will be formed early on.
Working Effectively with Legacy Code by Michael Feathers is excellent, perhaps the most practically applicable one here.
Sandi Metz is famous in the Ruby community for speaking on this topic and there are recordings on YouTube. From what I've seen her material also mainly addresses beginners but it's fast and easy to consume and her form doesn't bother me the way Martin's does.
One piece of advice I picked up from one of those mostly-mediocre sources had to do with naming in tests, trying to capture the essentials. If you're relying on a particular property of a piece of input to test behaviour, make sure this is evident. Conversely, if any input would satisfy, avoid drawing undue attention:
fn bees_can_fly() {
    let some_bee = ...;   // any bee will do here, so the name stays generic
    let bumblebee = ...;  // this assertion depends on the species, so the name says so
    let dest = ...;
    assert!(fly(some_bee, dest));
    assert!(fly(bumblebee, dest));
}

fn bees_can_pollinate() {
    let some_bee = ...;   // again, any bee satisfies this test
    let flower = ...;
    assert!(pollinate(some_bee, flower));
}
Testing is about developing confidence. There are many kinds of testing and many things to develop confidence in. For automatic tests it's more about checking (arguably not "testing") that you retain correctness in the face of continuous change. Automatic tests that obstruct that change or compromise your confidence are not helping you and should be rewritten or removed. Reliability of tests is usually more valuable than coverage, for instance.
Are you a beginner to programming, or have you been programming for a while in other languages and are just getting started with JS?
If you're a beginner to programming altogether, this is a good resource, but I'd also recommend some dead-tree books. I've heard good things about JavaScript: The Good Parts, though I've never read it myself.
Edit: One thing to be careful of when you're just getting started, by the way: try not to focus too much on any single language and its features. You want a good solid base of fundamentals, you don't want to hyper-specialize from the start. Don't just learn Javascript, for example. Try Java and Clojure and C and Haskell too, and any other language you can get your hands on. They all have their own idioms and lend to certain styles of problem solving, it's good to be able to figure out which one is best for the task you're facing.
If you've done some coding before, and are branching out into JS as a new language, there's no better way than reading about it and then trying it out on your own. JSFiddle is a great resource for just playing around. You could try implementing solutions to Project Euler puzzles in JS. Or you could pick a pet project you want to work on that has some client-side behavior and implement it in HTML5/JS. Or server-side behavior and do it in node.js. Or you can find an open-source project using JS that you're interested in, and get involved there.
Edit: Also, Stack Overflow has dozens of JS-related questions answered every hour. Sometimes those answers come attached to a lot of useful information. Try browsing there, and if you have questions, ask them. It's a great resource.
Define minimum? Do you understand OO? Java is fine to get back into it. Definitely more beginner friendly. Lots of books, a lot of them bad, but some very good ones like "Effective Java".
There are a lot of good languages to learn. My current stance right now is to recommend to people that they learn a low level language along with a high level language. Right now the best candidates are C and Haskell. This will cover all your bases.
The advantage of C is that you learn the basic concepts of programming while remaining close to the machine. You learn about managing memory yourself, loops, pointers. The C programming language is one of the best programming books ever written.
The reason I chose Haskell is because it is probably one of the most sophisticated languages out there. You will learn functional programming along with how to work with a powerful type system.
You need to show that you know your stuff. Just because you're doing something more applied like Network Security in grad school doesn't mean that you won't have a base level of knowledge you're expected to understand. In that case, you need to learn some basic stuff a CS student at a good school would know. I'm not "dumbing down" anything on my list here, so if it seems hard, don't get discouraged. I'm just trying to cut the bullshit and help you. (:
Again, don't be discouraged, but you'll need to work hard to catch up. If you were trying for something like mathematics or physics while doing this, I'd call you batshit insane. You may be able to pull it off with CS though (at least for what you want to study). Make no mistake: getting through all these books I posted on your own is hard. Even if you do, it might be the case that still no one will admit you! But if you do it, and you can retain and flaunt your knowledge to a sympathetic professor, you might be surprised.
Best of luck, and post if you need more clarification. As a side note, follow along here as well.
Netsec people feel free to give suggestions as well.
Edit: I didn't realize the link was just the first chapter. If you really liked it, I do suggest purchasing it. You can find it all online for free, but I do highly recommend just having this book. It's a fun read.
 
Here's an excerpt that I really love right from the beginning of the book.
> All programmers are optimists. Perhaps this modern sorcery especially attracts those who believe in happy endings and fairy godmothers. Perhaps the hundreds of nitty frustrations drive away all but those who habitually focus on the end goal. Perhaps it is merely that computers are young, programmers are younger, and the young are always optimists. But however the selection process works, the result is indisputable: "This time it will surely run," or "I just found the last bug."
Here's a link to a physical copy [on Amazon](https://www.amazon.com/Mythical-Man-Month-Software-Engineering-Anniversary/dp/0201835959) if you want it.
 
edit: Bonus Dilbert Comic
This is a tough one. It's been several years since I've used textbooks so there may be better ones out there, and I can't remember what they were in any case :).
CS is a pretty wide field, and does rely a lot on mathematics. I don't believe you need the theory to be a business programmer, but I do believe that expanding your knowledge of computing, in any way, will make you more capable as a programmer.
There are a couple of free resources you can use to give you the background. MIT's OpenCourseWare is excellent for this, and don't forget Wikipedia. Seriously, don't laugh, but do follow up on the references in a Wikipedia article.
From what you're saying, it sounds like you're more interested in algorithms than computational theory, so I'll just focus on that:
The MIT course Introduction to Algorithms looks good, and this also has some videos of the lectures; this is the textbook they used.
> there's not really design patterns in procedural code
That's a plus for me. I'm not a huge fan of design patterns. I only made it 1/3rd of the way through "Head-First" before feeling too overwhelmed by the complexity.
> I'm relatively unfamiliar with functional programming.
That's pretty much everybody right now. I'm of the opinion that we'd all have a lot more fun and get more done more simply in FP (it's a working theory, based on my journey), so I say these kinds of things to get people interested, or even just informed if they haven't yet heard much, if anything, about it. That's my story - fighting for many years to get my ideas out in OOP, then told by a friend about FP, and now much more able to express what I mean.
> Never mind that function composition is about a thousand time less complicated than object composition.
I'm not exactly sure what object composition really is/means, but if what you say is true, let's use functions instead! I like the sound of "a thousand times less complicated" :) I've found working with simple values and functions to be a lot easier, yet more powerful, than trying to think in objects.
which books exactly, please guide me.
these ones?
Effective C++: 55 Specific Ways to Improve Your Programs and Designs (3rd Edition)
Effective Modern C++: 42 Specific Ways to Improve Your Use of C++11 and C++14 1st Edition
More Effective C++: 35 New Ways to Improve Your Programs and Designs 1st Edition
What is the difference between them? It seems Effective C++ (3rd edition) is from 2005 and More Effective C++ is from 1996. Is there a point in reading More Effective C++ after reading the third edition of Effective C++?
Also, what do you think about C++ How to Program?
So I feel that it's worth pointing out that the linked article was written by somebody who apparently is (was) associated with a company that sells testing services. Keep in mind that they might not necessarily be a neutral party.
I could just as easily link you to Working Effectively with Legacy Code or anything written by Kent Beck. They too can write many words to try to convince you that they know what they're talking about.
Having said that, I do agree with many of his points, though I disagree with several as well. The problem I have with that paper is that his points don't prove his conclusion, and definitely don't support your argument. I mean, some of his points are:
And I pretty much agree with all of those (though the third one, while true, is completely unhelpful). But to jump from that to "and that's why unit tests are basically worthless" is making quite a leap.
Unit tests are a tool. They're particularly well suited for testing algorithms. For example, suppose you had to implement sqrt yourself. Yes, I know that you don't have to, but there's some application-specific algorithm that you had to implement at some point. Since I don't know your application, I'll just pretend that you had to implement sqrt. Having a table of well-known inputs and outputs for sqrt, and verifying that your implementation produces results within some epsilon of those known values, is obviously a good idea. And checking that sqrt works on its own, outside of your vertex transformation code (because the application that I'm thinking of is a graphics engine), is obviously a good idea. And this is the textbook example of when you would use a unit test.
But unit tests are just one tool. There are some things that just aren't worth testing, and there are things that make no sense to test at a unit level. But those two statements don't refute that unit tests are perfectly appropriate to a large range of problems.
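The table-driven check described above can be sketched in a few lines. This is a minimal sketch, and `my_sqrt` is a hypothetical stand-in for "your own implementation" (a few Newton iterations), not anything from the discussion:

```python
import math

# Hypothetical stand-in for an application-specific sqrt: Newton's method.
def my_sqrt(x, iterations=25):
    guess = x if x > 1 else 1.0
    for _ in range(iterations):
        guess = 0.5 * (guess + x / guess)
    return guess

# Table of well-known inputs and expected outputs.
KNOWN_VALUES = [(0.25, 0.5), (1.0, 1.0), (2.0, math.sqrt(2)), (144.0, 12.0)]

# Verify each result lands within some epsilon of the known value.
def test_sqrt_known_values(epsilon=1e-6):
    for x, expected in KNOWN_VALUES:
        assert abs(my_sqrt(x) - expected) < epsilon, (x, expected)

test_sqrt_known_values()
```

The point is that the test exercises the algorithm alone, with no graphics engine anywhere in sight.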
I mean, if you've already decided that you will not write unit tests, then there's nothing I can say to convince you otherwise and there's no conversation to be had. I'd be much more interested in discussing when it is appropriate to use unit testing, and when it's appropriate to do something else.
When I started getting interested in compilers, the first thing I did was skim issues and PRs in the GitHub repositories of compilers, and read every thread about compiler construction that I came across on reddit and Hacker News. In my opinion, reading the discussions of experienced people is a nice way to get a feel of the subject.
As for 'normal' resources, I've personally found these helpful:
In addition, just reading through the source code of open-source compilers such as Go's or Rust's helped immensely. You don't have to worry about understanding everything - just read, understand what you can, and try to recognize patterns.
For example, here's Rust's parser. And here's Go's parser. These are for different languages, written in different languages. But they are both hand-written recursive descent parsers - basically, this means that you start at the 'top' (a source file) and go 'down', making decisions as to what to parse next as you scan through the tokens that make up the source text.
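To make that top-down idea concrete, here is a toy hand-written recursive descent parser. The grammar, token set, and names are invented for illustration and are not taken from the Go or Rust sources:

```python
import re

# Grammar (invented for this sketch):
#   expr -> term (('+' | '-') term)*
#   term -> NUMBER | '(' expr ')'

def tokenize(src):
    # Split the source into number and punctuation tokens.
    return re.findall(r"\d+|[+\-()]", src)

class Parser:
    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def advance(self):
        tok = self.peek()
        self.pos += 1
        return tok

    # Start at the 'top' (a whole expression) and work 'down'.
    def expr(self):
        value = self.term()
        # Decide what to parse next by looking at the upcoming token.
        while self.peek() in ("+", "-"):
            op = self.advance()
            rhs = self.term()
            value = value + rhs if op == "+" else value - rhs
        return value

    def term(self):
        tok = self.advance()
        if tok == "(":
            value = self.expr()        # recurse for the parenthesized expr
            assert self.advance() == ")"
            return value
        return int(tok)

print(Parser(tokenize("1+(2-3)+4")).expr())  # -> 4
```

Real parsers build syntax trees rather than evaluating on the spot, but the shape (one function per grammar rule, recursing downward) is the same.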
I've started reading the 'Dragon Book', but so far, I can't say it has been immensely helpful. Your mileage may vary.
You may also find the talk 'Growing a language' interesting, even though it's not exactly about compiler construction.
EDIT: grammar
readscheme is a good place to start; it has a bunch of good links to papers on issues related to macros: http://library.readscheme.org/page3.html
(It also has lots of other material, but you asked about macros specifically, so that's the link I've posted.)
If you can buy one book, buy Lisp In Small Pieces. It's generally excellent, and has good coverage of macro implementation strategies.
http://www.amazon.com/Lisp-Small-Pieces-Christian-Queinnec/dp/0521545668/ref=pd_bbs_sr_1?ie=UTF8&s=books&qid=1196347235&sr=8-1
Another good resource is the discussion of an implementation of syntax-case that's in "Beautiful Code": http://www.amazon.com/Beautiful-Code-Leading-Programmers-Practice/dp/0596510047/ref=pd_bbs_sr_1?ie=UTF8&s=books&qid=1196346388&sr=8-1
Maybe you like
K&R. If you're in a programming 101 class that involves C, just buy this book unless your prof tells you otherwise.
---
K&R has the reputation it has because they did an excellent job of balancing between "experienced programmers can use this as a reference" and "newbie programmers can use this as a starting point." Let me clarify: K&R will not make you a better programmer, but it is an excellent example of what industry professionals would consider to be a good piece of technical documentation.
If you're ever going to work with APIs or large amounts of technical documentation about software, this book will mirror the experience you get reading "good docs". In short, learning C from this book does an excellent job of showing you how much you'll have to figure out yourself and what information you should expect to be given when working in the industry.
Three books that come to mind:
Types And Programming Languages by Benjamin Pierce covers the ins and outs of Damas-Milner-style type inference, and how to build the bulk of a compiler. Moreover, it talks about why certain extensions to type systems yield type systems that are not inferrable, or worse may not terminate. It is very useful in that it helps you shape an understanding to understand what can be done by the compiler.
Purely Functional Data Structures by Chris Okasaki covers how to do things efficiently in a purely functional (lazy or strict) setting and how to reason about asymptotics in that setting. Given the 'functional programming is the way of the future' mindset that pervades the industry, it's a good idea to explore and understand how to reason in this way.
Introduction to Algorithms by Cormen et al. covers a ton of imperative algorithms in pretty good detail and serves as a great toolbox for when you aren't sure what tool you need.
That should total out to around $250.
Smalltalk Best Practice Patterns by Kent Beck. In my opinion, it's his best book. It's a great book on the nitty gritty of coding.. great for all programmers. It's easy to read even if you're not a Smalltalker; all you have to do is google for a Smalltalk cheatsheet.
I also like Working Effectively with Legacy Code. It's about the sort of code that most of us confront daily: how to deal with its problems, and get it under test so that you can refactor it or add to it without gumming it up.
Although Lua does allow you to implement an OOP system yourself, that just leads to The Lisp Curse. Humans being humans, everyone will build their OOP system differently, so an expert in one augmented Lua dialect moving to another project with its own dialect loses their expertise.
A language with a mediocre OO system which is fixed in stone by the language definition is better than one flexible enough to let you define any OO system you like, from a training and community expertise standpoint.
You see echoes of this in Perl and JavaScript, too.
Like Lua, Perl also had an OO sidecar bolted onto it after it became popular. Because there is no one single way to do things, you get classic Perl OO users vs. the Moose people vs. those that go totally their own way, doing weird shit like blessing arrays.
In JavaScript, it's both better and worse than Lua or Perl. JavaScript can at least claim, with a straight face, that it is OO to the core. The problem is, that core has two different ways of manifesting: pure prototypal extensions of Object and such vs. the C++ inspired paintjob on top, all that business with new and constructors. The Scheme/Lisp-inspired flexibility of JavaScript lets you bring The Lisp Curse down on yourself again, because there's nothing telling you how you must implement your constructors or factory methods. In Douglas Crockford's lovely book on how to program in JavaScript with style and panache, there are three or four different ways to build up objects. Add to that the one your JS framework of choice probably gives you. Then of course you know best, so you ignore that and define a sixth style for your project. It becomes a tarpit.
I'm no hater of any of these languages. I happily use them all. It's important to realize, however, that there's something to be said for languages that nail things like OO down in the language definition.
Strictly speaking, it depends on the language, but probably you don't understand objects. Given that you're in Python, which has OO (or close enough), I'm going to escalate that from a "probably" to a "very, very probably". Basically, if you move the stuff inside the ifs into methods on objects (in the absolute simplest case, you'll put each if inside a different object), then your huge chain of ifs turns into:
object.method(arguments)
And instead of throwing all those ifs in there, you basically run through all the ifs ahead of time by just giving that block of code an object of the appropriate class.
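A minimal sketch of that move in Python; the Circle and Square classes and the area method are hypothetical, invented just to show the shape of the refactoring:

```python
import math

# Before: one big if/elif chain dispatching on a "kind" flag.
# After: each branch's body lives on its own class, and the chain
# collapses into a single method call.

class Circle:
    def __init__(self, radius):
        self.radius = radius
    def area(self):
        return math.pi * self.radius ** 2

class Square:
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

# Pick the appropriate object once, up front, instead of re-checking
# a kind flag every time you need the area:
shapes = [Circle(1.0), Square(2.0)]
total = sum(shape.area() for shape in shapes)
```

The dispatch decision happens once, when the object is constructed, rather than at every call site.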
The absolute best book on OO (in my opinion) is Refactoring.
http://www.amazon.com/gp/product/0201485672?ie=UTF8&tag=gilebowk-20&linkCode=xm2&camp=1789&creativeASIN=0201485672
I have a feeling the Head First book on OO is also very, very good. I haven't checked it but the series is a great series.
To date, the best programming book that I've read is C Programming Language by K&R. It's a pretty complete text on the C language. It is more than sufficient to enable the reader to be a good C programmer, yet it is still entirely digestible by new programmers. It is 274 pages. There are some recent gems, like Programming Clojure (304 pages). However, these days the norm seems to be more like Applying Domain-Driven Design and Patterns: With Examples in C# and .NET (576 pages), Real World Haskell (710 pages), and The C++ Programming Language (1030 pages). These books are all good. They just are hard to carry around and hard to hold while reading for long periods. I'm looking for good programming books that are short; an upper limit of roughly 325 pages. Post links to your favorites!
I'd suggest
Now past this it's not entirely clear where to go; it's much more based on what you're interested in. For web stuff there's Yesod and its associated literature. It's also around this time that reading some good Haskell blogs is pretty helpful. In no particular order, some of my favorites are
And many, many more.
Also, if you discover type theory is interesting to you, there's a whole host of books to dig into on that; my personal favorite introduction is currently PFPL.
> As for D, it does not even exist in real world.
It does. It just needs some good tools support.
For reference Python first appeared in 1991, but didn't really gain wide acceptance until well after 2000. Ruby first appeared in 1995, but didn't gain wide acceptance until RoR was open sourced in 2004.
D was first designed in 1999. It's starting to gain more and more acceptance -- Andrei Alexandrescu is writing a book on it. This is about the time languages really start gaining traction. We'll see what happens in the next few years.
We did something similar as well. The labs were tons of fun. I remember having to run a couple dozen lines of code through the CPU cache on a test once, including some sneakery of using code as data at one point. I do appreciate having done it, but I'm not sure how much practical lasting value that really contributed.
That said, for those who are interested in this there is The Elements of Computing Systems: Building a Modern Computer from First Principles, more commonly known as "NAND to Tetris".
Petzold's Code is excellent as well.
Edit: Actually, while I've suggested those two, let me throw Computer Systems: A Programmer's Perspective into the mix. It's a book we used across two courses and I really enjoyed it. We used the 2nd edition (and I have no issue recommending people get a cheaper, used copy of that), but there is a 3rd edition now. Being a proper textbook it's stupidly priced (you can get Knuth's 4-book box set for $30 more), but it's a good book.
Anyone have suggestions similar to that Computer Systems text? I've always wanted to revisit/re-read it, but could always use a different perspective.
For iOS devices, you're going to want to start here; this will get you familiarized with the NeXTSTEP family of jive turkeys, followed up with a more formal introduction to Objective-C. I'll be honest, having some working knowledge of C will never hurt you, so after you're done with that, take a peek at K&R.
If you're aiming for Android, you have a bit of a different education outlook; I'd recommend brushing up with Head First Java. When I started poking around with Android, I read Hello, Android; most of it should still be pretty relevant. I'm not entirely sure if it has been updated as of late, and I outgrew it rather quickly. If you do too, pretty much anything and everything by Mark Murphy is relevant. Best of luck!
You can read Code Complete by McConnell which is a must have for software engineering and has several sections on writing and documentation etc.
The number 1 rule of thumb is to think as if you are not the one writing, but the one reading it later. Take commit messages - there are far too many "fixed a minor bug" on one end, and wall-of-text-about-how-you-found-and-fixed-the-bug-but-little-useful-info on the other. Write what the bug was and what the fix is, and its side-effects if any.
If you want to improve your writing, identify people who have done it well, and seek to follow them, and practise. Just reading good writers, commenting on forums etc. will also improve your writing skills.
I believe there is always better code waiting for you. But so far, I find Jon Bentley's version of Quicksort in Beautiful Code pretty awesome. He describes it as "The most beautiful code I never wrote".
First of all, for any software development questions you may have, I suggest you post your questions on Stackoverflow because the people there will surely provide you with answers.
Now, for a list of books I recommend:
JavaScript
JavaScript: The Definitive Guide; if you're new to JS, start with this one.
JavaScript: The Good Parts; not a beginner's book, but a must-read if you are going to use JS
If you are going to be using JS, you will most probably be developing using a framework, and for that I seriously recommend mastering jQuery because as they say, you will write less and do more!
CSS
CSS Mastery: Advanced Web Standards Solutions
Web Usability
Don't Make Me Think: A Common Sense Approach to Web Usability; the book that shows the users' perspective when viewing a website
Performance
High Performance Web Sites: Essential Knowledge for Front-End Engineers and Even Faster Web Sites: Performance Best Practices for Web Developers; if you want to get serious about performance for your websites
Preface: I'm not being condescending, I know a lot of incredible programmers who never attended a computer science class.
To be honest, this is the sort of stuff you learn about in a computer science degree program (though this would probably fit better in a graduate degree than an undergrad). If it really interests you, you should look into it.
If that's not a really viable option, I can't recommend this book on algorithms highly enough. It talks about runtime, scaling, data structures and graphs, etc. Truly fascinating stuff.
That's a good question, actually. I picked it up in bits and pieces over the years. I probably started to pick it up when I tried to implement an object-oriented programming system in C. The dragon book was also a great help in figuring this sort of stuff out.
Another great way to learn is to write simple test programs in C or C++ and see what they compile down to with GCC. I find that '-O' gives me the most readable "direct" assembly.
http://asm.sourceforge.net/howto/Assembly-HOWTO.html
Also, if you have any specific questions, possibly a tutorial or two... well, it's time that I started putting together a website.
I found a great book at Chapters in Canada called Code. It's certainly very interesting, as it deals with all sorts of "code", from Morse to C#, and it looks at how they are related. A surprising break from tutorial books, which I also love.
Isn't this argument kind of a strawman?
Who says that self-documenting code means absolutely no comments? Even the biggest champion of self-documenting code, Uncle Bob, devotes an entire chapter in Clean Code to effective commenting practices.
The idea of "self-documenting code" is that comments are at best a crutch to explain a bad design, and at worst, lies. Especially as the code changes and you then have to update those comments, which becomes extremely tedious if the comments are at too low a level of detail.
Thus, while code should be self-documenting, comments should be sparse and have demonstrable value when present. This is in line with the Agile philosophy that working code is more important than documentation, but that doesn't mean that documentation isn't important. Whatever documents are created should prove themselves necessary instead of busy work that no one will refer to later.
Uncle Bob presents categories of "good comments":
Some examples of "bad comments":
Excellent post, thanks; it's always enlightening to get historical perspective like this.
in a similar vein I recommend Deep C Secrets which is overflowing with historical anecdotes about the evolution of C and Unix systems
You'll find your answers in this book. Great both as a tutorial and a reference.
The TL;DR version is that with a bit of cleverness you can use redundancy in your data structures to save time and memory. For example, naively implementing a purely functional stack is easy peasy. Just take an immutable linked list; all stack operations are O(1) time and space.
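A minimal sketch of that stack in Python (tuples as the immutable cells; names are illustrative):

```python
# Persistent stack as an immutable linked list: each cell is a (head, tail)
# tuple. Every operation is O(1) in time and space, and old versions of the
# stack remain valid because nothing is ever mutated.

EMPTY = None

def push(stack, value):
    return (value, stack)   # new cell pointing at the old stack

def peek(stack):
    return stack[0]

def pop(stack):
    return stack[1]         # the old tail *is* the popped stack

s1 = push(push(EMPTY, 1), 2)
s2 = push(s1, 3)
print(peek(s2))        # 3
print(peek(pop(s2)))   # 2 -- and s1 is untouched, still (2, (1, None))
```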
>So what did you do? Anyone else have a formal CS education and feel like they came out of it with nothing?
I graduated in 2006 and I've been doing web development professionally for almost four years now. Until about two weeks ago, I felt like I could have skipped my entire five years at school, because most of the stuff just doesn't apply to web development, since it's so far abstracted from the hardware. I was reading my algorithms book on the toilet the other day when I realized that I learned a shitton at school and it gave me an incredible advantage over the guy who learned web development on the fly. It helps to go back and re-read things after you have a context to put them into, so you can apply the theory to what you've learned.
It took me a long time to start getting designs down. You have to make a lot of mistakes before you can learn from them. It's as simple as that. Don't get discouraged. If you haven't read Head First Design Patterns, buy that book right now and read it cover to cover. I had read design pattern catalogs, but none of them conveyed the 'why' as well as HFDP did. They don't stop at abstract examples like "a car has wheels, a Ford is a car". They have real code examples, and they show you why you should favor composition over inheritance, and why you should follow the Law of Demeter.
I've entertained the notion of starting over several times. Don't quit, and don't get discouraged. If you ever get to the point where you think you've learned all you need to learn and you're writing code that can't be improved, start over.
Disclaimer: This may not be true for your job, but it has been for every job I have worked at.
That everything they are teaching you about algorithms will not be useful to you when you get into the field. Your education starts day one at your first job. Clients don't pay us to innovate in algorithms. They pay us to find and glue together other people's libraries, and to use this to present them the requested information.
Code you will actually be writing:
Things you will be doing that a CS degree does not prepare you for
I would suggest reading books like Design Patterns, Mythical Man-Month and Code Complete
Sorry to see this getting downvoted. Read the about page to get an idea of why /u/r00nk made the page.
I have to agree with one of the other comments that it is way too terse at the moment. I remember when we learnt about e.g. d-latches in school, and it was a lot more magical and hard to wrap your head around at first than the page gives credit for. The fact that AND, OR, and XOR gates can be built up from just NAND gates (the only logic gate properly explained) is also glossed over. Either go over it, or don't show the interiors of the other logic gates.
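For anyone following along, the NAND construction the comment alludes to is small enough to sketch in a few lines (Python, with 0/1 as bit values; names are illustrative):

```python
# Every other gate can be built from NAND alone.
def nand(a, b):
    return 1 - (a & b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))        # NAND, then invert

def or_(a, b):
    return nand(not_(a), not_(b))  # De Morgan: a OR b == NOT(NOT a AND NOT b)

def xor_(a, b):
    n = nand(a, b)                 # the classic 4-NAND XOR
    return nand(nand(a, n), nand(b, n))

# Truth-table check against Python's bitwise operators.
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
        assert xor_(a, b) == (a ^ b)
```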
The interactive stuff is really neat. Good work on that.
Edit: If anyone reading wants to learn this stuff in more detail, two good books are
Edit, 9 hours later: Just so people don't think I'm bitching that this post is only 84% upvoted, it was struggling at below zero points and 42% upvoted when I first commented.
I've enjoyed Working Effectively With Legacy Code by Michael Feathers.
It is an excellent survey of techniques for breaking up monolithic codebases into independently testable modules. He's a strong advocate of test-driven development, which I would argue is rather critical to successfully redesigning your application.
The mythical man-month https://www.amazon.com/gp/aw/d/0201835959/ref=dbs_a_w_dp_0201835959
The first version was published 40 years ago, but its content was reviewed and updated when the current version was written, 20 years ago. And you'll be surprised to realize that developers still keep making the same mistakes as back then...
BTW: Also check out The Pragmatic Programmer. It's also focused on best practices gained from real-world experience.
https://www.amazon.com/gp/aw/d/020161622X/ref=dbs_a_w_dp_020161622x
I haven't seen this suggested, but it is a good book on exactly that topic.
Clean Code by Robert Martin.
and I have to mention...
Growing Object-Oriented Software, Guided by Tests by Steve Freeman and Yours Truly, which frequently touches on that topic
Personally I'm very partial to Design Patterns: Elements of Reusable Object-Oriented Software. I see Java more and more as a software engineering language, rather than a programming language. You can do programming more effectively in Python/Jython or (J)Ruby, but Java for me is still king for developing type-safe, robust libraries and unit testing.
You might also want to read up on Eclipse or another decent IDE. Eclipse dramatically reduces the amount of monkey typing that all Java developers must endure, with things like templates, getter/setter generation, delegate method generation, etc. Since the editor parses the code as you type and keeps an AST in memory, refactoring support is excellent, and you'll spend less time worrying about minor design issues when starting a new project. Code is compiled on the fly, so startup times are minimal. It's also able to produce very descriptive and useful information about any errors you might have in your code (unlike GCC, for instance :)
If you want to know a bit more about how the JVM itself works, read The Java Virtual Machine Specification, Second Edition, which is online and free. It'll give you a bit more insight into why some crappy things are as crappy as they are (backwards compatibility with Java 1.1). Read books that are recent enough to include the language features of 1.5 and 1.6, such as static imports, enums, generics, varargs and so on, and decompile some Java code to see how the compiler implements them.
An excerpt from Working Effectively with Legacy Code by Michael Feathers:
> Unit tests run fast. If they don't, they aren't unit tests.
>
> Other kinds of tests often masquerade as unit tests. A test is not a unit test if:
>
> 1. It talks to a database.
>
> 2. It communicates across a network.
>
> 3. It touches the filesystem.
>
> 4. You have to do special things to your environment (such as editing configuration files) to run it.
>
> Tests that do these things aren't bad. Often they are worth writing, and you generally will write them in unit test harnesses. However, it is important to be able to separate them from true unit tests so that you can keep a set of tests that you can run fast whenever you make changes.
Funny you should mention The Pragmatic Programmers. When I started working on The D Programming Language, I was seriously discussing working with TPP. (They pay very good royalties, for one thing.) After a few discussions, it became clear that they wanted me to obey their exact format and toolchain, which I found rather limiting. They wouldn't accommodate some simple requests, such as multi-page tables. So I decided to go with Addison-Wesley Longman instead, which gives me total control over format (I send them the final PDF). I think this will be a win for the reader.
Still a C noob, but I originally started it by studying The C Programming Language. http://www.amazon.com/The-Programming-Language-Brian-Kernighan/dp/0131103628
Is this a bad place to start? I thought since it was well approved of historically it would be a good base. Thoughts?
That depends on the book. Books on frameworks or specific languages are rarely useful -- I find that online reference manuals are the best for that.
However, books like TCP/IP Illustrated, The Art of Multiprocessor Programming, Compilers: Principles, Techniques, and Tools, Introduction to Algorithms and similar tend to age pretty well, and I still find myself pulling them out and referring to them quite often.
If you're looking for a good book on this subject, I'd recommend checking out Working Effectively with Legacy Code. It's 90% about unit testing, but offers a lot of great advice.
This is awesome! I've been slowly getting more and more interested in hardware, and this is something I would absolutely love to do. I just don't know where to start.
I've been reading a couple of books about learning lower level stuff, and planned on working my way up.
I'd really like to get out of webdev and into low-level programming, or even hardware design and implementation. There's sooooo goddamn much to learn, that I doubt I'll be ready without getting a BS in Comp. Engineering, and maybe a master's as well.
(I'm absolutely a beginner, and if anyone is interested in the books I've been reading, these are they:
CLRS is a great book and you'll likely need it in any reasonable CS dept. It's very heavy on the math and might be a bit over your head right now, but it's a solid book. It's a bit more than $50, but Operating System Concepts is also a good buy. I personally don't go for language-specific books, since they quickly go obsolete, but books on fundamentals are very useful to have in your library.
I highly recommend the book Code. I read it in middle school and it was absolutely fascinating. Pretty short too.
Friedman & Felleisen's The Little Schemer seems noteworthy. Unorthodox, but nicely done.
And while I think that Meyer's technical writing isn't exactly the best, Object-Oriented Software Construction has a nice visual layout and is one of the few computer books that uses color effectively.
> If you can spare the ram and computing time, sure. This also exists in OOP under the name of Memento pattern but is hardly ever applied because of how slow it can be with big data sets.
The advantage with immutable data structures is that your "modifications" are stored as a delta from the original so the memory requirements are fairly low. [0][1] You probably would have plenty of ram to spare.
> How do you write the following in FP, with a single stack?
(def graph (atom #{#_"vertices go here"}))
(def stack (atom (list)))

(let [some-value 42.0]
  (def my-command
    {:do   (fn [graph] (map #(merge %1 {:length (+ (:length %1) some-value)}) graph))
     :undo (fn [graph] (map #(merge %1 {:length (- (:length %1) some-value)}) graph))}))

(defn apply-command [cmd]
  ;; replace the graph with a version mutated by do
  (swap! graph (:do cmd))
  ;; put the undo function on the stack
  (swap! stack conj (partial swap! graph (:undo cmd))))

(defn undo-last []
  (swap! stack
         (fn [stack]
           ;; run the undo fn
           ((first stack))
           ;; return the stack sans the top element
           (rest stack))))

(apply-command my-command)
(clojure.pprint/pprint @graph)
(undo-last)
(clojure.pprint/pprint @graph)
But you probably wouldn't have the graph as a global atom; some-value would be injected into the command, etc., etc.
[0] https://www.amazon.com/Purely-Functional-Structures-Chris-Okasaki/dp/0521663504/ref=sr_1_1?ie=UTF8&qid=1493603954&sr=8-1&keywords=purely+functional+data+structures
[1] http://blog.higher-order.net/2009/09/08/understanding-clojures-persistenthashmap-deftwice.html
Edit: formatting, do was + undo was - in the original, add usage at the end
An immutable list is implemented as described by the other responses, but equivalent immutable data structures exist for all mutable data structures. Immutable arrays with O(1) lookup and O(1) 'assignment', which of course enables O(1) dictionaries. And all the others.
This talk by Rich Hickey, creator of Clojure, has a good example of how it works (about 23 minutes in, but the rest of the talk is good). Also see Purely Functional Data Structures for an in-depth look at it, and many more.
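A toy sketch of the structural-sharing idea in Python: a binary trie with path copying. Real persistent vectors (e.g. Clojure's) use 32-way branching, which makes the logarithmic depth effectively constant. All names here are illustrative.

```python
# Toy persistent "array": a fixed-depth binary trie built from nested tuples.
DEPTH = 4  # 2**4 = 16 slots

def build(values):
    """Pack up to 16 values into a trie of nested 2-tuples."""
    nodes = list(values) + [None] * (2 ** DEPTH - len(values))
    for _ in range(DEPTH):
        nodes = [tuple(nodes[i:i + 2]) for i in range(0, len(nodes), 2)]
    return nodes[0]

def lookup(root, index):
    """Walk the bits of `index` from the root down to the leaf."""
    node = root
    for level in range(DEPTH - 1, -1, -1):
        node = node[(index >> level) & 1]
    return node

def assoc(node, index, value, level=DEPTH - 1):
    """Return a new trie with slot `index` replaced. Only the DEPTH nodes on
    the path to the leaf are copied; everything else is shared as-is."""
    if level < 0:
        return value
    bit = (index >> level) & 1
    children = list(node)
    children[bit] = assoc(node[bit], index, value, level - 1)
    return tuple(children)

v0 = build(range(16))
v1 = assoc(v0, 5, 99)
print(lookup(v1, 5), lookup(v0, 5))  # 99 5 -- the old version is intact
print(v0[1] is v1[1])                # True -- untouched half of the trie is shared
```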
I think SICP is one of the greatest books I've ever read and that anyone who is serious about programming should read it (or be aware of the ideas discussed there).
However, it is a daunting book especially for newcomers (doubly so if the newcomer wants to get the most out of the book and wants to do every exercise).
I would recommend a book such as Simply Scheme to build up some background knowledge before tackling SICP.
I also highly recommend the Schemer series: Little Schemer, Seasoned Schemer.
> That's not an experiment, it's an anecdote about a thing that happened once.
It's a whole class, taught by a not exactly nobody professor. If it was one student, that would be anecdotal. But this is a sizeable sample, bordering on "statistically significant". As for "happened once", I'm sure he taught several other similar classes since then. Maybe we should ask him how it went?
A better argument than the worn-out "anecdote" is to suspect the evidence of being filtered, one way or the other. I presented the argument to counter some point I believed false, but nothing guarantees that I didn't know of, and omitted, arguments to the contrary. (There are many reasons why I might do so, including but not limited to self-deception, dishonesty, conflict of interest…) I will just hereby swear that I do not recall ever having encountered evidence that mandatory indentation was either detrimental or neutral to the learning of programming languages. Trust me, or don't.
> And it's an anecdote about introducing absolute novices to programming.
It was their second language. I assume they programmed for at least a semester.
> Even if it were an experiment, experiments don't provide arguments, they provide data to use to test arguments.
Experiments provide evidence for or against hypotheses. Pointing out "hey, look at this experiment that crushes your beliefs flat!" is the argument. Which may have flaws besides the experiment itself (the results of the experiment may have failed to crush my beliefs flat, and I misread the paper).
</pedantic>
---
> And even if this were an experiment with a result compatible with an argument about indentation, there's no reason to think that this would have any bearing on infix expression shenanigans in Lisp.
I agree. Yes, you have read that correctly, I agree.
<Sardonic smile…>
There is something I suspect you and many others in this thread have totally missed: sweet expressions are not just about infix expressions. That's a detail. The crux of sweet expression is actually significant indentation. Here:
define (factorial n)
  if (<= n 1)
    1
    (* n (factorial (- n 1)))
I don't like the last line (too many parentheses). Let's try this:
define (factorial n)
  if (<= n 1)
    1
    factorial (- n 1)
So, while results about indentation don't have any bearing on infix notation, they do have direct bearing on sweet expressions as a whole.
> You're pretty sloppy when you address something that seems to support your position, aren't you?
You just deserve my smug smile :-D
Thank you all for your responses! I have compiled a list of books mentioned by at least three different people below. Since some books have abbreviations (SICP) or colloquial names (Dragon Book), not to mention the occasional omission of a starting "a" or "the" this was done by hand and as a result it may contain errors.
edit: This list is now books mentioned by at least three people (was two) and contains posts up to icepack's.
edit: Updated with links to Amazon.com. These are not affiliate - Amazon was picked because they provide the most uniform way to compare books.
edit: Updated up to redline6561
Well damn, I see what you mean. Sorry for flying off the handle there. Here's an upvote.
Also, check out the book Code. It's really great at explaining how all transistor and low level crap works. It's good; take it from a guy who pretty much loves software up until the point it starts turning into electrical engineering.
I read an interview a long time ago where one of the K&R guys said something to the effect of "Yeah, we got some of the operator precedence rules wrong. We noticed it after a while, but we didn't change it because by that time there were hundreds of kilobytes of code written in C and it didn't seem fair to change it."
I don't remember where I read that; it might have been in this book. Incidentally the fish on the cover is a coelacanth, pronounced "C-le-kanth" and it remained undiscovered until the 20th century, making it a "Deep C secret". I love the double pun and the book was quite good.
Thank you for this -- I will certainly try this out. I can also recommend Michael Feathers' book, Working Effectively with Legacy Code
Charles Petzold also wrote Code: The Hidden Language of Computer Hardware and Software. It's a great book. I'm sure most of the people browsing this subreddit will already understand most of what is in the book (or have read it already) but fantastic read nonetheless.
There's a pretty good book that touches the refactoring as learning tool subject: Working effectively with legacy code.
Introduction to Algorithms is an absolutely fantastic book. I've read it through a couple times. It's very well written and they have plenty of descriptive diagrams to help you intuitively grasp the different algorithms.
I admire your dogged adherence to being wrong in every particular. It takes a special brand of stubborn contrarianism to quote someone's badly edited notes as a primary source and then followup by a claim that this is best possible research.
However, outside in the real world, Alan Kay writes extensively and authoritatively here and in his numerous contributions on Hacker News quite aside from publications spanning decades.
And an awful lot of people agree with his definition. The introduction of the classic Design Patterns defines objects as an encapsulated package that only responds to messages. People who teach OO programming readily quote Mr Kay's definition. The Ruby programming language is fundamentally based upon it, and before you shout "but Ruby has classes", note that Ruby classes are actually themselves objects, for which the "new" message happens to do something particular by convention. And so on; the point being that Alan Kay's definition is super influential, which is why the idea that Erlang is the most object-oriented language is not a new proposition.

You've probably come across it already, but if not, Code Complete is similarly old, but still well worth your time (I actually found it better than Pragmatic... first time I came across it)
No. People say Cormen's Intro to Algorithms text is dense and inaccessible (though I thought it was very accessible when I first encountered it in college). TAoCP is way more dense and inaccessible than Intro to Algorithms.
I would recommend Intro to Algorithms if you want to seriously dive into algorithms. The newest version features improved pseudocode, which is pretty clear and instructive—especially compared to the MIX assembly code in TAoCP.
FWIW, the "applicative order Y-combinator" is the punchline of The Little Schemer, which I highly recommend.
Most explanations of it describe it as "how you do recursion without names," which is a good enough operational description, I guess. What it really is, historically, is the proof of Curry's paradox, a simplification of a proof that was originally arrived at by Alonzo Church's grad students, Stephen Kleene and John Rosser, as a demonstration that the untyped lambda calculus is logically inconsistent.
In other words, its significance isn't as a tool for implementing general recursion. Its significance is as a proof that the untyped lambda calculus isn't useful for what it was originally intended (i.e. use as a formal logic), and motivates the introduction of type theory in logic and computer science.
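For reference, the applicative-order version (often called the Z combinator, since strict evaluation needs the extra eta-expansion) can be sketched in a couple of lines of Python:

```python
# Applicative-order Y (Z) combinator: the inner `lambda v` delays evaluation
# so the self-application x(x) doesn't loop forever in a strict language.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# "Recursion without names": fact never refers to itself.
fact = Z(lambda rec: lambda n: 1 if n <= 1 else n * rec(n - 1))
print(fact(5))  # 120
```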
I can easily recommend the book Code: The Hidden Language of Computer Hardware and Software ; fascinating stuff!
/me suggests reading Working effectively with legacy code which helps by providing more tools for proactive handling of "legacy issues".
True. I can highly recommend reading Working Effectively with Legacy Code by Michael Feathers.
Try this book: Code, which is a bottom-up approach. Depending on how rigorous your college CS curriculum was, it'll be either a good review of your college classes or mind-blowing, but I think that the approach that the book takes is really great.
I'd certainly recommended Charles Petzold's "Code" as a source of material:
http://www.amazon.co.uk/Code-Hidden-Language-2nd-DV-Undefined/dp/0735611319/ref=sr_1_1?ie=UTF8&s=books&qid=1265836950&sr=1-1
Great explanations of all the points you mention
Sounds like a good case to harness this legacy codebase in a unit test framework.
See Working Effectively with Legacy Code.
Kernighan and Ritchie's The C Programming Language
It will teach you almost all you need to know. It is written by the creators of C and is still the best way to learn C.
I even use it as a reference for parts of C I do not use often.
Also, you should try to be running a unix os, since c programming will be much more integrated and easier.
link: http://www.amazon.com/Programming-Language-Prentice-Hall-Software/dp/0131103628
As mentioned by this paper, “Beautiful Concurrency (PDF)”, Simon Peyton Jones' chapter in Beautiful Code is well worth the read.
I thought it was one of the best chapters in the book.
http://www.amazon.com/JavaScript-Good-Parts-Douglas-Crockford/dp/0596517742
As long as you stick to the good parts Javascript is a pretty nice language that supports both functional and OO styles.
Once you use things that aren't the "good parts" then you are going to run into trouble.
If you're really in this situation, go read Working Effectively With Legacy Code by Michael Feathers.
Then have a look at Martin Fowler's work on "Strangler Applications". This blog post looks like a nice overview.
These things will save you from a world of pain.
EDIT: Added links and reformatted.
K&R and Expert C Programming are the best books I have read on programming. http://www.amazon.com/Expert-Programming-Peter-van-Linden/dp/0131774298
If you fully understand C, then you will better understand how computers work and be able to debug issues in other languages, where details are hidden in the language libraries and syntax.
Not a video, but Code: The Hidden Language of Computers by Charles Petzold is really great.
They're not really that subjective. It's quite clear what a difference these changes make.
There's plenty of great advice in this book: http://www.amazon.co.uk/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882
In Michael Feathers' Working Effectively With Legacy Code, he recommends a technique of refactoring for understanding. Clean up the code as you try to understand it: extracting methods, renaming variables, etc. You don't have to commit the changes, but the act of refactoring might just foster better understanding.
Yes. The best book I've read that explains the computer from really "the ground up" is Charles Petzold's Code. Even my literature-graduate girlfriend understood it.
Doesn't answer your question . . . but get this book. It sounds like it would help.
Seriously, that book is awesome. It seems like, for a while, my CS professors were able to pick whatever books they wanted in teaching their classes. So we had theory of computation with Sipser, algorithms and complexity with this book (not sure if it has a shorthand reference), compilers was the Appel book by preference, with several others acceptable.
Then, I would guess, someone complained about "difficulty". The administration got involved and things are more "approachable" now. Not so wonderful anymore.
Also, for me this book handled abstract algebra in much the same way that Sipser handles the theory of computation.
In broad strokes it's a goal tracking app with community interaction.
When it comes to design, there are a few important pieces that you want to disconnect from each other for various reasons. Any app will have a view, a controller and a model, or some variation thereon (there are half a dozen major schools of thought on this and dozens of sub-schools bickering violently about them, but in essence that's always true). Usually you'll have one controller for each view, which handles all of the interactions with the view. The really interesting part starts coming in when you get down to the model level, where you're creating abstract representations of all of your data and doing manipulations on that data. Some people argue that manipulation should primarily happen in controllers, some in models. I personally have model controllers that use facades and interfaces to be accessed by and to access the other controllers and model(s). In those model controllers I do any manipulations and essentially leave the models as pure(ish) data structures which hold representations of what's going on.
Now I haven't really started designing the app, because we haven't officially been brought on as a team to do it (no money down = no work yet), but I can tell you that I have basic ideas on how I'll divvy up the model into manageable segments that can stand even crappy outsourcing.

Let's take the goal portion of this software. Each goal will need to have a name, a list of users who are all competing on the goal, and the metric they're competing for. It'll have to have % completion for each of the users on the goal saved somewhere, and it'll need to be fetched from the server whenever someone loads up their app, so that the most up-to-date version of everyone's progress is displayed. There will also need to be server-side pushing happening to let players know when there is completion on a goal ("Friend A finished the 100 push ups goal today!"). So that piece of the model needs some server connections.

Well, then I need to establish an interface for the server, so that I'm not calling methods that might be renamed or changed entirely all over my code. The interface becomes a contract to always do things a certain way. So I create a [server fetchUpdatedGoals] call that happens when my app loads, and the object which inherits from the ServerInterface class (ServerProxy in this case) will then ask the server for the data. Having collected that data, the server proxy will then need to set it in the model. This can be done a couple of ways: either with callbacks/observers or with hard-coded access changes. If we went the hard-coded access changes route, I would package up all of the goal data (for many different goals) and send it to a facade between the server class and the main data model. In that facade I would unpackage the data sent by the proxy into the individual goals, then set them within the model.
The use of a facade means that even if the method signatures within the model change I don't have to change code all throughout the program, it blocks that ripple effect and keeps it localized in the facade. So if I change it from [model setGoal:int toGoalObject:GoalObject] to [model updateGoalObject:GoalObject] I would only have to make changes in the facade to change from the previous signature to the new signature.
That's one tiny piece of a much larger puzzle where your entire goal is to break things down into as small of pieces as you can and then separate them from each other as much as possible so that they're not interdependent on implementation but rather only on interfaces and facades. The design before you write philosophy means that you are less likely to have to make changes after the code is written and any changes you do have to make will be limited in scope. It also generally allows for easier unit testing as you have more modular systems that can run separately from others using dependency injection to create false analogues to deliver canned data for testing purposes.
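The proxy/facade seam described above can be sketched roughly as follows (Python for brevity; every class and method name here is made up for illustration, not taken from the actual app):

```python
class GoalModel:
    """Pure(ish) data holder; its method names may churn during development."""
    def __init__(self):
        self.goals = {}

    def update_goal(self, goal):
        self.goals[goal["name"]] = goal

class ModelFacade:
    """The single seam between the server side and the model. If the model's
    method signatures change, only this class is edited; the ripple stops here."""
    def __init__(self, model):
        self._model = model

    def store_fetched_goals(self, payload):
        for goal in payload:
            self._model.update_goal(goal)

class ServerProxy:
    """Implements the 'fetch updated goals' contract against the facade."""
    def __init__(self, facade):
        self._facade = facade

    def fetch_updated_goals(self):
        # Stand-in for a network call returning packaged goal data.
        payload = [{"name": "100 push ups", "completion": {"friend_a": 1.0}}]
        self._facade.store_fetched_goals(payload)

model = GoalModel()
proxy = ServerProxy(ModelFacade(model))
proxy.fetch_updated_goals()
print(model.goals["100 push ups"]["completion"])  # {'friend_a': 1.0}
```

Swapping the model's `update_goal` signature for something else would touch only `ModelFacade`, which is the whole point of the seam.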
If you're seriously thinking of switching over to software design I cannot recommend this book enough: Clean Code by Robert Martin. It talks about how to design code that is maintainable and simple. And my descriptions here might have rambled a bit (especially compared to how well he describes things) but I only blame that on me being sick with a bit of a fever. I'd love to answer any questions though.
LWRellim is exactly right.
Tomorrow morning, go to your local library or bookstore and find this book. You don't have to read the whole thing right away, but if you can spend the afternoon with it at least, it's worthwhile.
Chapter 11 especially is appropriate here, where you're in the position of "flushing it all down the proverbial crapper": Plan to Throw One Away. One can also use a line from The Matrix: everyone falls the first time.
I'm not going to say that every software project bombs the first time around. That's definitely not the case. But many times, especially on your first custom programming experience, no matter how well you think you understand what needs to be built, you will paint yourself into corners and discover all kinds of stuff that you had never considered before you started to build the first version.
Depending on where you are in the project, and how things have been put together, you may have to literally toss everything. In your case, you might be able to salvage some UI bits, but your new developer will probably redo everything else.
Some books I enjoyed:
The Algorithm Design Manual by Steve S. Skiena, $61.15
Real Time Rendering, 3rd. Edition by Tomas Akenine-Moller, Eric Haines, Natty Hoffman, $71.20
Structure and Interpretation of Computer Programs, by Hal Abelson, Jerry Sussman and Julie Sussman, Free
Clean Code by Robert C. Martin, $37.85
http://www.amazon.co.uk/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882
This book covers the craft of writing clean and, in particular, comment-free code perfectly.
It teaches you to name your variables and functions in such a way that the code comments itself. To do this you may end up refactoring your code into smaller, more reusable chunks -- which is equally good, and it gets you away from having stale, out-of-date comments floating around.
Essentially, when you go to write a comment, ask yourself why the code needs one. Then rename the variable, or extract the piece of code into its own function and name it so that it explains itself.
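As a small illustration of that renaming-and-extraction habit (a hypothetical Python sketch, not an example from the book):

```python
# Before: the code needs a comment to be understood.
def calc(d):
    # remove entries older than 30 days
    return [x for x in d if x["age_days"] <= 30]


# After: the names carry the meaning, so the comment becomes redundant.
MAX_AGE_DAYS = 30


def is_recent(entry):
    return entry["age_days"] <= MAX_AGE_DAYS


def recent_entries(entries):
    return [entry for entry in entries if is_recent(entry)]


entries = [{"age_days": 5}, {"age_days": 45}]
print(recent_entries(entries))  # -> [{'age_days': 5}]
```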
Meh, these sorts of definitional quibbles annoy me. What is the point of manufacturing a distinction like this that exists pretty much solely as a "gotcha"? The fundamental idea at work is that you're re-using the answers to subproblems in order to compute the answer to the big problem. As long as you've got that idea down, what's the huge philosophical difference in the order the subproblems are computed? I'm not saying that the distinction is invalid -- there is a difference for sure -- just not one of great importance. In fact, if I were explaining it, I'd introduce DP as a particularly clean form of memoization -- one where you have spent effort to formulate an algorithm in a way that guarantees all subproblems are solved before their solutions are required in the next tier of problems.
I'm not 100% sure, but I think I remember Introduction to Algorithms explaining it this way. They may also have gone for a taxonomy where memoization is the specific mechanism of storing results of previously solved subproblems and dynamic programming is the application of that to algorithms (thus there would be top-down dynamic programming and bottom-up dynamic programming).
Edit: syntax (changed "computed in" to "computed")
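Both framings fit in a few lines of Python: the memoized version solves subproblems top-down as the recursion reaches them, while the DP version orders them bottom-up so every answer is ready before it's needed.

```python
from functools import lru_cache


# Top-down: memoization stores answers to subproblems as the recursion hits them.
@lru_cache(maxsize=None)
def fib_memo(n: int) -> int:
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)


# Bottom-up DP: subproblems are solved in an order that guarantees each one's
# dependencies were computed first -- no recursion, no cache lookups.
def fib_dp(n: int) -> int:
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a


assert fib_memo(30) == fib_dp(30) == 832040
```

Same subproblem reuse, same asymptotic cost; only the order of evaluation differs.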
You know how you wish you could relive the experience of your first time watching your favorite movie or listening to a mind-blowing song? That's how I feel about reading Code and The Elements of Computing Systems. I'm a self-taught programmer with impostor syndrome who's spent a career playing catch-up to formally trained comp-sci-ers. When I first read these books, they utterly blew my mind.
I think you're confusing poorly factored code with a holistic piece. You can always find a way to split up a long method into smaller, more meaningful pieces -- you may end up with more lines of code, but it's about how easily the code is understood, not the disk space it takes to store the code.
It's not a simple choice between "stuff it all in a class" and "stuff it all in a function." Refactoring has a huge list of ways of cleaning up existing code -- I'd suggest getting a copy and reading it.
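A tiny, hypothetical Python sketch of the kind of split being described -- more lines of code, but each piece now has exactly one job:

```python
# One function mixing two concerns: parsing and summarizing.
def report(raw: str) -> str:
    parts = [p.strip() for p in raw.split(",")]
    nums = [int(p) for p in parts if p]
    total = sum(nums)
    return f"{len(nums)} values, total {total}"


# The same behavior split into smaller, more meaningful pieces.
def parse_values(raw: str) -> list[int]:
    return [int(p.strip()) for p in raw.split(",") if p.strip()]


def summarize(nums: list[int]) -> str:
    return f"{len(nums)} values, total {sum(nums)}"


def report_refactored(raw: str) -> str:
    return summarize(parse_values(raw))


assert report("1, 2, 3") == report_refactored("1, 2, 3") == "3 values, total 6"
```

Each small function can now be read, tested, and reused on its own.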
Do not read any of the stuff this poster put out there :OD
Design Patterns: A book written in the early '90s to get around problems of early-'90s languages. It's practically as old as you are.
Mythical Man Month: This is about working on software in large organizations. It is not at all related to what you're going to be doing for a number of years.
Code Complete: This is out of date; at best, you should be reading Code Complete 2. I would say you should wait a couple more years to read even that: while it goes through all the correct ways to build software, most of it will fly right over your head, and you will likely take its guidelines as dogmatic rules.
Now, estimation: in a year or so, I'd think about reading the Estimation book by the Code Complete guy -- but only think about it. It will help you budget your time a little better, but you'll benefit from it most once you've had an internship or two.
I highly recommend JavaScript: The Good Parts. I'd say to read that one first because it explains how you should think when programming in JavaScript. Knowing the syntax and function names is no good unless you know how to use them.
Perhaps your course was bad, rather than microprocessors? Purchase and read The Elements of Computing Systems: Building a Modern Computer from First Principles. It's only $25, and if you can't afford that, most of the materials for the book are available for free online.
After Gödel, Escher, Bach: An Eternal Golden Braid, it's one of my favorite books ever. I already knew a lot about programming, but this helped me understand how circuits, timing, and compilers worked in a very nice way.
There are several good books on designing good software:
Code Complete
Design Patterns
Refactoring
The darcs people (and the pure FP world) have been talking about the similarities between persistant structures, transactional memory and rollback, and revision control for years, FWIW. Good to see these ideas becoming more widespread.
E.g. here's a similar, older triple:
and I'll throw in NixOS, a Linux package manager based on immutable packages.
This is what the FP revolution looks like, I guess.
This is a good read: Working Effectively with Legacy Code
Basically,
P.S. I assume you have already studied Refactoring by Martin Fowler.
So basically Uncle Bob forgot how to actually refactor, and didn't bother whipping up his copy of Refactoring?
Automated refactoring tools are a good thing, because (especially in statically typed languages) they make refactoring safer (though not safe: if you use e.g. code generation or reflection, your refactoring browser won't help you), more systemic, more systematic, and much faster.
But they were never necessary, the basic concepts of refactoring were tool-less, and Refactoring lists only manual recipes (with a discussion on refactoring browsers in another chapter), some of them -- if I remember well, it's been some time since I've read it -- not even being automatically implementable.
Yes. Got every answer correct, with correct explanation.
There are a few very, very good books on this topic I can only suggest everyone to read:
Basic:
C++ Primer
Effective C++
Effective STL
Advanced:
More Effective C++
C++ Coding Standards: 101 Rules
C++ Templates The Complete Guide
Large Scale C++ Design
My favorite CS book. Reading this made it all click into place for me. Here is a link to the original book on Amazon: https://www.amazon.com/dp/0262640686/ref=cm_sw_r_fm_apa_aEzPAb1SG38F7.
I think he is spending too much time complaining about what Go is missing. He isn't complaining that there was a design decision for its absence, but that it isn't there yet. I don't see much complaint about the concurrency solution it implements.
I'd recommend the author and anyone feeling his pain take a real good look at D v2.0 in six months. Maybe pick up the book.
I really like Working Effectively With Legacy Code.
I just picked this up the other day, and was surprised how good the advice was. It's geared towards programmers, but has a lot of interesting lessons that can be applied to anything.
Pragmatic Programmer: From Journeyman to Master
The book has a different approach than standard textbooks. While all other books I have read classify algorithms by the problem they solve, this book classifies algorithms by the technique used to derive them (TOC). I felt that the book tries to teach the thought process behind crafting algorithms, which, to me, is more important than memorizing details about a specific algorithm. The book is not exhaustive, and many difficult topics are ignored (deletions in Red-Black trees, for example). The book is written in an engaging style, and not the typical dry academic prose. I also liked the use of puzzles as exercises.
The only other comparable book, in terms of style and readability, is that of Dasgupta, Papadimitriou and Vazirani. But I like Levitin's book better (only slightly). These two books (+ the MIT videos) got me started in algorithms; I had to read texts like CLRS for thoroughness later. If someone is starting to study the topic my recommendation would be to read Levitin and get introduced to the breadth of the topic, and then read CLRS to drill deeper into individual problems and their details.
Haskell does throw a little too much at you at once, I agree. Another option though would be to go straight with ML, using The Little MLer. It's like The Little Schemer but, shockingly, in (S)ML. It'll get you enough of types and recursion and consing and so on that Haskell will "only" be adding laziness, type classes and monads. At the same time, I think it's a very accessible book, and it mentions food a lot.
Or perhaps even learn Scheme, then ML, then Haskell. Make your life easier at each stage, and learn more languages to boot.
What you're saying doesn't make sense. Amazon has had a rating/review system for a very long time. You can access them here. As you can see, the reviews are overall very good and it is probably a good idea to purchase this.
No "Working Effectively with Legacy Code"? It could be titled "Pragmatic Testing".
The DDD blue book, OK. But you'd be better served by "Implementing Domain-Driven Design", which was written 10 years later and has a lot less focus on the technical implementation of DDD.
Mastering Regular Expressions should be there too.
> In other disciplines, engineering in particular, there exist treatises on architecture. This is not currently the case in software.
Gee I better throw out all those books on architecture since clearly they don't exist.
We can also ignore Fowler's book, GOF, SICP, and TAOCP, since clearly they aren't treatises on software either.
This is not a flippant or sarcastic answer -> "step by step" or "one part at a time"
Easier said than done for some, these books should help
http://www.spinellis.gr/codereading/
http://www.amazon.com/Working-Effectively-Legacy-Robert-Martin/dp/0131177052
For anyone who wants to learn more about compilers and loves reading books, the so-called dragon book is highly recommended reading on this topic: https://www.amazon.com/Compilers-Principles-Techniques-Tools-2nd/dp/0321486811
If you already have a basic idea of how the machine works, I really have to recommend D as a powerful and, above all, sensible high-level language that isn't bound to a single platform the way C# is. You can do (almost) everything you can in C++, and more (the almost is multiple inheritance and binding to C++ libraries, but there are ways around both). If you're curious, check out Andrei's book or ask for more info in our IRC channel (irc://irc.freenode.net/#d).
I would recommend this book even if your course isn't using it -- it has very good explanations at a non-genius level (rare in CS textbooks, it seems), and all the chapters are self-contained (meant to be read non-sequentially).
At my school this book is used for both semesters of algorithms (the intro and the advanced).
http://www.amazon.com/Introduction-Algorithms-Thomas-H-Cormen/dp/0262033844
If you're looking for reading material, you're at the right stage of coding to read Code Complete. It'll change the way you write code. It'll make your code more manageable when you start coding with a team. It'll teach you good coding practices. I highly recommend it.
There are a lot but my favorites are
https://www.amazon.com/Purely-Functional-Data-Structures-Okasaki-dp-0521663504/dp/0521663504/ref=mt_paperback?_encoding=UTF8&me=&qid=
https://www.amazon.com/Structure-Interpretation-Classical-Mechanics-Press/dp/0262028964/ref=sr_1_2?crid=3RF2M50NKPULH&keywords=structure+and+interpretation+of+classical+mechanics&qid=1572889989&s=books&sprefix=structure+and+inte%2Cstripbooks%2C207&sr=1-2
https://www.amazon.com/Structure-Interpretation-Computer-Programs-Engineering/dp/0262510871/ref=sr_1_1?crid=1CSGG60L6328Q&keywords=structure+and+interpretation+of+computer+programs&qid=1572890005&s=books&sprefix=structure+and+inte%2Cstripbooks%2C413&sr=1-1
There are also more mathy type theory books and category theory books you should check out but I'd probably start with those 3
Maybe too light for anyone with a CS/CE degree, but I'm presently enjoying The Elements of Computing Systems.
Thanks much, I appreciate that.
edit: Note, all, the one book I've seen on this.
Andrei's book on D: "The D Programming Language" would be another great read whether you are looking into the D language or not, it is a great read for general programming design IMHO :-)
Wish there were more books like this... http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&amp;qid=1334003626&amp;sr=8-1
For anyone interested in this kinda stuff I would really recommend "Code: The hidden language of computer hardware and software"
https://www.amazon.co.uk/Code-Language-Computer-Hardware-Software/dp/0735611319
Awesome. How about Clean Code by Robert C. Martin?
You need this.
http://www.amazon.com/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882
Among other things, there is a whole chapter on naming. It's fucking awesome.
Shouldn't the language be expressive enough so there is no miscommunication (and, thus, no need for semi-colons)?
Don't get me wrong, I agree with you that there is miscommunication in code, but it seems that's because the language itself has constraints that don't let developers express exactly what they want.
For a whole year, I was coding in C. I had to really push things toward what Robert C. Martin describes in Clean Code, even with other developers asking "did you have to create an API?" (the answer is yes, I had to, because it allowed me to be more expressive than C let me be).
Python, which I'm back to, on the other hand, I found can be really expressive -- and it doesn't use semicolons at all.
So it's not that semicolons avoid miscommunication; it's that the language is not expressive enough to avoid miscommunication.
fair enough, I shall tell you my favorite C programming book then
Deep C Secrets
Professor of computer science at U.S. Military Academy, probably best known for his book Purely Functional Data Structures.
As usual, Beautiful Code is late.
More Power to Cut-And-Paste!
http://www.amazon.com/Beautiful-Code-Leading-Programmers-Explain/dp/0596510047
Can't you get laziness with a function pointer (in a thunk struct perhaps) in any language from C up? Having it as the default is a syntactic convenience?
What are doubly linked lists really useful for? What about having two singly linked lists in opposite orders?
Would you start by buying this book?
http://www.amazon.com/Purely-Functional-Structures-Chris-Okasaki/dp/0521663504
I tend to use sets, maps, queues (priority, fifo), and lists (indexed and singly linked).
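Incidentally, the "two singly linked lists in opposite orders" idea is exactly Okasaki's batched queue. Here's a mutable Python approximation of it (the book's version is persistent; this sketch is not):

```python
# A FIFO queue as two stacks: `front` holds elements with the next-to-dequeue
# item last, `back` holds new arrivals in arrival order. The reversal happens
# only when `front` runs dry, which makes dequeue amortized O(1).
class BatchedQueue:
    def __init__(self):
        self.front = []
        self.back = []

    def enqueue(self, x):
        self.back.append(x)

    def dequeue(self):
        if not self.front:
            # Move everything over, reversing order, in one batch.
            self.front = self.back[::-1]
            self.back = []
        return self.front.pop()


q = BatchedQueue()
q.enqueue(1)
q.enqueue(2)
q.enqueue(3)
print(q.dequeue(), q.dequeue(), q.dequeue())  # -> 1 2 3
```

Each element is moved at most once from `back` to `front`, which is where the amortized bound comes from.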
The Little Schemer
This book is a great tutorial for "programming" in Lisp just by reading the book. You learn "Lisp as a concept" more than Lisp for work.
For anyone that hasn't read it, [Charles Petzold's "Code"] (http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&amp;qid=1407846258&amp;sr=8-1&amp;keywords=code+charles+petzold) is a great read through the history of computing and programming.
This book is awesome. I learnt a lot from it.
There is a book for that!
https://www.amazon.co.uk/Working-Effectively-Legacy-Michael-Feathers/dp/0131177052
Where legacy code is defined as code not protected by tests. Really great book.
>Start at the bottom. Some books I liked...
>
>Learn what a computer does: Computer Organization & design - Patterson & Hennessy
>
>Learn C: Programming in C - Stephen Kochan
>
>VERY IMPORTANT learn your data structures: Introduction to Algorithms
>
>You will have learn Java in university, I found this book good: Absolute Java 4th ed.
>
>This is just scratching the surface, a lot more to learn afterword.
Don't worry, FTFH
This is called "Learn C the Hard Way," but I think a better name is "C Programming for Idiots." Just dive in with the real stuff.
http://www.amazon.com/Programming-Language-2nd-Brian-Kernighan/dp/0131103628/ref=sr_1_1?ie=UTF8&amp;qid=1317921424&amp;sr=8-1
http://www.amazon.com/Linux-Programming-Interface-System-Handbook/dp/1593272200/ref=sr_1_1?s=books&amp;ie=UTF8&amp;qid=1317921571&amp;sr=1-1
On this topic I’d like to recommend this book:
https://www.amazon.com/Working-Effectively-Legacy-Michael-Feathers/dp/0131177052
Many others have answered this sufficiently but as it's your gf taking the course I thought I would chime in to suggest picking up The Little Schemer which is a cute book for informally learning the language and you will pick up cute-points.
I'll recommend Code, even though it isn't specifically theoretical. However, it does go over how code (semaphore, morse code) evolved over time. From someone who does program this is about a 'human' a book as they come which could fit exactly what you are looking for.
Introduction to Algorithms, Second Edition. Of course, don't miss Mr. Krumins' video lectures -- the perfect complement.
I don't know what he'd recommend, but I found the Dragon Book and Modern Compiler Design to be decent treatments of the subject. There are lots of interesting texts out there though.
Sorry for the cheeky reply.
I think one of the most profound articles I read about just what makes a computer "work" is Brian Kernighan's article, "What an educated person should know about computers". It's a decade old now and was developed into a book of similar name. (Amazon link)
Another was sitting down and reading Code: the hidden language of computing (Amazon link) and actually walking through it. The book is coming up on 20 years old, but Petzold (who has taught many a developer how to do fancy tricks with their silicon) really sat down and wrote a book that anyone could understand and come away from feeling better off and more knowledgeable about the way our world works. This is the book I refer new programmers and old knitting circle nannies to read when they ask how a computer works.
Andrei's book is just great, covering language design considerations as well:
http://www.amazon.com/D-Programming-Language-Andrei-Alexandrescu/dp/0321635361
Also, I don't know about good :) but I am translating my Turkish book on D into English, which is geared towards the novice programmer:
http://ddili.org/ders/d.en/index.html
If you check for nulls all the time, you're doing something wrong. Only when interfacing with third parties (through an API) should you be afraid of nulls. If you null-check in your own internal methods, you are in essence saying that you don't trust your future self to use the method correctly. I see that type of coding a lot from programmers fresh out of college.
edit 1: If you want to read more about it: https://stackoverflow.com/a/271874
> This to me sounds like a reasonably common problem that junior to intermediate developers tend to face at some point: they either don't know or don't trust the contracts they are participating in and defensively overcheck for nulls. Additionally, when writing their own code, they tend to rely on returning nulls to indicate something thus requiring the caller to check for nulls.
edit 2: Not really a surprise that this was an unpopular opinion, with the general Reddit demographic being so young. I really recommend just reading the link I provided; it goes into more detail about what it means to have a "contract" in programming. Also, this one might be of interest. You should not have to double-check all of your variables all the time -- it will make your code unreadable and unmaintainable. The concept is backed by leading developers, so you don't need to take my word for it. Robert Martin, author of Clean Code:
Don't Return Nulls
------------------
https://i.imgur.com/0lKknlo.png
Don't Pass Nulls
-----------------------
https://i.imgur.com/RgCNbiN.png
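A minimal Python sketch of the "don't return nulls" advice: prefer a special-case value such as an empty collection, so callers never need a null check (all names here are made up for illustration):

```python
from typing import Optional


# Instead of returning None and forcing every caller to null-check...
def find_user_email(users: dict, uid: int) -> Optional[str]:
    user = users.get(uid)
    return user["email"] if user else None  # caller must remember: if email is None: ...


# ...return a value callers can use directly. An empty list is a safe
# "nothing found" answer: loops and comprehensions over it just do nothing.
def active_emails(users: dict) -> list[str]:
    return [u["email"] for u in users.values() if u.get("active")]


users = {1: {"email": "a@example.com", "active": True},
         2: {"email": "b@example.com", "active": False}}
print(active_emails(users))  # -> ['a@example.com']
print(active_emails({}))     # -> []
```

The contract is now unconditional: the function always returns a list, so no call site needs a defensive branch.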
The speaker wrote this as a book, which was actually published in 2005.
Bought my own copy some time ago, definitely recommended!
Another good place to start: http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686
It is not just a text; it's a course that walks you through building a virtual CPU using a Hardware Description Language and a hardware emulator, and then coding Tetris on top of it.
It starts with a single NAND gate and goes on from there.
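The NAND-up construction the book opens with can be mimicked in a few lines of Python, deriving every other gate from the one primitive:

```python
# The single primitive: everything else is built from this.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1


def not_(a: int) -> int:
    return nand(a, a)


def and_(a: int, b: int) -> int:
    return not_(nand(a, b))


def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))


def xor_(a: int, b: int) -> int:
    # True exactly when the inputs differ.
    return and_(or_(a, b), nand(a, b))


inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
print([xor_(a, b) for a, b in inputs])  # -> [0, 1, 1, 0]
```

From gates like these the book works up through adders, an ALU, registers, and eventually a whole machine.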
The 86 stands for the instruction set of the CPU. Basically, every chip designed in the world accepts input and output, but in different ways (different numbers of connections, different ordering). All of those chips have more or less backwards compatibility in that regard, which makes it easier for others to develop around them.
So there is a meaning conveyed, though it probably isn't important to you if you aren't developing hardware or writing assembly.
I strongly recommend Code by Charles Petzold which explains the origins of these chipsets. Basically Intel put out the 8080 in 1974 which was an 8-bit processor, then the 8086 in 1978 was a 16-bit processor, so they just ran with the number scheme (6 for 16 bit). The "80" from 8080 probably came from IBM punchcards which were used for the US census (since the 1920s!), which is actually how IBM started, basically as the child of Herman Hollerith who built automated tabulating machines in the late 19th century. Also this is to blame for the 80-character terminal convention. Blame IBM.
Tl;dr: Code: The Hidden Language of Computer Hardware and Software is a fantastic primer for this course.
Refactoring
read it. seriously. it's a great book. and it's invaluable.
He hits on a lot of the same points in his book: http://www.amazon.com/Refactoring-Improving-Design-Existing-Code/dp/0201485672/
Worth a read, certainly had an impact on how I approach refactoring and design in general.
Since you're fuzzy on the whole "Turing Complete" concept, I wouldn't start at the toy app level. The best place for you is probably The Little Schemer with javascript transformations from http://javascript.crockford.com/little.html applied (Unless you want to get yourself an actual Scheme implementation, which would be fantastic). Also, if you work through both that, and "Structure and Interpretation of Computer Programs," you'll have super-high nerd cred.
Please don't ever separate the word javascript again. Javascript is not script-able Java. And why link a book to a youtube video? Why not Amazon? Or O'Reilly? Or even the author's web site?
I think the article is quite good, even with the syntax problem others have pointed out. It covers most of the issues common to embedded development.
For more C, I recommend this book (it's very readable):
http://www.amazon.com/Expert-Programming-Peter-van-Linden/dp/0131774298
Or you could get the 3rd edition:
http://www.amazon.com/Effective-Specific-Improve-Programs-Designs/dp/0321334876/
with 5 more ways to improve your C++
Thanks for the recommendation!
What do you think about his books on Rapid Development and Software Estimation?
Effective C++ and More Effective C++?
The two books referenced at the bottom of my article (Cormen et al.'s "Introduction to Algorithms" and Vazirani et al.'s "Algorithms") are, I think, a great place to start learning more about algorithms if you want a more rigorous and complete treatment of the subject.
First book of every aspiring politician: The Little Schemer. Complete with the republican Elephant on the cover :)
Any day!
If you read Clean Code, the author, Robert Martin, goes so far as to suggest wrapping ALL third-party libraries (see CSharpSharp). This is so that, should you choose to switch libraries, you only have to change your wrapper and not every place you called that library throughout your entire application. By the time you've written wrappers around this trivial library, you might as well have written it yourself.
This code is provided without warranty, as per the GPL. Should the project lose support, or a rogue developer make a check-in, your only choice is to fix it yourself. At that point you might as well have rolled your own and open-sourced it. (Note: even if you fix the OSS library, if the project creator has abandoned it and you don't have write permission to its source control, it's as if you wrote it yourself.)
EDIT: Spelling and punctuation.
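A hedged Python sketch of the wrapping idea (the vendor client here is entirely made up): the application codes against its own thin gateway, so swapping the third-party library means changing one class, not every call site.

```python
# Hypothetical third-party client we don't control.
class VendorHttpClient:
    def fetch_json(self, url: str) -> dict:
        # A real client would make a network request; this stands in for one.
        return {"url": url, "status": "ok"}


# The wrapper: the rest of the codebase imports only this. Swapping vendors
# later means rewriting this one class against the new library's API.
class HttpGateway:
    def __init__(self, client=None):
        self._client = client or VendorHttpClient()

    def get(self, url: str) -> dict:
        return self._client.fetch_json(url)


gateway = HttpGateway()
print(gateway.get("https://example.com")["status"])  # -> ok
```

As a side benefit, the wrapper is also an injection seam: tests can pass a fake client instead of the vendor's.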
>We are talking about a high-level language compiler, remember?
I still consider C a high-level language. Some people don't, for various reasons.
>You were complaining that it compiles to C rather than emit instructions.
You simply read/took my post wrong. Nowhere am I complaining. Merely making an observation. It is not an unusual feat for a compiler to generate assembly instructions or machine code. Nor would I call it super difficult to write a compiler, but rather straightforward.
>If you are going to emit instructions, it's up to you to write your own optimizer.
Or buy/obtain a compiler that already is capable of doing this step.
I believe this is a pretty good book for legacy / maintenance systems:
https://www.amazon.com/Working-Effectively-Legacy-Michael-Feathers/dp/0131177052
Peter van der Linden's Expert C Programming (Google Books, Safari) is a good resource on this question, though it's slightly dated and undoubtedly not the whole story.
The "Gang of Four" design patterns book http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/ref=sr_1_1?ie=UTF8&amp;s=books&amp;qid=1251223171&amp;sr=8-1 is the one that started it all. There are code samples in C++.
I've also heard good things about the "Head First Design Patterns" book.
Introduction to Algorithms by Cormen, Leiserson, Rivest and Stein
ISBN 978-0-262-03384-8
One of the few truely good Computer Science books.
http://www.amazon.co.uk/Introduction-Algorithms-Thomas-H-Cormen/dp/0262033844/ref=sr_1_1?ie=UTF8&amp;qid=1302640399&amp;sr=8-1
Here is a good resource to learn/improve your C++ skills.
JavaScript: The Good Parts
Seriously. It's just fantastic. Also in that same vein (though not as "hey, make this entire hardware/software stack"), is Code by Charles Petzold.