Reddit reviews How Google Tests Software

We found 11 Reddit comments about How Google Tests Software. Here are the top ones, ranked by their Reddit score.

How Google Tests Software (Addison-Wesley Professional)

11 Reddit comments about How Google Tests Software:

u/infinite · 73 points · r/technology
  1. Diss former employer while writing a book about what you learned there.


    He should write another book, "How to Burn Bridges and Be a Corporate Sycophant."
u/anamorphism · 5 points · r/learnjavascript

the programming language isn't as important as the methodology. the wikipedia article on test automation is actually a pretty decent place to start: https://en.wikipedia.org/wiki/Test_automation

google and microsoft both used to have dedicated roles for these types of folks: SET/SDET software engineers/developers in test. i'm not sure if they still do or if it's more that every software engineer they hire needs to also have those skills. you basically need to be a capable software engineer while also knowing things about software testing.

folks at google wrote a book that was a decent read: https://www.amazon.com/Google-Tests-Software-James-Whittaker/dp/0321803027

i would say the best place to start is to make sure you're writing decent unit tests for your c# code at work. get familiar with continuous integration and deployment systems and start thinking more about higher levels of testing.
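
for a concrete sense of what "decent unit tests" look like, here's a minimal junit 5 sketch in java (the same ideas carry over to c# with nunit or xunit; the ShoppingCart class is invented and inlined so the sketch is self-contained):

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import java.util.ArrayList;
import java.util.List;

import org.junit.jupiter.api.Test;

class ShoppingCartTest {

    // invented class under test, inlined to keep the sketch self-contained
    static class ShoppingCart {
        private final List<Double> prices = new ArrayList<>();

        void addItem(String name, double price) {
            if (price < 0) {
                throw new IllegalArgumentException("negative price: " + price);
            }
            prices.add(price);
        }

        double total() {
            return prices.stream().mapToDouble(Double::doubleValue).sum();
        }
    }

    @Test
    void totalSumsItemPrices() { // happy path
        ShoppingCart cart = new ShoppingCart();
        cart.addItem("book", 12.50);
        cart.addItem("pen", 1.25);
        assertEquals(13.75, cart.total(), 0.001);
    }

    @Test
    void emptyCartTotalsToZero() { // boundary case
        assertEquals(0.0, new ShoppingCart().total(), 0.001);
    }

    @Test
    void negativePriceIsRejected() { // invalid input should fail loudly
        ShoppingCart cart = new ShoppingCart();
        assertThrows(IllegalArgumentException.class,
                () -> cart.addItem("book", -1.0));
    }
}
```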

u/nazone · 3 points · r/selenium


The industry is just picking up steam with the whole Selenium, QTP, Watir stuff, so there are a lot of opportunities out there for these specific technologies. However, I wouldn't bet the house that you can make 100k just being an expert on Selenium forever. It's going to become more mainstream and a normal part of everybody's development lifecycle, which means every developer will know it and nobody will need a specialist who can't bring anything else to the table. A better title for the job you probably want is Software Engineer in Test, which is what Google calls it. These jobs are going to be around for the long haul: people who know how to develop, but approach problems from a quality perspective.

Back to your questions:

  • I have a non-technical degree (BA in Econ). How will this hinder my prospects, or is it a plus that I'm self-taught?

    Some companies will cut you off right away without a CS or related degree. Others will require you to have a degree in something, but be prepared to back up your talk with some real code (read: GitHub). Others (like Google) don't care if you have a degree at all, as long as you can put your money where your mouth is.

  • I don't have any experience working with other programmers, so I'm unfamiliar with enterprise standards. What are some more skills I should teach myself that I couldn't just pick up on the fly at a new job? For example, most of my coding is done in Notepad++ or Geany depending on the OS... I'm guessing this is not very "professional."

    This is a problem. You need to join a community (meetup.com) or find a friend who can show you the ropes, so to speak. Development in a professional, large-scale environment is massively different from writing a few Selenium scripts with Python. To get in the door, be prepared to have at least a general understanding of source control, agile development, coding standards, continuous integration, unit testing, integration testing, and test strategies.

  • How beneficial would it be to teach myself Java, or can I be employable with just python? (I really love python). Would the time required to learn Java be better employed elsewhere (lettuce, mobile testing, _____)?

    Very. Learn Java. Learn Ruby. Learn Go. Learn JavaScript (jQuery or AngularJS). You are going to make yourself 100x more marketable by showing that you love programming, not just Python. It's okay to have a favorite language, but you need to show that you're capable of being fluent in any language. Study the elements of every language, and picking up the basics of a new one will just be a matter of learning a few new rules.

  • I have started to get more active on /r/python and stackoverflow, but what are some other things I can do to boost my CV? My employer would probably not appreciate me sharing the code I've written. People always say to get involved with an open-source project, but testing seems like something that you would not really find in a random github repo.

    Start your own repo on GitHub and start making something that makes your life easier. Start small and don't try to create the next Facebook. Set a goal for yourself to commit code every day for 6 months. Everything else will follow. You'll inevitably create a problem for yourself that will lead you down a path to projects you would like to contribute to.

  • Can you link me to any good code examples of automation in action that would demonstrate how a company might do testing, on a macro level? Like, I can easily write methods to do stuff with selenium, but how do I compose the smaller test pieces into the bigger framework?

    Check out Thucydides. It's Java, but it's a great framework for learning how to write tests in a controlled manner for a large project. Pay careful attention to ideas like Page Objects and BDD (there's a quick Page Object sketch just below), and try to come up with an answer to why those paradigms exist and when they wouldn't be useful.
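
    To make the Page Object idea concrete before you dive into Thucydides, here's a bare-bones sketch using Selenium WebDriver's Java API (the site, URL, and element ids are all made up for illustration):

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Page Objects wrap a page's locators and actions so tests never touch
// raw selectors; if the HTML changes, only the page object changes.
class LoginPage {
    private final WebDriver driver;

    LoginPage(WebDriver driver) {
        this.driver = driver;
    }

    LoginPage open() {
        driver.get("https://example.com/login"); // hypothetical URL
        return this;
    }

    HomePage loginAs(String user, String password) {
        driver.findElement(By.id("username")).sendKeys(user);
        driver.findElement(By.id("password")).sendKeys(password);
        driver.findElement(By.id("submit")).click();
        return new HomePage(driver); // navigating returns the next page object
    }
}

class HomePage {
    private final WebDriver driver;

    HomePage(WebDriver driver) {
        this.driver = driver;
    }

    String welcomeText() {
        return driver.findElement(By.id("welcome")).getText();
    }
}
```

    A test then reads like a sentence, new LoginPage(driver).open().loginAs("bob", "hunter2").welcomeText(), and when the markup changes, only the page objects have to change.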

  • Could you describe a day in the life of a QA Automation Engineer?

    Really depends, but pretty much the same as a developer. Coffee - checkout code - cry a little - write code - cry a little - write code - commit - scrum meeting - coffee - write code - meeting on how to meet unreachable deadline - write code - cry a little - commit - profit$$

  • Salaries I see online range 80-110k. Is this accurate? I was honestly surprised by the salary -- do you think this level of pay will be around for the long-term?

    Yes, you can make a lot of money right now by calling yourself a QA automation engineer; I've seen upwards of 150k. It won't last. If you want real money and real work, then think of yourself as a developer who is really good at testing, not as an automation engineer who knows a little bit about development.

  • What are some questions I might expect in an interview?

    Take a look at How Google Tests Software. It has a whole section about their interview process.


    Also, a big piece of advice when starting with automation: you can't automate everything. I'll leave the gateway site here to a rabbit hole of really, really good testing material.

    (I have no affiliation to this blog other than being a fan)

    http://www.satisfice.com/blog/

u/tech_tuna · 2 points · r/QualityAssurance

My feeling is that the more you can condense and summarize your test plans in this context, the better. I'd even argue that most companies handle test plan management poorly.

I've read two testing books recently that discuss this in a way that I find palatable and sensible:

http://www.amazon.com/Explore-It-Increase-Confidence-Exploratory/dp/1937785025

http://www.amazon.com/Google-Tests-Software-James-Whittaker/dp/0321803027

In both books, the authors argue that it's better to create high-level test summaries/strategies than fully expanded and granular lists of EVERY SINGLE test case.

I completely agree with this sentiment; however, the problem I have with Explore It is that the author proposes the term "Test Charter" for this type of test plan. I don't like how that sounds, personally, and I'm both hesitant and skeptical about adding new QA-specific lingo into the mix, i.e. using new terms to describe testing to others (non-testers). I am not a fan of the terms Exploratory Testing and Context Driven Testing, and I also have a problem with the whole testing vs. checking debate brought on by Michael Bolton and James Bach.

Overall, I like the Explore It book much more than the Google one. I found the Google book to be lacking in details about the "magic" of Google's testing processes. Also, it should be noted that James Whittaker left Google (for Microsoft!) a few months after that book was published.

Anyway, the core problem is that no one wants to sit through a review of a gigantic spreadsheet, or whatever tabular format that you are almost guaranteed to be using. It's incredibly boring.

Furthermore, you will have a very difficult time getting a developer to review a test plan that looks like this. Visually scanning a long list of test cases is anathema to most developers; trust me, I've been there and done it. This isn't entirely bad, either: any good developer naturally despises repetition and inefficiency, which is exactly the problem here. Reviewing a long list of test cases isn't an efficient group activity.

Also, the tool/format matters a lot too. I've used a bunch of different tools to manage test plans; my current favorite is TestRail. It's not perfect, but it's much more pleasant than anything else I've used in the past (NOTE: I have nothing to do with the company that makes TestRail, but I used it at my last job and we use it at my current company).

tl;dr Ask people to review a test plan summary. You may want to call out specific risks and challenges; just don't ask people to read through a list of 500 rows in a table somewhere. In the setting of a meeting, a high-level presentation (PowerPoint or whatever) might be a good starting point, followed by a Q&A and brainstorming session.

I could go on but I feel like test plan management is yet another aspect of testing software that everyone seems to disagree about. :)

u/tubilol · 2 points · r/europe
u/Chibraltar_ · 2 points · r/france

Well, if you like books, there are several cool ones:

DevOps is a classic,

and I liked How Google Tests Software, which shows the software suite Google built to automate software testing on their side.

And on your end, what's your mandate or objective?

u/shemp420 · 2 points · r/Android

I just started reading a book by this guy on Safari Books.

How Google Tests Software
By: James A. Whittaker; Jason Arbon; Jeff Carollo

u/LabelUnable · 1 point · r/ProgrammerHumor

The idea of QA as a non-engineering discipline is pretty old school, and comes from the days when almost all testing was black-box manual testing.

Other than some specific industries, QA is full of engineers.

Look at SDETs, SETs, QEs, SQEs, TEs, and SEQs. Check out Google, Microsoft, or really any of the major tier-one software companies. Even many games companies are coming around to the benefits of SEs in QA.

In most places I have worked, QA isn't a separate org anymore; instead it's part of dev (or at least a general software engineering org). There are often still QA managers and leads around to help design and implement the STLC, and to direct technical efforts related to internal quality-focused software development.

For many SEs, the QA track has become an equivalent, and reasonably attractive, option over the years.

You can think of it like this. You have Software Engineers, and they have different specializations. Software Engineer, Quality is just a specialization, but no less an SE.

I understand if that isn’t the way things work where you work, but technical expectations have really shifted for QA in most industries.

Edit: People should check out How Google Tests Software - https://www.amazon.com/dp/0321803027/ref=cm_sw_r_cp_api_i_QB4LDbA6GGRWZ

Edit2: One issue currently is that, because so many companies have come around to the idea of QA-focused SEs while some industries have traditionally been prejudiced against the concept of QA engineering, the market for SDETs is tight as a drum.
In fact, it can be a great track for SEs who think they would enjoy focusing on quality, tools, and automation as disciplines. It isn't all that unusual to make more money, and have more bargaining power, than a feature dev of similar skill.

u/LieutenantKumar · 0 points · r/practicemodding

...continued...

> Test plans - When you apply for QA roles, you'll almost certainly be asked "how would you test ____?". The correct answer is to be methodical. Don't just spew out a stream of test cases as you brainstorm them. Understand the different scopes (unit, functional, integration, maybe end-to-end), what the goals of each are, and how they differ. Understand that there are different areas of testing like boundary, happy path, special cases (null, " ", 0, -1), exceptions, localization, security, deployment/rollback, code coverage, user-acceptance, a/b, black box vs white box, load/performance/stress/scalability, resiliency, etc. Test various attributes at the intersection of a component and a capability (borrowed from the book How Google Tests Software); I believe there's a video that goes into this called The 10 Minute Test Plan. Understand how tests fit into your branching strategy - when to run BVTs vs integration vs regression tests.
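
A quick sketch of what boundary and special-case coverage can look like in code (JUnit 5 parameterized tests; the quantity rule here is invented for illustration):

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;

class QuantityValidatorTest {

    // invented rule under test: a quantity is valid if it's 1..100 inclusive
    static boolean isValidQuantity(int qty) {
        return qty >= 1 && qty <= 100;
    }

    // one test body, many interesting inputs: boundaries and near-boundaries
    @ParameterizedTest
    @CsvSource({
            "1, true",    // lower boundary
            "100, true",  // upper boundary
            "0, false",   // just below the lower boundary
            "101, false", // just above the upper boundary
            "-1, false"   // negative special case
    })
    void quantityBoundaries(int qty, boolean expected) {
        assertEquals(expected, isValidQuantity(qty));
    }
}
```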

> Test methodologies - Understand the tools that make you an efficient tester. These include data-driven tests, oracles, all-pairs / equivalence classes, mocking & injection, profiling, debugging, logging, model-based testing, emulators, harnesses (like JUnit), fuzzing, dependency injection, etc.
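
As a small illustration of the "mocking & injection" entry, here's a hedged Mockito sketch (PaymentService and CardGateway are invented names): because the dependency is injected, the test can control its behavior without touching a real network.

```java
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;

class PaymentServiceTest {

    // invented collaborator that would normally hit the network
    interface CardGateway {
        boolean charge(String cardToken, long cents);
    }

    // invented class under test; the gateway is injected, so a test
    // can hand it a mock instead of the real thing
    static class PaymentService {
        private final CardGateway gateway;

        PaymentService(CardGateway gateway) {
            this.gateway = gateway;
        }

        boolean pay(String cardToken, long cents) {
            if (cents <= 0) {
                return false; // guard clause: nothing to charge
            }
            return gateway.charge(cardToken, cents);
        }
    }

    @Test
    void successfulChargeGoesThrough() {
        CardGateway gateway = mock(CardGateway.class);
        when(gateway.charge("tok_123", 500L)).thenReturn(true);

        assertTrue(new PaymentService(gateway).pay("tok_123", 500L));
    }

    @Test
    void zeroAmountNeverHitsTheGateway() {
        // unstubbed mock methods return false by default, and the guard
        // clause should short-circuit before the gateway is ever called
        CardGateway gateway = mock(CardGateway.class);
        assertFalse(new PaymentService(gateway).pay("tok_123", 0L));
    }
}
```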

> Test frameworks - Knowing all the tests you need to write is good, but then you have to write them. Don't write them all from scratch. Think of it as a system that needs to be architected so that test cases are simple to write and tests for new functionality are easy to add. I can't recommend any books for this because it's something I learned from my peers.
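
One tiny sketch of that architecture idea (hypothetical names, Selenium plus JUnit 5): push the boilerplate into a base class so each new test costs a few lines instead of a page of setup.

```java
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

// the base class owns the boilerplate (browser lifecycle here; a real
// harness might add logging, retries, screenshots-on-failure, etc.)
abstract class WebTestBase {
    protected WebDriver driver;

    @BeforeEach
    void startBrowser() {
        driver = new ChromeDriver(); // assumes chromedriver is installed
    }

    @AfterEach
    void stopBrowser() {
        if (driver != null) {
            driver.quit();
        }
    }
}

// concrete tests contain only test logic
class LoginSmokeTest extends WebTestBase {
    @Test
    void loginPageLoads() {
        driver.get("https://example.com/login"); // hypothetical URL
    }
}
```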

> Test tools - Selenium / WebDriver for web UI, Fiddler for web services (or sites), JUnit/TestNG, JMeter (I have to admit, I don't know this one), integration tools like Jenkins, GitHub/Stash, git/svn.

> System design - As you're entry-level, this may not be a huge focus in an interview, but know how to sensibly design a system. Know which classes should be used and how they interact with each other. Keep in mind that the system may evolve in the future.

> Whiteboarding - Practice solving problems on a whiteboard. The process is more than just writing the solution, though. This is the process I follow (based loosely on the book Programming Interviews Exposed):

  • Clarify the problem - resolve any ambiguities, determine behaviors for special cases (throw an exception vs return null?). Look for gotchas (like if you're doing some string manipulation with overlaps)
  • Give a couple test cases to demonstrate your understanding of the problem, to make you think of other special cases, and because they want someone who's test-focused if you go into QA. Give a happy path scenario and a couple negative or special cases
  • Propose a solution - do this verbally, and give its runtime complexity (and less importantly, its memory usage). If the runtime complexity is bad (polynomial, exponential), then say so and think of a better solution (there will almost certainly be one)
  • Implement the solution - verbalize your thought process while doing so. If you don't know something, say so. The interviewer will likely help you out without penalty. Listen very carefully for clues, because the interviewer will be giving them. Really understand everything the interviewer says, and understand his motivation for saying it. If you see potential bugs, say so ("I want to be careful that I don't go out-of-bounds in the last iteration of this loop").
  • Debug the solution - walk through it as if you're a debugger, using the happy path test case that you made earlier. Oftentimes, the interviewer will give you a test case with the problem. Use it - he probably selected it for a reason (the numbers are in an interesting order that will find the most bugs, for example).
  • Test the solution - Add to the handful of tests you gave earlier. Think about the different types of tests, and if they apply.

    Resources:

    > Learning to test:

  • How Google Tests Software
  • Guice, and another
  • Google Test Automation Conference
  • Netflix's Simian Army
  • Google Testing Blog
  • Hermetic testing
  • The Art of Software Testing (I've only skimmed it)

    > Learning to interview:

  • Programming Interviews Exposed
  • Programming Pearls

    > Learning to program:

  • Design Patterns (I'm embarrassed that I don't have more recommendations for this...)

    > Miscellaneous

  • Meetup
  • Inventing on Principle

    > What sort of skills should I really hone? I realize I gave you a ton of stuff in this post, so here's a shorter list:

  1. Read How Google Tests Software
  2. Understand dependency injection (see the sketch after this list)
  3. Understand unit, functional (use hermetic environments), and integration testing
  4. Understand mocking (Mockito's a good one for Java)
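
A compact sketch tying items 2 and 4 together (Guice for the injection; every class name here is invented): code that receives its dependencies instead of constructing them is trivially testable.

```java
import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Inject;
import com.google.inject.Injector;

// production code depends on an interface...
interface Clock {
    long nowMillis();
}

class SystemClock implements Clock {
    public long nowMillis() {
        return System.currentTimeMillis();
    }
}

class SessionChecker {
    private final Clock clock;

    @Inject
    SessionChecker(Clock clock) {
        this.clock = clock; // injected, never new'd inside the class
    }

    boolean isExpired(long startedAtMillis, long ttlMillis) {
        return clock.nowMillis() - startedAtMillis > ttlMillis;
    }
}

// ...and a Guice module wires in the real implementation for production
class ProdModule extends AbstractModule {
    @Override
    protected void configure() {
        bind(Clock.class).to(SystemClock.class);
    }
}

class Main {
    public static void main(String[] args) {
        Injector injector = Guice.createInjector(new ProdModule());
        SessionChecker checker = injector.getInstance(SessionChecker.class);
        System.out.println(checker.isExpired(0L, 1000L)); // prints true
    }
}
```

Because SessionChecker never constructs its own clock, a test can build it directly with a fixed fake (new SessionChecker(() -> 42L), since Clock has a single method) or with a Mockito mock, and assert expiry deterministically.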

    > Examples of projects that make you look valuable

  • Refactoring product code to be Guice-friendly
  • Tool to profile method calls simply by adding annotations
  • Tool to automate bug filing/updating/closing - assign to the right person, re-activate when they repro, give good steps, close when they're fixed and don't repro
  • Tool to automatically quarantine flaky tests that aren't caused by product bugs
  • Aggregation of distributed logs into central, indexed location (I didn't write the solution, just did the work to integrate an existing one (Logstash/Kibana))
  • Automatically display the picture of the team member who checks in code with the highest coverage (I didn't do this, just something cool I read about)
  • Tool that logs messages with contextual information, so for example you can see all messages associated with user 123
  • Tool that captures inter-server traffic, associated with the user-request
  • Tool that provides metadata about test cases in your web proxy