Top products from r/deeplearning

We found 18 product mentions on r/deeplearning and ranked the 15 resulting products by the number of redditors who mentioned them. Here are the top results.


Top comments that mention products on r/deeplearning:

u/x_vanq · 3 pointsr/deeplearning

If you would state what you will do with your laptop, it would be easier to give advice; a "small project" can mean different things to different people.

To answer your question about "...or would I be better off just getting..." — that's hard to answer. If a laptop is the only device you are willing to buy, then I would suggest the Nvidia Quadro RTX 5000, given that you have the money; otherwise I would not buy any of the options you mentioned.

Okay, that said, I would never suggest using your laptop as the main training device for your deep learning projects, and at the $4,000 Razer is asking? Hell no!

My recommendation is to get a stationary computer and a laptop. Personally, for my "small projects" I use my laptop, which has a 1050 Ti — more than enough — but for my "bigger projects" I have a stationary machine with a 1080 Ti.

So I would choose a computer like this: https://pcpartpicker.com/list/xWQRJb

You still need to choose a case and make sure the parts are compatible; I put this list together quickly. One reason I chose a blower-style graphics card: after a couple of projects you will want either more VRAM or to train two models at the same time, and blower cards take two slots rather than the three that most cards take, so you can fit two of them.

Another reason to get a separate workstation for your bigger projects (in Ubuntu) is that when your graphics card is at 100%, you can't really do anything else on that computer. If you have a laptop, you can do your school work there.

The laptop; https://www.amazon.com/ASUS-IPS-Type-R7-3750H-GeForce-FX505DV-PB74/dp/B07SXTDL1S/ref=sr_1_1?keywords=ASUS+TUF+Gaming+FX505DV&qid=1572156915&sr=8-1

Those two together will cost you almost $3,842, which is cheaper than the Razer — and that's before the case.

If you just want a laptop, I would still consider my suggestion over the Razer.
And to answer your question: yes, the Nvidia Quadro RTX 5000 GPU is beneficial for deep learning and is better than the NVIDIA GeForce RTX 2080 with Max-Q Design.

edit: made the answer clearer and answered the question better.

u/SupportVectorMachine · 9 pointsr/deeplearning

Not OP, but among those he listed, I think Chollet's book is the best combination of practical, code-based content and genuinely valuable insights from a practitioner. Its examples are all in the Keras framework, which Chollet developed as a high-level API to sit on top of a number of possible DL libraries. But with TensorFlow 2.0, the Keras API is now fundamental to how you would write code in this pretty dominant framework. It's also a very well-written book.

Ordinarily, I resist books that are too focused on one framework over another. I'd never personally want a DL book in Java, for instance. But I think Chollet's book is good enough to recommend regardless of the platform you intend to use, although it will certainly be more immediately useful if you are working with tf.Keras.
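For context on why the Keras API matters in TensorFlow 2.0, here is a minimal sketch of what tf.keras code looks like (the layer sizes are arbitrary assumptions for illustration, not taken from Chollet's book):

```python
import tensorflow as tf

# A tiny MNIST-style classifier built with the Keras Sequential API,
# which ships as tf.keras inside TensorFlow 2.x.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(784,)),           # flattened 28x28 image
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```

The same high-level style used to sit on top of several backends; in TF 2.x it is simply the front door to the framework.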

u/Atralb · 1 pointr/deeplearning

Thanks a lot for this extensive answer :). I will read all of it tomorrow when I have time.

  1. I'm in Europe, so my pricing may differ from yours. First, I'm confused because on my Amazon there is this one at 2200€ and this one at 280€. Is that because the first one is V3 and the latter is V2? How can there be such a price difference? Then there is this one on a second-hand website for my area (it's in French, but you can understand the specs). This is the kind of thing that really confuses me about Intel CPUs: almost the exact same model reference and three completely different prices. I guess it's just V1, V2, and V3, but it's hard to navigate their hundreds of different references.

  2. So you're telling me 1000 MHz RAM is enough for DL? Wow, I was thinking of buying 3000 MHz sticks.

  3. Maybe I should have mentioned that I don't strictly want a DL server with only DL experiments running on it. I want to build my personal workstation within my budget, with a priority on deep learning, but I also want it capable as a private server, for multitasking, and even a bit of video game development and video editing (music production mainly). Would the Xeon V2 still be strong enough for this?

    A simpler and maybe more useful question: what complete rig would you build with a $2k budget and the expectations I mentioned? I completely understand if you don't have time for this :), just in case.

u/songanddanceman · 1 pointr/deeplearning

Your comment was really helpful, and I appreciate you elaborating on some points, because it helps me learn.

I am sorry about your 1950X experience. I see your point about not being able to avoid MKL.

Thank you also for commenting on the motherboard and letting me know about the lack of space in the 4th slot. That saved me a lot of trouble. I think I'll opt for a server-style case that separates the motherboard from the GPUs and use PCIe risers to connect them. Maybe something like this or this.

I tried doing some follow-up research on your comment about the MIT researcher switching to blower-style GPUs. I found this article, which might be the one you referenced. I'm thinking the server-style case for 8 GPUs might also help create separation between 4 GPUs.

u/video_descriptionbot · 1 pointr/deeplearning

SECTION | CONTENT
:--|:--
Title | How to Train Your Models in the Cloud
Description | Let's discuss whether you should train your models locally or in the cloud. I'll go through several dedicated GPU options, then compare three cloud options; AWS, Google Cloud, and FloydHub. I was not endorsed by anyone for this. Code for this video: https://github.com/floydhub/fast-style-transfer Please Subscribe! And like. And comment. That's what keeps me going. High Budget GPU: Titan XP https://www.amazon.com/NVIDIA-GeForce-Pascal-GDDR5X-900-1G611-2500-000/dp/B01JLKP3IS Medium Budget GPU...
Length | 0:09:22


u/elliot_o_brien · 2 pointsr/deeplearning

Read https://www.amazon.in/Reinforcement-Learning-Introduction-Richard-Sutton/dp/0262193981.
It's a great book for beginners in reinforcement learning.
If you prefer lectures, watch DeepMind's reinforcement learning lectures by David Silver.
School of AI's Move 37 course is also good.

u/omnihaand · 2 pointsr/deeplearning

I'm the manager of IT and Cloud Services for an AI startup, so I get it. TF, Torch, Keras, Horovod...

Are you planning to do parallel training? Did you get a big enough PSU for four cards? (Typically 1600 W or more.) Does the motherboard support 4 GPUs in x16 mode?
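As a rough sanity check on the PSU figure, a back-of-the-envelope calculation might look like the sketch below. The wattage numbers are illustrative assumptions, not measured values; check your actual parts' specifications before buying.

```python
# Back-of-the-envelope PSU sizing for a 4-GPU workstation.
# All wattages are rough illustrative assumptions.
gpu_watts = 250      # per-card board power (a high-end consumer card)
num_gpus = 4
cpu_watts = 180      # HEDT/server CPU under load
other_watts = 100    # motherboard, RAM, drives, fans
headroom = 1.25      # ~25% margin so the PSU isn't run at its limit

total = (gpu_watts * num_gpus + cpu_watts + other_watts) * headroom
print(f"Recommended PSU: at least {total:.0f} W")  # Recommended PSU: at least 1600 W
```

With these assumptions the result lands right at the ~1600 W rule of thumb; swap in the board power of whatever cards you actually plan to run.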

If you're sure you want 4 cards at some point, check out AMD's Epyc chips and the ASRock epycd8 motherboard.

Asrock Rack Server Motherboard EPYCD8-2T SP3 Socket EPYC CPU https://www.amazon.com/dp/B07PGLF6ZB/ref=cm_sw_r_cp_apa_i_PhVrDbJS2NMGF

u/ztasre · 2 pointsr/deeplearning

This is a good resource:

https://www.amazon.ca/Numerical-Algorithms-Computer-Learning-Graphics/dp/1482251884

Linear Algebra by Shilov is also good, but a little hard.

Also, please never write the words "Linear Algebraic Calculus" ever again.

u/SomeConcernedDude · 3 pointsr/deeplearning

You could go with a refurbished Dell Optiplex and this small form factor 1050 Ti:

https://www.amazon.com/dp/B06XHZ29N5/ref=twister_B01M3U7DDB?_encoding=UTF8&psc=1

I have an Optiplex 7040, and the Zotac 1050 Ti small-form-factor card fits in the PCIe x4 slot. It's an x16 card, but it still works. However, I notice that its speed outperforms the CPU mostly in single-precision computing, not double precision. From what I've read, though, single precision is not too critical for deep learning applications, and it lets you fit more data in the card's memory.

u/linklater2012 · 7 pointsr/deeplearning

I think the guaranteed admission to the other programs is the real seller here, not the content. And they know it.

I also found the guy obnoxious, and his tendency to call himself a "YouTube star" is a red flag to me. He gives me the impression of tending toward fluff. Reviews of his book:
https://www.amazon.com/Decentralized-Applications-Harnessing-Blockchain-Technology/dp/1491924543

Despite all that, I am still considering enrolling because of the guaranteed admission to the other programs. Now whether those other programs are any good, I don't know.

Part of me is really skeptical that they can produce a job-ready autonomous vehicle engineer in a year. Perhaps a sort of technician at best.

u/kailashahirwar12 · 4 pointsr/deeplearning

As far as I know, there is no MOOC specifically designed for GANs. There are several books on generative adversarial networks, like "Learning Generative Adversarial Networks" (https://www.amazon.in/dp/1788396413/ref=cm_sw_r_cp_apa_i_iLdRCbMMMRD60)
and "Generative Adversarial Networks Cookbook: Over 100 recipes to build generative models using Python, TensorFlow, and Keras" (https://www.amazon.in/dp/1789139902/ref=cm_sw_r_cp_apa_i_SLdRCbK116413).
Recently, I released a book of GAN projects. It is available as "Generative Adversarial Networks Projects: Build next-generation generative models using TensorFlow and Keras" (https://www.amazon.in/dp/B07F2MY1QH/ref=cm_sw_r_cp_apa_i_TMdRCb5X4375D). If you go through the book, let me know your feedback.