Siraj Raval
Deep Learning is the most exciting subfield of Artificial Intelligence, yet the necessary hardware costs keep many people from participating in its research and development. I wanted to see just how cheaply a deep learning PC could be built in 2020, so I did some research and put together a build of brand new parts that comes out to about 450 US dollars. I chose NewEgg for the parts because it has a global shipping policy; deep learning belongs to the world, not just the United States. In this episode, I'm going to walk you through what the deep learning stack looks like (CUDA, Jupyter, PyTorch, etc.), why I chose the various hardware components, and then I'll show you how to set up the full deep learning software stack on your PC. Enjoy!
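Once everything is installed, a minimal smoke test like the sketch below (assuming a CUDA-enabled PyTorch build; output will vary per machine) is a quick way to confirm the whole stack is wired up:

# Minimal smoke test for the driver + CUDA + PyTorch stack.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())

if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
    # A small matrix multiply on the GPU confirms the driver, the CUDA
    # runtime, and the PyTorch build all agree with each other.
    x = torch.randn(1024, 1024, device="cuda")
    y = x @ x
    print("GPU matmul OK, result shape:", tuple(y.shape))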
TWITTER: https://bit.ly/2OHYLbB
WEBSITE: https://bit.ly/2OoVPQF
INSTAGRAM: https://bit.ly/312pLUb
FACEBOOK: https://bit.ly/2OqOhx1
Please subscribe for more educational videos! It means a lot to me.
DIY Deep Learning PC parts list (about $450):
——————————————————-
GPU (GTX 1650): https://bit.ly/31Jb4Hu
Motherboard (MSI A320M ): https://bit.ly/2uyCaop
Hard Drive (Seagate Firecuda 1TB): https://bit.ly/2tKxUSi
RAM (SK Hynix 8 GB): https://bit.ly/2UMpWD8
Power Supply (Corsair 450W): https://bit.ly/2w74yOT
CPU (AMD Ryzen 3 series, 4-core, 3.1 GHz): https://bit.ly/31HwiFl
PC Case (2 fans built-in): https://bit.ly/39p7IMo
——————————————————–
Note: each part's price fluctuates by about +/- 10 dollars.
The ABS $600 pre-built PC:
https://bit.ly/2OJjcCh
PyTorch’s Image Classifier Example:
https://pytorch.org/tutorials/beginner/blitz/cifar10_tutorial.html
Linus Tech Tips POV PC Build Guide:
https://www.youtube.com/watch?v=v7MYOpFONCU
Instructables PC Build Guide:
https://www.instructables.com/id/Build-a-Gaming-Computer/
Nvidia’s CUDA Documentation:
https://docs.nvidia.com/
Docker:
http://docker.com/
Petronetto’s Deep Learning Docker Image:
https://github.com/petronetto/docker-python-deep-learning
Another Deep Learning Docker Image:
https://github.com/NVAITC/ai-lab
Are you a total beginner to machine learning? Watch this:
https://www.youtube.com/watch?v=Cr6VqTRO1v0
Learn Python:
https://www.youtube.com/watch?v=T5pRlIbr6gg
Live C Programming:
https://www.youtube.com/watch?v=giF8XoPTMFg
CUDA Explained:
https://www.youtube.com/watch?v=1cHx1baKqq0
Hit the Join button above to sign up to become a member of my channel for access to exclusive live streams!
Sign up for my newsletter for exciting updates in the field of AI:
https://goo.gl/FZzJ5w
Can’t afford a PC right now? That’s OK, use Google Colab for a free cloud GPU:
https://colab.research.google.com/
Credits:
Nvidia team
PyTorch team
Image/GIF assets are from across the Web; I take no credit for them
(except some memes)
Comedy Central (“Nathan for you” clip)
And please support me on Patreon:
https://www.patreon.com/user?u=3191693
The content quality really did go down after recent events. However, it is good to see that you now properly reference other sources and other YouTubers. Well done, Siraj!
C RULES !!!!! o.o/
I've just ordered an AMD Ryzen 2700X eight-core, 2TB, 16GB RAM, with an RTX 2070 8GB for my final-year deep learning project at university. Is this a fast enough PC for face recognition with 12,000 photos? I'm new to deep learning as I'm studying electrical engineering.
The main advantage of the 1650, despite not having Tensor Cores, is that it comes in a low-profile form factor, meaning it can fit in small desktop builds!
> torch.cuda.current_device()
"AssertionError: Torch not compiled with CUDA enabled"
Me: [not working intensifies]
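That assertion usually means a CPU-only PyTorch build got installed rather than a CUDA-enabled one. A quick sketch for checking, plus a device fallback so the same script runs either way:

import torch

# torch.version.cuda is None on CPU-only builds, which is what triggers
# "Torch not compiled with CUDA enabled" when CUDA calls are made.
print("Built against CUDA:", torch.version.cuda)

# Device-agnostic fallback: the same code runs on GPU if present, else CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
batch = torch.randn(8, 3, 32, 32).to(device)
print("Running on:", device)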
Hello Siraj, have you checked out the X79 Chinese motherboards that can run multi-core, multi-thread Xeon processors and use DDR3 RAM as a platform for deep learning? I just ordered one such kit for 96 USD; I just need to add a video card, HD, and power supply.
Thanks Siraj, your channel is so useful.
Isn't it much easier to use Anaconda to install the GPU-enabled software you need? That is how I do it on Windows, and it is way less work than installing CUDA, cuDNN, etc. yourself.
The GTX 1060 is better than the 1650
Outstanding
Good to see you are raising man. Keep it up
Am I the only one who came here just to view the comments 😂😂
Thank god, the title doesn't say "Cheapest Deep Learning Course in 2020"
How about using a Radeon RX 580? It has 2304 shader units (comparable to CUDA cores),
and we can use ROCm TensorFlow.
RX 580 -> 2304 cores, about $115
GTX 1650 -> 896 cores, about $150
Shilpa Shetty? Dude really?!!!
Siraj, could you recommend a cheap laptop for deep learning as well, please?
Super video 👍
Buy a Dell M6800 Precision workstation.
In the Delhi market it's 18,000 rupees, just under $270.
24 GB DDR3 1600 MHz
8 GB GDDR5 NVIDIA
2 TB Toshiba HDD
3.0 GHz i7-4800MQ
8 MB L3 cache
Best workstation laptop for AI/ML/DL, by far the best in my personal experience.
About me: 9+ years total IT experience, 5+ years as an SAP technical architect.
Save every penny. Happy learning, guys.
Training NNs is a waste of energy; just mine bitcoin to help the global economy, the bankers will find a real job, and this will help society in general… Plus, NNs will be dead soon, just like around 1980…
Anyone else stuck at 10:50? I've never run Docker before now and the screen is clipped. Does it use a Windows or Linux container? How did he run the container? Which file was opened? Help!!
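For what it's worth, the deep learning images linked above are Linux containers, so Docker Desktop needs to be in Linux-container mode on Windows. Below is a rough sketch using the Docker Python SDK (the pip package "docker"); the image tag, port, and notebook path are assumptions based on the linked petronetto repo, not the exact command from the video:

import docker

client = docker.from_env()
container = client.containers.run(
    "petronetto/docker-python-deep-learning",   # assumed image tag
    detach=True,
    ports={"8888/tcp": 8888},                   # Jupyter on localhost:8888
    volumes={"/path/to/notebooks": {"bind": "/notebooks", "mode": "rw"}},
    # For GPU passthrough (needs the NVIDIA Container Toolkit and a recent
    # docker SDK), the usual flag looks like this:
    # device_requests=[docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])],
)
print(container.logs(tail=30).decode())         # Jupyter prints its URL/token here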
PyTorch downloads datasets from the original servers, not from PyTorch's servers. Read the source code.
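That appears to match the torchvision source; a small sketch (assuming torchvision is installed) that prints where the CIFAR-10 data from the tutorial linked above actually comes from:

from torchvision import datasets

# The dataset class itself records the original download host.
print(datasets.CIFAR10.url)   # points at the original cs.toronto.edu host

# download=True fetches from that URL into ./data on first use.
trainset = datasets.CIFAR10(root="./data", train=True, download=True)
print(len(trainset), "training images")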
Here are some Tensor Core figures to consider:
Titan RTX: 4,608 CUDA cores, 576 Tensor cores
RTX 2080 Ti: 4,352 CUDA cores, 544 Tensor cores
RTX 2070: 2,304 CUDA cores, 288 Tensor cores
Jetson AGX Xavier: 512 Volta cores, 64 Tensor cores
If we, the tech community, want to tackle the coronavirus using deep learning, then the hardware doesn't really matter. It's much more important to get people up to speed on the software first. My worry is that people will think they need to build a dedicated machine, but that's not the case at all.
You came back again… after so much humiliation
Just buy a DGX-2; my time is worth more
You don't need CUDA cores. You can parallelize with traditional CPUs as long as you have a lot of them https://blog.inten.to/cpu-hardware-for-deep-learning-b91f53cb18af but with a big budget https://www.pcworld.com/article/3512145/amd-64-core-threadripper-3990wx-price-release-date.html
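A rough sketch of what CPU-side parallelism looks like in PyTorch (thread count and timings obviously depend on the machine):

import os
import time
import torch

# Use every available core for intra-op parallelism on the CPU.
torch.set_num_threads(os.cpu_count())
print("CPU threads:", torch.get_num_threads())

x = torch.randn(2048, 2048)
start = time.time()
for _ in range(10):
    x @ x
print(f"10 CPU matmuls in {time.time() - start:.2f}s")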
Thanks Siraj! Really appreciate your humor and energy, it's awesome!
I have an RTX 2060 to play Rainbow Six Siege lol!!
I will do it. I can do it. I am doing it.
With 60,000 images, what is the time taken to train on your recommended deep learning PC?
Your Big fan from India!! 😄
The GTX 1650 has zero Tensor Cores. You need at least an RTX 2060.
lol deep highly connected models
Why does Siraj look like he's on the verge of having a breakdown
Thanks dude
I really appreciate your work.
Could you please provide us with a video comparing TensorFlow and PyTorch?
Just use Google Colab, it's free and you can use the TPU!!!
Why don't you use AMD GPUs? For example, RX570/580. Have you tried ROCm?
Thanks for the cheapest, but what about the best?
I want to warn everybody who is going to follow those recommendations and buy a GTX 1650.
4 GB of memory is absolutely not enough!
I used a 1060 with 6 GB for a long time, and I would say 6 GB is often not enough. Some large models won't train even with batch_size=1.
Don't buy a GPU with little memory unless it is very cheap, like the 1060 or 1650. Maybe 4 GB is enough when you are just starting, but very shortly you are going to need more. Better to wait until you are very sure you need one, and use Kaggle's and Colab's free GPUs in the meantime.
IMHO the best choice in terms of power/price is the 1080 Ti with 11 GB of memory. It is about 1.2-1.5 times slower than the most powerful GPUs (which are used in Kaggle VMs), and about 2 times cheaper.
Also, no other GPU has more than 12 GB of memory (except the Tesla V100, which costs like ten 1080 Tis), so that's currently the top memory.
I would also note that the number of cards matters. You can always buy 2-4 1080 Tis and gain a 2-4x speedup (you would often train multiple models at a time). So even if you are buying just one GPU, better to choose a motherboard that supports more than one, for the future.
The second-best GPU, I would say, is the 2080 Ti. But it is rather expensive right now.
My own workstation currently has 2 cards: a 1060 + a 1080 Ti. The whole computer cost me about $2000.
In the future I'm going to add another 1080 Ti (or maybe a 2080 Ti if it gets cheaper, or I feel richer 🙂
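If memory is the deciding factor, here is a short sketch for checking what each installed card actually reports (it assumes a CUDA-enabled PyTorch build):

import torch

# Enumerate visible CUDA devices and their total memory, which is the
# number that limits model size and batch size in practice.
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"cuda:{i}  {props.name}  {props.total_memory / 1024**3:.1f} GB")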
Interesting content
Your little quip about Skynet around 2:15 almost made me spit out my breakfast through my nose. Keep these videos coming!
Great one for CUDA-optimised C
Hey, can I use an RX 580 instead of a GTX 1650?
AWS
SK Hynix are the guys who sell RAM to other brands.