Siraj Raval
The Movidius Neural Compute Stick is a miniature deep learning hardware development platform that you can use to prototype, tune, and validate your AI at the edge. Intel reached out and asked if I would do a video for them, and since I thought the product was cool I said yes. They mailed it to me and I've had a lot of fun using it. In this video, I'll talk about how it works and walk through an image classification demo in Python.
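If you want the gist before watching, here is a minimal sketch of the classification workflow, assuming the NCSDK v1 Python API (mvnc). The 'graph' file name and the random placeholder input are illustrative only, not the exact demo code; the real demo is in the repo linked below.

import numpy as np
from mvnc import mvncapi as mvnc

# Find and open the first attached Neural Compute Stick
devices = mvnc.EnumerateDevices()
device = mvnc.Device(devices[0])
device.OpenDevice()

# Load a graph pre-compiled with the NCSDK's mvNCCompile tool
# ('graph' is a placeholder file name)
with open('graph', mode='rb') as f:
    graph_buffer = f.read()
graph = device.AllocateGraph(graph_buffer)

# Push one preprocessed image to the stick and read the result back
image = np.random.rand(224, 224, 3).astype(np.float16)  # placeholder input
graph.LoadTensor(image, 'user object')
output, _ = graph.GetResult()
print('Top class index:', int(np.argmax(output)))

# Clean up
graph.DeallocateGraph()
device.CloseDevice()

All the inference runs on the stick itself; the host just moves tensors back and forth over USB.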
Code for this video:
https://github.com/llSourcell/Deep_Learning_with_Intel
Please Subscribe! And like. And comment. That’s what keeps me going.
Want more education? Connect with me here:
Twitter: https://twitter.com/sirajraval
Facebook: https://www.facebook.com/sirajology
Instagram: https://www.instagram.com/sirajraval
More learning resources:
https://developer.movidius.com
https://www.pyimagesearch.com/2018/02/12/getting-started-with-the-intel-movidius-neural-compute-stick/
https://medium.com/deep-learning-turkey/a-brief-guide-to-intel-movidius-neural-compute-stick-with-raspberry-pi-3-f60bf7683d40
https://github.com/movidius/ncsdk
Join us in the Wizards Slack channel:
http://wizards.herokuapp.com/
Sign up for the next course at The School of AI:
https://www.theschool.ai
And please support me on Patreon:
https://www.patreon.com/user?u=3191693
Paid by Intel
Sign up for my newsletter for exciting updates in the field of AI:
https://goo.gl/FZzJ5w
Hit the Join button above to sign up to become a member of my channel for access to exclusive content!
Intel is amazing.
The coding part is what puts people off actually trying this.
We're currently testing the Movidius V2 and it's now very impressive, because we can use the OpenCV dnn module to load a TF model and run it on the Myriad device. If you're interested, I'm preparing a Medium article.
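For anyone curious, here is a rough sketch of that approach. It assumes an OpenCV build with the Inference Engine (OpenVINO) backend, and the model file names are placeholders for your own frozen TensorFlow graph.

import cv2
import numpy as np

# Load a frozen TensorFlow model ('frozen_graph.pb' / 'graph.pbtxt'
# are placeholder file names)
net = cv2.dnn.readNetFromTensorflow('frozen_graph.pb', 'graph.pbtxt')

# Route inference through the Inference Engine backend, targeting
# the Myriad VPU inside the compute stick
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_INFERENCE_ENGINE)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_MYRIAD)

# Classify a single image
frame = cv2.imread('input.jpg')
blob = cv2.dnn.blobFromImage(frame, size=(224, 224), swapRB=True)
net.setInput(blob)
out = net.forward()
print('Top class index:', int(np.argmax(out)))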
I just ordered the second version, can't wait to start working with it.
Just imagine what DARPA etc. may have secretly figured out already if they continued this neural chip from 1994…
From Wikipedia, the free encyclopedia:
"The Ni1000 is an artificial neural network chip developed by Nestor Corporation. The chip is aimed at image analysis applications, contains more than 3 million transistors and can analyze patterns at the rate of 40,000 per second. Prototypes running with Nestor's OCR software in 1994 were capable of recognizing around 100 handwritten characters per second."
You're using Python. OK. Does it only work with that?
Thanks for the demo. Could I ask how fast it is on a live video feed for person/pedestrian detection, maybe even with tracking?
Can I use it with PyTorch?
Could these be clustered together on that 20-port USB mining motherboard ASUS put out?
The stick itself is not a GPU, right?
Optional reading; proceed by your own choice, and skip this comment for any reason at all.

They talk the talk, but this (and worse) is how they walk the walk. "Selfies" now get flipped upside down and mirror-reversed after the shot is stored in the gallery, and obvious, fake, invasive smoothing is rendered in real time even when settings are on normal with all effects off. These phones cannot depict the actual depth, light conditions, and composition you see reflected in a display, mirror, or glass; range and perspective are manipulated. Lean the camera at an angle and you get a big head; keep it steady and straight and maybe you get a huge nostril, in a paparazzi-throwback, "deep fake" processing style. Not the kind of thing to post on a resume or on social media outside the dedicated meme-o-sphere. (Is this a tailored solution and/or a personal, selective quasi-"vendetta"? Yadda yadda yadda.)

Cameras, for example, were better ten years ago in many respects. The argument holds up as an educational example if completed: a minor, abstract proof of concept that AI may very well become harmful if it inherits bias or instructions from questionable programmers. Just saying: if certain people can't be trusted to develop a camera anymore, imagine a superintelligence or an artificial general intelligence. How does the algorithm choose what and who to recommend and promote to raise the odds of getting more views or becoming a trend? Is everyone else subject to sabotage, or set as a lesser priority, even when their work may be a major advancement across industries? The good future some envisioned is likely being flushed down a toilet. Remember that, "good guys", utopians, benevolent transhumanists, the next time you get blocked, in one way or another. Or maybe this is mostly rambling, incoherent nonsense not worth listening to; by this point the knowledge was sort of instilled, trickled down, and conditioned into "us", even though it often isn't spoken out loud.
That is brilliant! Will be trying this soon once I get the stick!
How is this thing in 2019?
I learned more in this 5 min video than I did in 1 day Googling "neural compute stick get started".
OK, so I can finally buy some coffee while my PC learns how to play CS:GO at the same time? NICE!
Its input is a trained model; it is not able to train a model! Am I right? Then this is not good for me.
I have a brand new Intel Neural Compute Stick 2… if anyone is interested, please reach out to me on 7975970810. I will give it away for a good price.
Bangladeshi?
So what about people who own laptops with NVIDIA GPUs? Do they need this?
Can I create and train models using this stick on a laptop?
Dude you are a fraud
Great video 😎
Does it have its own memory inside? Or is there a data transfer between the host machine's memory and the device?
Is this device any kind of hardware accelerator for performing model training? From your video, I gather it just tests the model's performance?