Ableton
In this panel discussion from Loop 2018, artists from YACHT and lucky dragons as well as Jesse Engel and Adam Roberts from Magenta Studio explore what machine learning means for creativity.
Read the full article and download Magenta Studio plug-ins for free:
https://www.ableton.com/en/blog/magenta-studio-free-ai-tools-ableton-live/
See more from Loop:
https://www.ableton.com/blog/loop/
Loop stage design by Dejha Ti:
http://dejha.com
#loop #ableton #machinelearning #creativepractice
They made all the speakers sit too close together, makes them all look married 🙂
Listening to YACHT talk about themselves makes me never ever want to listen to another thing from YACHT ever.
This is actually a pretty awkward video.
'save it for the pen'
Don't get me wrong but this stuff is pretty cool and I can't wait to see where it goes.
I'm using it and I like what it does and brings to my musical table.
But music is about the live elements. About the moment.
But so much of what's said here, mostly references, isn't about that.
The focus is on the code and on the relationships within music between playing and listening, speaking while listening,
and not on the human intuitive element in music we call improvisation.
I both love where this is going and equally dread some of the avenues it will actually take.
Luiz.
Would love to see this same panel in dialogue with the team at iZotope
What does it mean if at 9:00 you hear exactly the same thing over and over again, with NO variation in your perception of the sounds?
EDIT: Tested on multiple speakers and headphones, same result.
this is awesome
Geeking out with lucky dragons …🤓
kinda scary
medium "competence" vs creativity
This evolution is really great — well done, it's going to help us a lot with production, thank you.
I want an AI that understands how I change throughout a piece, so that I don't have to guess as often.
I need an AI that can create a supplementary grid on top of Ableton's global grid. When I change tempo or play a syncopated beat that rises or falls in tempo, the global BPM grid doesn't align with those notes. A supplementary grid layered on top of Ableton's global grid could easily solve this problem. The grid could be amorphous, changing tempo over time, rising or falling proportionally between BPMs (set manually or determined by machine learning), all while staying in time (thanks to the proportionality inherent in all music).
This amorphous machine grid could work through simple proportionality, rising or falling by varying degrees. Please please please look into this. It would make so much work easier, and I wouldn't have to edit every note by ear — which is difficult in and of itself with any amount of latency, and as we all know, "being exact is important in electronic music."
I hope my vision makes sense here.
It may take a proprietary AI within Ableton, or at least a complex learning/analysis algorithm, to achieve this.
This supplementary grid wouldn't have to render anything, just act as a guide.
| | | | | | | |
| | | | | | | | | ||||||||
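The proportional grid idea above can be sketched in a few lines. This is a minimal illustration, not anything Ableton provides: it assumes the tempo ramps linearly from one BPM to another over a fixed number of beats, and computes where each grid line would fall in seconds by integrating the tempo curve. The function name and parameters are hypothetical.

```python
import math

def beat_time(b, bpm0, bpm1, total_beats):
    """Time in seconds of beat b under a hypothetical linear tempo
    ramp from bpm0 to bpm1 spread over total_beats beats."""
    k = (bpm1 - bpm0) / total_beats  # BPM change per beat
    if abs(k) < 1e-12:
        return 60.0 * b / bpm0       # constant tempo: trivial case
    # Integrate dt = 60 / bpm(x) dx with bpm(x) = bpm0 + k * x,
    # which gives a logarithmic mapping from beats to seconds.
    return (60.0 / k) * math.log((bpm0 + k * b) / bpm0)

# Grid lines for 8 beats ramping from 100 to 140 BPM: the lines
# bunch closer together as the tempo rises, like the sketch above.
grid = [beat_time(b, 100, 140, 8) for b in range(9)]
```

Because each line's position follows from the same tempo curve, the overlay stays proportionally in time with the underlying material, which is the "common proportionality" the comment appeals to.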
great stuff
Interesting
super inspiring