Theoretically Media
Today we’re taking a look at an amazing new Midjourney command: Describe — where Midjourney will actually tell YOU what IT sees!
And as amazing as that is, there are a lot of hidden details about how Midjourney views images and prompts, so we’ll be diving into Midjourney’s output prompts to see what we can discover. Plus, we’ll look at how tokens work in Midjourney, with tips on how to make the most of your prompt!
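To illustrate the token idea at a glance — a rough sketch only, NOT Midjourney's actual tokenizer (the word-splitting rule here is an assumption for demonstration; real tokenizers often split words into smaller pieces):

```python
import re

def rough_tokens(prompt: str) -> list[str]:
    """Very rough illustration: split a prompt into word-level chunks.
    Midjourney's real tokenizer is not public; this just shows why a
    long keyword-stuffed prompt burns through its token budget fast."""
    return re.findall(r"[A-Za-z0-9']+", prompt.lower())

prompt = "portrait photo, 50mm lens, dramatic lighting, cel shading"
print(rough_tokens(prompt))
print(len(rough_tokens(prompt)), "rough tokens")
```

The takeaway is simply that every extra keyword costs budget, so contradictory filler ("photo" plus "CGI" plus "cel shading") dilutes the terms that actually matter.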
FREE VIDEO HANDOUT: https://theoreticallymedia.gumroad.com/l/describe
AFFILIATE LINKS:
Camera: https://amzn.to/3yXMDY2
Microphone: https://amzn.to/3K1jIZm
Audio Interface: https://amzn.to/3lDX9kf
Coffee: https://amzn.to/3JZuBeq
Follow Me on Twitter: https://twitter.com/TheoMediaAI
————————————————-
Thanks for watching Theoretically Media! I cover a wide range of topics in the Creative AI space: technology, tutorials, and reviews. Please enjoy your time here, and subscribe!
Your comments mean a LOT to me, and I read and try to respond to every one of them, so please do drop any thoughts, suggestions, questions, or topic requests!
You can check out the free video cheatsheet/handout here: https://theoreticallymedia.gumroad.com/l/describe
Wow, the new Describe command for Midjourney looks amazing and inspiring! I love how it allows for even more creativity and flexibility in generating prompts. Thanks for sharing this development with us!
Can you please point me in the direction of what a token is? I hadn't heard it used with respect to Midjourney.
Hey everyone, stop using "Unreal Engine 5." It makes your renders less realistic because it's referencing a game engine. Thank me later.
So basically this will lead to even more chaos, confusion, and gibberish obscuring what is ACTUALLY in the picture with buzzwords and pseudo-helpful formulas and clichés. As if it wasn't enough how much nonsense and totally contradictory stuff "proompters" put into their prompts thinking they're smart. Like in the first example: 50mm lens, wide angle, Unreal Engine, ray tracing, OpenGL, tone mapping…? UGH! These things pretty much negate one another, let alone produce realistic results.
Thanks for your work!
Amazing! Thank you, sir.
Those images were all so great. Is there anywhere we can see or download the images you generated?
Great vid as always! Can you use the Describe command on a source image of, e.g., a model in front of a plain background, then get it to add in backgrounds you describe?
Love the vid! Definitely liked the explanation for the tokens! Looking forward to more!
Love love love this video and how it turns one picture into 4 more pictures 🙂
Sorry to disappoint you: you generated NOTHING by just typing a few words. By ordering a pizza, you also did not generate it. Dream on being "creative". Pathetic stuff for nerds.
@0:36 Why would you prompt for a photo and prompt for CGI and cel shading at the same time? Sign of someone throwing everything at the wall and hoping something helps. These mega-prompts make me roll my eyes. You can achieve the same result in 10 words.
Thanks for the great tips Tim!
This video is so well-crafted that I can't help but watch it over and over again. It's truly a masterpiece.
Don't forget to put the prompt in the video description.
How are these outputs generated from the 4 prompts in the description? Is there a way to load these descriptions in?
Very cool and helpful video. Thanks for sharing!
Midjourney is cool, but users must pay to use the site after a brief free trial period. I've also experimented with BlueWillow and was really impressed with it.
I love this new command for MJ. I tested it already and the results were great. Thank you for the in-depth video 💯
Hello, the Marina Bay Sands architect is Moshe Safdie.
Thanks for this video, it's really helpful. A question: how can I get consistent images of characters out of Midjourney, e.g., if I want to use the same characters in different scenes for a novel?
Thank you so much for always taking the time to share all these sophisticated Midjourney techniques with us. Your content is so thorough and inspiring. It's greatly appreciated.
2:56 😂
This was very helpful. Thank you
Great video, wow, a lot of information. Keep sharing! Subscribed and liked. Thanks again for posting.
Great video, so interesting. Thank you 🙏
Best video I have found for Midjourney. Reverse engineering, baby!
As a 100% newbie to AI, I want to do a small project (a 10-minute animation), where the only stipulation is that AI is used for every stage. Can someone correct or improve my rough process below, please?
This is what I'm planning to do:
1) Script – write with ChatGPT
2) Storyboard – create images with Midjourney
3) Animation – use storyboard images (created with Midjourney) for style + homemade video recordings for motion, and feed into Deep Motion, to create the end animation.
4) Audio – export the animation from DeepMotion to Premiere, record character voices with a voice-changing AI (don't know which one yet), then create the final edit in Premiere.
Does anyone have a better software suggestion for any of these stages?
Many thanks in advance!
THANK YOU! THANK YOU, THANK YOU! You have no idea how much I love your videos. Thank you for sharing your knowledge — I wish you lots of success in your life 💖
The Indonesian style you mentioned is, to my understanding, pronounced "bah-teek."
FYI, batik is pronounced "bah-teek," with emphasis on the second syllable. Love your videos!
Does anyone know who prompted the thumbnail for this video?
The "describe" link you posted in the description is NOT WORKING. Can you please check?