
The Midjourney neural network generates stills from non-existent films: how to do it yourself?


For those who want to see Jodorowsky's "Dune" or Tarkovsky's "Spider-Man"

Many neural network works have appeared online showing what famous films would look like if they had been shot by other directors.

The fourth version of the Midjourney neural network, released in mid-November, added new use cases. One of them is the ability to realistically generate stills from films. Moreover, films that never really existed: users combine pictures, directors and eras, edit iconic scenes and change actors in the lead roles.

The results are works that spread across social networks: from Wes Anderson's The Avengers to Hayao Miyazaki's Alien. Some enthusiasts use neural networks to show footage from films that were planned but never made, for example, "Dune" by Alejandro Jodorowsky or "Harry Potter" by Terry Gilliam.

I've figured out how easy it is to generate my own stills from a movie, and I'm going to share examples that I made myself and found in the thematic community on Reddit.

Introduction to Midjourney: how to use a neural network

Midjourney is a neural network that generates pictures from a text description. We have already covered the service in detail in a separate article, but here is a recap of the principles.

Among its competitors, it is the most accessible neural network: there is no public access to Dall-E, and Stable Diffusion requires a powerful computer. All work with Midjourney takes place in the Discord messenger: just download the application and join the neural network's community.

There, look for the Newcomer Rooms channels, which contain hundreds of Newbies chats. Thousands of users will be generating pictures alongside you.

To get the neural network to generate an image, enter the /imagine command in one of the Newbies chats and type a request. The description of the desired picture must be in English. Once the request is ready, press the "Submit" button, and the bot will start generating an image. You will literally watch live as vague outlines turn into four pictures. After generation, you can upscale one of the pictures with the U1, U2, U3 and U4 buttons, or make new variations of a specific result with the V1, V2, V3 and V4 buttons. Source: Newcomer Rooms Discord channel

Midjourney has a huge number of settings and styles: you can create not only stills from films, but also art, realistic photos and 3D models, and edit your own photos. We have also written about some methods separately: for example, how to make a neuroavatar in any style.

The free version of the neural network gives 25 attempts. An attempt is spent both on a full-fledged request and on upscaling a specific image. The restrictions can be removed by paying for a subscription, which has plenty of advantages: unlimited generation, higher quality, and the ability to create pictures in a private channel. An unlimited month costs $30, but you cannot pay with a Russian bank card.

The second way to increase the number of attempts is to re-create Discord accounts after each trial ends. But there is a risk of getting banned on the platform.

Usually, Midjourney users start by experimenting with simple scenarios: they come up with basic queries and cross-breed reference pictures without additional description. Creating stills from films is a more involved, multi-step process. But with the right approach, the neural network's results are impressive.

How to make a request and generate a movie

Make sure you are using the latest version of Midjourney. In Discord, enter the /settings command and check that the MJ version 4 setting is activated. Or simply write --v 4 at the end of the request.

I also advise adding the --q 2 parameter: it improves the quality of generation.
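Since Midjourney has no public API and is driven entirely through Discord, these flags are simply text appended to the prompt. As an illustration, here is a minimal, hypothetical Python helper (my own sketch, not part of any Midjourney tooling) that assembles such a prompt string:

```python
def midjourney_prompt(description: str, version: int = 4, quality: int = 2) -> str:
    """Append the --v and --q flags to a prompt description.

    The result is the text a user would paste after the /imagine command.
    """
    return f"{description} --v {version} --q {quality}"

# Example: a prompt for a realistic movie still
print(midjourney_prompt("DVD screengrab from the movie Alien, 1979"))
# DVD screengrab from the movie Alien, 1979 --v 4 --q 2
```

The flags always go at the end of the request, after the descriptive text, which is why the helper concatenates them last.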


Star Wars by Wes Anderson. Source: reddit.com

Use queries for specific stills from the movie. The Midjourney community always starts the wording with the words DVD screengrab. There are two reasons for this. First, if you write just the name of the movie, the neural network often generates its poster rather than specific stills. Second, in trying to imitate a screenshot from a DVD edition, Midjourney by default makes the pictures realistic rather than art.

There is one main problem with DVD screengrab: sometimes the algorithm understands the request too directly and draws the TV screen. In this case, it is better to restart the generation or change additional parameters.


Adding DVD screengrab greatly changed how the neural network interpreted the request to draw Ghostbusters. Source: Zhenya Kuzmin / Midjourney

Choose movies and directors with a flamboyant style.

The quality of generation will directly depend on how well the neural network copes with a particular author or picture.

Midjourney was primarily trained on the works of artists, so we do not know for sure how well the neural network recognizes the styles of directors, and you have to experiment. As for films, it is clearly better to choose ones with recognizable pop culture imagery or a signature aesthetic and design. The Avengers, The Matrix, Terminator, Alien, Star Wars, The Grand Budapest Hotel, Tron are great starting points.

Directors are more difficult. Here are some of the references I've tried that definitely work.


Hayao Miyazaki - for the recognizable style of Japanese animation. In fact, it works even better to write not his name but directed by Studio Ghibli in the request. Solstice by Studio Ghibli. Source: reddit

Tim Burton - to bring in a little goth. By default, the director's name adds gloominess, but you can be more specific and choose, for example, "Charlie and the Chocolate Factory" as a reference.

Andrei Tarkovsky - in general, Midjourney does not do well with Russian and Soviet directors. But Tarkovsky stands apart: the combination of his style with modern cinema looks especially interesting.


"Spider-Man" by Tarkovsky. Source: Zhenya Kuzmin / Midjourney

Wes Anderson - Midjourney very willingly conveys the characteristic features of the director's visual language: pastel colors, symmetry, characters placed in the center of the frame. This style can be applied to literally any film.

Alejandro Jodorowsky - for those who have long wanted to see "Dune" by the legendary Chilean director. It was a huge project featuring Salvador Dali, Mick Jagger and Pink Floyd that was canceled after years of pre-production. All that remained was the concept art for the film, which was hailed as "the greatest unmade film".


This is what Jodorowsky's "Dune" looks like through the eyes of a neural network. Source: reddit.com

Describe specific scenes. The query "Still from the DVD edition of The Lord of the Rings" is not bad, but it gives the neural network too much freedom, so there is a greater chance you will not get the desired result. The query "The scene from the Lord of the Rings DVD where Aragorn, Legolas and Gimli run across the green field after the orcs" will work much more accurately.

If you don't speak English well enough to describe the scene, use a translator. I recommend the DeepL service: it translates texts from Russian using neural networks.


A Reddit user reimagined the 1985 film "Return to Oz", making it much darker with detailed scene descriptions. Source: imgur.com

If the description does not work, use reference pictures. They will help you get the most out of the specific movie scene you want to redo, or pick the right actor for the lead role. You can write "Scene from The Matrix where Neo stops flying bullets" as much as you like, but it is genuinely easier to send a picture and add text describing exactly what to change in it.

We described how to work with reference images in detail in another article.

I repeat the instructions:

1. Either find a suitable picture in the public domain online, or upload it yourself to an image host. Simply uploading an image to Discord will not work.

2. Copy the link and paste it after typing the /imagine command in the Prompt field.

3. Add additional parameters: in what style to change the picture, what to add. Feel free to experiment, but don't forget the limited free trials.

4. If desired, you can insert several reference images at once. In this case, Midjourney will combine them in equal proportions, but you can't do much to control what exactly the neural network borrows from each picture.

Adjust the aspect ratio of the picture. By default, Midjourney generates squares in 1:1 proportions. But you can add --ar 3:2 to the request, and the pictures will look even more like real movie stills.
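The steps above boil down to one rule: reference image URLs go at the start of the prompt, the text description follows, and the flags come last. A minimal sketch of that assembly (a hypothetical helper of my own; the example URL is a placeholder, not a real reference):

```python
def prompt_with_references(image_urls: list[str], description: str,
                           aspect_ratio: str = "3:2") -> str:
    """Build a prompt string: image URLs first, then text, then flags."""
    urls = " ".join(image_urls)
    return f"{urls} {description} --ar {aspect_ratio} --v 4"

# Placeholder URL standing in for an uploaded reference image
print(prompt_with_references(
    ["https://example.com/neo.png"],
    "in the style of Studio Ghibli",
))
# https://example.com/neo.png in the style of Studio Ghibli --ar 3:2 --v 4
```

Passing several URLs in the list mirrors step 4: Midjourney would weigh the references equally, and only the trailing text steers what it borrows from them.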

What the final queries should look like

Below are real requests to Midjourney from Reddit users that they shared with the community. You can use them as templates: keep the structure, but change films, years, scene descriptions and directors.

An example of a simple request for a random still from a movie: DVD screengrab from the movie Return of The Jedi, 1983 --ar 3:2 --v 4.

An example query to generate a scene from a movie with the style of another movie or director: DVD screengrab of the [scene description] scene from the Star Wars movie directed by Alejandro Jodorowsky, 1975 --ar 3:2 --q 2 --v 4.

Example of a simple tag query: DVD screengrab, The Alien movie, 1989 Studio Ghibli anime movie style --ar 3:2 --q 2 --v 4.

Example of a complex tag query: DVD screengrab, 1989 Studio Ghibli anime movie, World War 2, [scene details] --ar 3:2 --q 2 --v 4.
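Treated as templates, the queries above differ only in which slots are filled in. A sketch of the template idea using Python string formatting (the field names are my own labels for the slots, not Midjourney syntax):

```python
# Template mirroring the "scene from a movie by another director" query shape
TEMPLATE = ("DVD screengrab of the {scene} scene from the {film} movie "
            "directed by {director}, {year} --ar 3:2 --q 2 --v 4")

# Fill the slots to produce a ready-to-paste request
query = TEMPLATE.format(
    scene="sandworm attack",
    film="Dune",
    director="Alejandro Jodorowsky",
    year=1975,
)
print(query)
```

Keeping the flags inside the template ensures every generated query carries the same aspect ratio, quality and version settings.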
