
Stable Diffusion in JAX / Flax 🚀
Patrick von Platen (patrickvonplaten)

🤗 Hugging Face Diffusers supports Flax since version 0.5.1! This allows for super fast inference on Google TPUs, such as those available in Colab, Kaggle or Google Cloud Platform. This post shows how to run inference using JAX / Flax.
If you want more details about how Stable Diffusion works or want to run it on a GPU, please refer to this Colab notebook. If you want to follow along, click the button above to open this post as a Colab notebook.

First, make sure you are using a TPU backend.
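A minimal way to verify this is to ask JAX which devices it can see; the sketch below assumes JAX is already installed in your runtime and should report TPU cores on a TPU backend:

import jax

# List the accelerator devices JAX can see; a Colab TPU runtime
# typically exposes 8 TPU cores.
devices = jax.devices()
print(f"{len(devices)} device(s) of kind {devices[0].device_kind} ({devices[0].platform})")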
The Stable Diffusion model is distributed under the CreativeML OpenRAIL-M license. It's an open license that claims no rights on the outputs you generate and prohibits you from deliberately producing illegal or harmful content. If you redistribute the weights or use the model as a service, please be aware that you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users. The model card provides more details, so take a moment to read them and consider carefully whether you accept the license. If you do, you need to be a registered user in the Hub and use an access token for the code to work.


Flax weights are available in the Hugging Face Hub as part of the Stable Diffusion repo. You have two options to provide your access token:

- Use the huggingface-cli login command-line tool in your terminal and paste your token when prompted. It will be saved in a file on your computer.
- Or use notebook_login() in a notebook, which does the same thing.

The following cell will present a login interface unless you've already authenticated before on this computer:

from pathlib import Path
from huggingface_hub import notebook_login
if not (Path.home() / ".huggingface" / "token").exists(): notebook_login()

TPU devices support bfloat16, an efficient half-float type. We'll use it for our tests, but you can also use float32 to use full precision instead.

import jax.numpy as jnp
dtype = jnp.bfloat16
Flax is a functional framework, so models are stateless and parameters are stored outside them. Loading the pre-trained Flax pipeline will return both the pipeline itself and the model weights (or parameters). We are using a bf16 version of the weights, which leads to type warnings that you can safely ignore.
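A minimal sketch of that loading call is shown below; the checkpoint name and the bf16 revision are assumptions, so use whichever Stable Diffusion weights you have accepted the license for:

import jax.numpy as jnp
from diffusers import FlaxStableDiffusionPipeline

# from_pretrained returns the stateless pipeline plus its parameters.
pipeline, params = FlaxStableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",  # assumed checkpoint; adjust as needed
    revision="bf16",                  # assumed branch with bfloat16 weights
    dtype=jnp.bfloat16,               # or jnp.float32 for full precision
)

You then pass params explicitly when calling the pipeline, which is the practical consequence of the models being stateless.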
