
Commit 47b3346

yiyixuxu, pcuenca, and sayakpaul authored
Shap-E: add support for mesh output (huggingface#4062)
* add output_type=mesh
* update img2img
* make style
* add doc
* make style
* Apply suggestions from code review
  Co-authored-by: Pedro Cuenca <[email protected]>
* add docstring for output_type
* add a section in doc about hub mesh visualization/rotation
* update conversion script so default background is white
* Apply suggestions from code review
  Co-authored-by: Sayak Paul <[email protected]>
* Update src/diffusers/pipelines/shap_e/pipeline_shap_e_img2img.py
  Co-authored-by: Sayak Paul <[email protected]>
* renderer -> shap_e_renderer
* img2img renderer -> shap_e_renderer
* fix tests

---------

Co-authored-by: yiyixuxu <yixu310@gmail,com>
Co-authored-by: Pedro Cuenca <[email protected]>
Co-authored-by: Sayak Paul <[email protected]>
1 parent 07f1fbb commit 47b3346

File tree

9 files changed: +1040 -59 lines changed


docs/source/en/api/pipelines/shap_e.mdx

Lines changed: 57 additions & 0 deletions
@@ -128,6 +128,63 @@ gif_path = export_to_gif(images[0], "burger_3d.gif")
```
![img](https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/shap_e/burger_out.gif)

### Generate mesh

For both [`ShapEPipeline`] and [`ShapEImg2ImgPipeline`], you can generate a mesh output by passing `output_type="mesh"` to the pipeline and then using the [`ShapEPipeline.export_to_ply`] utility function to save the output as a `ply` file. We also provide a [`ShapEPipeline.export_to_obj`] function that you can use to save mesh outputs as `obj` files.

```python
import torch

from diffusers import DiffusionPipeline
from diffusers.utils import export_to_ply

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

repo = "openai/shap-e"
pipe = DiffusionPipeline.from_pretrained(repo, torch_dtype=torch.float16, variant="fp16")
pipe = pipe.to(device)

guidance_scale = 15.0
prompt = "A birthday cupcake"

# Request a mesh instead of rendered images by setting output_type="mesh"
images = pipe(prompt, guidance_scale=guidance_scale, num_inference_steps=64, frame_size=256, output_type="mesh").images

# Save the generated mesh as a .ply file
ply_path = export_to_ply(images[0], "3d_cake.ply")
print(f"Saved mesh to: {ply_path}")
```
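
The `obj` export mentioned above works the same way. A minimal sketch, assuming the `images` output from the example above and that `export_to_obj` is importable from `diffusers.utils` alongside `export_to_ply`:

```python
from diffusers.utils import export_to_obj

# Save the same generated mesh as an .obj file instead of .ply
obj_path = export_to_obj(images[0], "3d_cake.obj")
print(f"Saved mesh to: {obj_path}")
```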
Hugging Face Datasets supports visualizing mesh files in the `glb` format. Below, we show how to convert your mesh file to the `glb` format so that you can use the Dataset viewer to render 3D objects.

First, install the `trimesh` library:

```
pip install trimesh
```

To convert the mesh file to the `glb` format:

```python
import trimesh

# Load the .ply mesh produced by Shap-E and re-export it as .glb
mesh = trimesh.load("3d_cake.ply")
mesh.export("3d_cake.glb", file_type="glb")
```
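
You can optionally sanity-check the converted mesh locally before uploading it. A minimal sketch, assuming a local display and `trimesh`'s optional viewer dependency (`pyglet`) are installed:

```python
import trimesh

# Open trimesh's interactive viewer to inspect the converted mesh
mesh = trimesh.load("3d_cake.glb")
mesh.show()
```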

By default, the mesh output of Shap-E is viewed from the bottom; you can change the default viewpoint by applying a rotation transform before exporting:

```python
import trimesh
import numpy as np

mesh = trimesh.load("3d_cake.ply")
# Rotate the mesh by -90 degrees around the x-axis to change the default (bottom) viewpoint
rot = trimesh.transformations.rotation_matrix(-np.pi / 2, [1, 0, 0])
mesh = mesh.apply_transform(rot)
mesh.export("3d_cake.glb", file_type="glb")
```

Now you can upload your mesh file to your dataset and visualize it! Here is a link to the 3D cake we just generated:

https://huggingface.co/datasets/hf-internal-testing/diffusers-images/blob/main/shap_e/3d_cake.glb
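
To upload the `glb` file to a dataset repository on the Hub, one option is the `huggingface_hub` client. A minimal sketch, assuming you are logged in (`huggingface-cli login`) and that `your-username/your-3d-dataset` is a hypothetical dataset repo you can write to:

```python
from huggingface_hub import upload_file

# Upload the converted mesh so the Hub's Dataset viewer can render it
upload_file(
    path_or_fileobj="3d_cake.glb",
    path_in_repo="3d_cake.glb",
    repo_id="your-username/your-3d-dataset",  # hypothetical dataset repo
    repo_type="dataset",
)
```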
## ShapEPipeline
[[autodoc]] ShapEPipeline
- all
