GraspGen: Scaling Simulated Grasping
GraspGen is a large-scale simulated grasp dataset spanning multiple robot embodiments and grippers.

We release over 57 million grasps, computed for a subset of 8,515 objects from the Objaverse XL (LVIS) dataset. Grasps are provided for three grippers: the Franka Panda, the Robotiq 2F-140 industrial gripper, and a suction gripper.

Dataset Format
The dataset is released in the WebDataset format and is organized as follows:

    grasp_data/
        franka/shard_{0-7}.tar
        robotiq2f140/shard_{0-7}.tar
        suction/shard_{0-7}.tar
    splits/
        franka/{train/valid}_scenes.json
        robotiq2f140/{train/valid}_scenes.json
        suction/{train/valid}_scenes.json
Train/validation splits are released along with the grasp dataset; a loading sketch follows below.
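As a minimal sketch of reading a split file, the snippet below assumes each split JSON holds a flat list of object identifiers; verify this against the actual files:

```python
import json

# Assumption: each split file is a flat JSON list of object identifiers.
with open("splits/franka/train_scenes.json") as f:
    train_ids = json.load(f)

print(f"{len(train_ids)} training objects")
```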
Each JSON file in a shard stores a Python dictionary with the following fields, where num_grasps = 2000 per object:

    'object'/
        'scale'             # Scale applied to the object asset
    'grasps'/
        'object_in_gripper' # Boolean mask indicating grasp success, [num_grasps x 1]
        'transforms'        # Gripper poses as homogeneous matrices, [num_grasps x 4 x 4]
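For a quick look at the data, here is a minimal sketch that iterates over a shard using only the Python standard library and NumPy, rather than a webdataset loader; the shard path and the exact member layout are assumptions to check against your download:

```python
import json
import tarfile

import numpy as np

# Hypothetical shard path; point this at your download location.
shard_path = "grasp_data/franka/shard_0.tar"

with tarfile.open(shard_path) as tar:
    for member in tar.getmembers():
        if not member.name.endswith(".json"):
            continue
        record = json.load(tar.extractfile(member))

        scale = record["object"]["scale"]
        transforms = np.asarray(record["grasps"]["transforms"])      # [2000 x 4 x 4]
        success = np.asarray(record["grasps"]["object_in_gripper"])  # [2000 x 1]

        # Keep only the gripper poses whose grasp succeeded in simulation.
        good = transforms[success.reshape(-1).astype(bool)]
        print(member.name, scale, good.shape)
        break  # inspect just the first object
```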
Visualizing the dataset
We provide standalone scripts for visualizing the dataset. See the header of visualize_dataset.py for installation instructions.
Before running any of the visualization scripts, remember to start meshcat-server in a separate terminal:

    meshcat-server
To visualize a single object from the dataset, alongside its grasps:

    cd scripts/ && python visualize_dataset.py \
        --dataset_path /path/to/dataset \
        --object_uuid {object_uuid} \
        --object_file /path/to/mesh \
        --gripper_name {choose from: franka, suction, robotiq2f140}
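If you want to render grasps yourself instead of using the provided script, the sketch below uses meshcat-python to draw an axis triad at each grasp pose; the mesh path is a placeholder, and the `triad` helper is assumed to be available in meshcat.geometry:

```python
import meshcat
import meshcat.geometry as g
import numpy as np

# Connects to the meshcat-server started above.
vis = meshcat.Visualizer()

# In practice `transforms` comes from a dataset shard (see the loading
# sketch earlier); a single identity pose stands in here.
transforms = np.eye(4)[None]

# Placeholder mesh path; use the same file passed to --object_file.
vis["object"].set_object(g.ObjMeshGeometry.from_file("/path/to/mesh.obj"))

for i, T in enumerate(transforms):
    frame = vis[f"grasps/{i:04d}"]
    frame.set_object(g.triad(0.05))  # small axis triad marking the gripper pose
    frame.set_transform(T)
```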
Objaverse dataset
Please download the Objaverse XL (LVIS) objects separately. See the helper script download_objaverse.py for instructions and usage.
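As an alternative sketch (the helper script remains the authoritative path), the `objaverse` pip package can fetch the LVIS subset; the calls below follow its documented API, but treat them as assumptions to check against the package version you install:

```python
import objaverse

# Mapping of LVIS category -> list of Objaverse object UIDs.
lvis_annotations = objaverse.load_lvis_annotations()
uids = [uid for uid_list in lvis_annotations.values() for uid in uid_list]

# Downloads the mesh files locally and returns {uid: local_path}.
paths = objaverse.load_objects(uids=uids[:10], download_processes=4)
print(paths)
```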
License
Copyright © 2025, NVIDIA Corporation & affiliates. All rights reserved.
The dataset is released under a CC-BY 4.0 License.
The visualization code is released under the NVIDIA source code license.
Contact
Please reach out to Adithya Murali ([email protected]) and Clemens Eppner ([email protected]) for further inquiries.