Update README.md

README.md (changed)

@@ -10,9 +10,18 @@ size_categories:

# 📦 GS2E: Gaussian Splatting is an Effective Data Generator for Event Stream Generation

> *Submission to NeurIPS 2025 D&B Track, Under Review.*

+<p align="center">
+  <img src="./assets/teaser_00.png" alt="Teaser of GS2E" width="80%">
+</p>

## 🧾 Dataset Summary
**GS2E** (Gaussian Splatting for Event stream Extraction) is a synthetic multi-view event dataset designed to support high-fidelity 3D scene understanding, novel view synthesis, and event-based neural rendering. Unlike previous video-driven or graphics-only event datasets, GS2E leverages 3D Gaussian Splatting (3DGS) to generate geometry-consistent photorealistic RGB frames from sparse camera poses, followed by physically-informed event simulation with adaptive contrast threshold modeling. The dataset enables scalable, controllable, and sensor-faithful generation of realistic event streams with aligned RGB and camera pose data.
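
Rendering an event-ready, high-frame-rate video from only a handful of input poses requires a dense, temporally smooth camera trajectory between them. The sketch below shows one common way to densify such a trajectory with SciPy (spherical-linear interpolation for rotations, linear interpolation for translations); it is an illustrative assumption, not necessarily the exact interpolation scheme used by GS2E.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def densify_trajectory(key_times, key_quats_xyzw, key_trans, n_samples):
    """Interpolate a dense camera path from sparse key poses:
    SLERP for rotations, per-axis linear interpolation for translations.
    Illustrative helper only; not part of the GS2E toolchain."""
    rots = Rotation.from_quat(key_quats_xyzw)      # (K, 4) quaternions, xyzw
    slerp = Slerp(key_times, rots)
    t = np.linspace(key_times[0], key_times[-1], n_samples)
    dense_R = slerp(t).as_matrix()                 # (n_samples, 3, 3)
    dense_t = np.stack(
        [np.interp(t, key_times, key_trans[:, i]) for i in range(3)], axis=1
    )                                              # (n_samples, 3)
    return t, dense_R, dense_t
```

The densified poses can then be rendered with the reconstructed 3DGS scene to obtain the video frames that drive the event simulation.
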

+<p align="center">
+  <img src="./assets/egm_diff.png" alt="Event generation model comparison" width="80%">
+</p>

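
For intuition, the following is a minimal sketch of the idealized event generation model that simulators of this kind build on: a pixel fires an event whenever its log intensity has changed by more than a contrast threshold since the last event at that pixel. The fixed thresholds, frame-aligned timestamps, and absence of noise are simplifications for illustration; as noted above, GS2E's simulation additionally uses adaptive contrast threshold modeling.

```python
import numpy as np

def simulate_events(log_frames, timestamps, c_pos=0.2, c_neg=0.2):
    """Idealized event generation from a sequence of log-intensity frames.
    Illustrative only, not the GS2E/ESIM implementation: events are
    timestamped at the frame time, whereas real simulators interpolate
    sub-frame timestamps and model sensor noise."""
    ref = log_frames[0].astype(np.float64)  # per-pixel reference log intensity
    events = []                             # tuples of (t, x, y, polarity)
    for frame, t in zip(log_frames[1:], timestamps[1:]):
        diff = frame - ref
        # Number of contrast-threshold crossings since the last event.
        n_pos = np.floor(np.maximum(diff, 0.0) / c_pos).astype(int)
        n_neg = np.floor(np.maximum(-diff, 0.0) / c_neg).astype(int)
        ys, xs = np.nonzero(n_pos)
        for y, x in zip(ys, xs):
            events.extend([(t, x, y, +1)] * n_pos[y, x])
        ys, xs = np.nonzero(n_neg)
        for y, x in zip(ys, xs):
            events.extend([(t, x, y, -1)] * n_neg[y, x])
        # Advance the per-pixel reference by the thresholds that fired.
        ref = ref + n_pos * c_pos - n_neg * c_neg
    return events
```
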
## 📚 Dataset Description
Event cameras offer unique advantages—such as low latency, high temporal resolution, and high dynamic range—making them ideal for 3D reconstruction and SLAM under rapid motion and challenging lighting. However, the lack of large-scale, geometry-consistent event datasets has hindered the development of event-driven or hybrid RGB-event methods.

@@ -54,20 +63,9 @@ This synthetic event dataset is organized by scene, with each scene directory co

This structure enables joint processing of visual and event data for various tasks such as event-based deblurring, video reconstruction, and hybrid SfM pipelines.
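
As a rough illustration of such joint processing, the sketch below pairs each RGB frame with the events recorded since the previous frame. The file names and array layouts used here (`events.npy` rows of `(t, x, y, polarity)`, per-frame timestamps, and camera-to-world poses) are hypothetical placeholders chosen for the example, not the actual per-scene file layout described above.

```python
import numpy as np

# Hypothetical per-scene files; see the actual layout described above.
events = np.load("scene/events.npy")                 # (N, 4): t, x, y, polarity
frame_ts = np.loadtxt("scene/frame_timestamps.txt")  # (F,) seconds
poses = np.load("scene/poses.npy")                   # (F, 4, 4) camera-to-world

def events_between(ev, t0, t1):
    """Slice the time-sorted event stream to the half-open window [t0, t1)."""
    i0, i1 = np.searchsorted(ev[:, 0], [t0, t1])
    return ev[i0:i1]

# Pair each frame with the events accumulated since the previous frame,
# e.g. as input to event-based deblurring or video reconstruction.
for k in range(1, len(frame_ts)):
    ev_chunk = events_between(events, frame_ts[k - 1], frame_ts[k])
    pose_k = poses[k]
    # ... process (ev_chunk, frame k, pose_k) downstream
```
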

-| Topic | Description | Rate (Hz) |
-| --- | --- | --- |
-| `/cam0/pose` | Camera pose | 1000 |
-| `/imu` | IMU measurements with simulated noise | 1000 |
-| `/cam0/image_raw` | RGB image | 250 |
-| `/cam0/depthmap` | Depth map | 10 |
-| `/cam0/optic_flow` | Optical flow map | 10 |
-| `/cam0/camera_info` | Camera intrinsics and lens distortion parameters | 10 |
-
-It is obtained by running the improved ESIM with the associated `esim.conf` configuration file, which references the camera intrinsics configuration files `pinhole_mono_nodistort_f={1111, 1250}.yaml` and the camera trajectory CSV files `{hemisphere, sphere}_spiral-rev=4[...].csv`.
-
-The validation and test views of each scene are given in the `views/` folder, which is structured according to the NeRF synthetic dataset (except for the depth and normal maps). These views are rendered from the scene Blend-files, given in the `scenes/` folder. Specifically, we create a [Conda](https://docs.conda.io/en/latest/) environment with [Blender as a Python module](https://docs.blender.org/api/current/info_advanced_blender_as_bpy.html) installed, according to [these instructions](https://github.com/wengflow/rpg_esim#blender), to run the `bpy_render_views.py` Python script for rendering the evaluation views.
--->
+
+<p align="center">
+  <img src="./assets/pipeline.png" alt="GS2E data generation pipeline" width="80%">
+</p>

## Setup