Title: SAMPLE SCENES FROM A DISTRIBUTION: A CONTENT CREATION PIPELINE FOR REALISTIC RENDERING FOR NEURAL NETWORKS TRAINING
Author(s): Vadim Sanzharov, Vladimir Frolov, Alexey Voloboy and Vladimir Galaktionov
ISBN: 978-989-8704-21-4
Editors: Yingcai Xiao, Ajith P. Abraham and Jörg Roth
Year: 2020
Edition: Single
Keywords: Realistic Rendering, Procedural Pipeline, 3D Scene Generator, GPU Rendering, CNN Training Datasets
Type: Full
First Page: 115
Last Page: 123
Language: English
Paper Abstract:
In this paper we present our experience with the development of a content creation pipeline targeted at the generation of realistic image sequences with highly variable content. Our technique allows rendering a single 3D object or a 3D scene in a variety of appearances, including changes in geometry, materials and lighting. Using it, we were able to generate datasets for individual 3D objects and to create a procedural generator of interior scenes. Our solution is highly controllable and allows generating datasets with a desired distribution of features in a reproducible manner. Training on synthetic data increased the accuracy of CNN-based models compared to using real-life data only. In the course of this work we had to significantly improve the content creation pipeline of an existing open-source GPU rendering system, adapting it to our tasks. We propose a new approach to content creation which we call sampling scenes from a distribution.
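
As an illustration of what "sampling scenes from a distribution" could mean in practice, the following is a minimal Python sketch, not taken from the paper: all parameter names, distributions and values are hypothetical. It draws scene parameters such as object scale, material roughness and lighting from fixed distributions, using a seed so that the same dataset can be regenerated reproducibly before being handed to a renderer.

    import random
    from dataclasses import dataclass

    @dataclass
    class SceneSample:
        """One draw from a scene-parameter distribution (hypothetical fields)."""
        object_scale: float        # uniform scale applied to the 3D object
        material_roughness: float  # surface roughness of its material
        light_intensity: float     # strength of the main light source
        light_azimuth_deg: float   # direction of the light around the object
        camera_distance: float     # distance from camera to the object

    def sample_scene(rng: random.Random) -> SceneSample:
        """Draw one scene configuration; the distributions are illustrative only."""
        return SceneSample(
            object_scale=rng.uniform(0.8, 1.2),
            material_roughness=rng.betavariate(2.0, 5.0),
            light_intensity=rng.lognormvariate(0.0, 0.4),
            light_azimuth_deg=rng.uniform(0.0, 360.0),
            camera_distance=rng.gauss(3.0, 0.3),
        )

    def sample_dataset(n: int, seed: int = 42):
        """A fixed seed makes the whole dataset reproducible."""
        rng = random.Random(seed)
        return [sample_scene(rng) for _ in range(n)]

    if __name__ == "__main__":
        for scene in sample_dataset(3):
            print(scene)  # each SceneSample would be passed to the rendering system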