Full metadata record
DC Field | Value | Language
dc.contributor.author | Wang, Han-Lei | en_US
dc.contributor.author | Han, Ping-Hsuan | en_US
dc.contributor.author | Chen, Yu-Mu | en_US
dc.contributor.author | Chen, Kuan-Wen | en_US
dc.contributor.author | Lin, XinYi | en_US
dc.contributor.author | Lee, Ming-Sui | en_US
dc.contributor.author | Hung, Yi-Ping | en_US
dc.date.accessioned | 2019-04-02T06:04:27Z | -
dc.date.available | 2019-04-02T06:04:27Z | -
dc.date.issued | 2018-01-01 | en_US
dc.identifier.uri | http://dx.doi.org/10.1145/3283254.3283263 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/150970 | -
dc.description.abstract | Over time, the art pieces inside the Dunhuang Grottoes have suffered tremendous damage, such as mural deterioration, and are usually difficult to repair. Although digital preservation can be achieved by modeling the caves and preserving the murals as textures in a virtual environment, we still cannot glimpse how the grottoes looked without damage. In this work, we propose a systematic restoration framework, based on the Generative Adversarial Network (GAN) technique, for these high-resolution but deteriorated mural textures. The main idea is to make the machine learn the transformation between deteriorated mural textures and restored mural textures. However, the resolution of the training texture images (i.e., 8192x8192) is too high for GAN techniques to be applied directly due to GPU RAM limitations. Instead, our method restores a set of high-resolution yet color-inconsistent textures patch-by-patch and a set of low-resolution but color-consistent full textures, and then combines them to obtain the final high-resolution, color-consistent result. | en_US
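The patch-then-combine strategy described in the abstract can be sketched in NumPy as follows. This is a minimal illustration, not the authors' implementation: `restore_patch` stands in for the per-patch GAN generator, and the per-patch mean-matching color transfer is an assumed way of imposing the low-resolution result's consistent colors onto the high-resolution patchwork.

```python
import numpy as np

def restore_patchwise(texture, patch, restore_patch):
    """Restore a high-resolution texture patch-by-patch.

    restore_patch is a hypothetical stand-in for the per-patch GAN
    generator; each patch is restored independently, so colors may
    be inconsistent across patch boundaries."""
    h, w, _ = texture.shape
    out = np.empty(texture.shape, dtype=np.float64)
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            out[y:y + patch, x:x + patch] = restore_patch(
                texture[y:y + patch, x:x + patch])
    return out

def transfer_color(hires, lowres, patch):
    """Shift each high-res patch's per-channel mean to match the
    corresponding region of the color-consistent low-res restoration."""
    h, w, _ = hires.shape
    scale_y = lowres.shape[0] / h
    scale_x = lowres.shape[1] / w
    out = hires.astype(np.float64).copy()
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            # region of the low-res image covered by this patch
            ly0 = int(y * scale_y)
            ly1 = max(int((y + patch) * scale_y), ly0 + 1)
            lx0 = int(x * scale_x)
            lx1 = max(int((x + patch) * scale_x), lx0 + 1)
            target = lowres[ly0:ly1, lx0:lx1].mean(axis=(0, 1))
            block = out[y:y + patch, x:x + patch]  # view into out
            block += target - block.mean(axis=(0, 1))
    return out
```

For the real 8192x8192 textures, the same loop structure lets the GAN see only one GPU-sized patch at a time, while the color transfer needs only the small downsampled restoration in memory.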
dc.language.iso | en_US | en_US
dc.subject | Dunhuang mural restoration | en_US
dc.subject | deep learning | en_US
dc.subject | generative adversarial network | en_US
dc.title | Dunhuang Mural Restoration using Deep Learning | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.doi | 10.1145/3283254.3283263 | en_US
dc.identifier.journal | SA'18: SIGGRAPH ASIA 2018 TECHNICAL BRIEFS | en_US
dc.contributor.department | Published under the name of National Chiao Tung University | zh_TW
dc.contributor.department | National Chiao Tung University | en_US
dc.identifier.wosnumber | WOS:000455606800023 | en_US
dc.citation.woscount | 0 | en_US
Appears in Collections: Conferences Paper