
Proceedings Paper

A research on generative adversarial network algorithm based on GPU parallel acceleration
Author(s): Haibo Chen; Tao Jia; Jing Tang

Paper Abstract

In recent years, the Generative Adversarial Network (GAN) has received much attention in the field of machine learning. It is an unsupervised learning model widely used for image, video, and voice data. Building on GAN's two-player zero-sum game formulation, researchers have proposed excellent variant algorithms such as the Deep Convolutional GAN (DCGAN), Conditional GAN (CGAN), Least Squares GAN (LSGAN), and Boundary Equilibrium GAN (BEGAN), which have gradually overcome the problems of training imbalance and mode collapse. However, the time efficiency of model training has remained a challenging problem. This paper proposes a GAN algorithm based on GPU parallel acceleration, which exploits the GPU's massively parallel computing power to greatly reduce model training time, improve the training efficiency of the GAN model, and achieve better modeling performance. Finally, we evaluate the proposed algorithm on the LSUN public scene dataset and the TIMIT public speech dataset, comparing it with the traditional GAN, DCGAN, LSGAN, and BEGAN algorithms. The experiments fully demonstrate the training-time advantage of the algorithm introduced in this paper.
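
The abstract does not describe the implementation in detail, so the following is only a minimal sketch of moving GAN adversarial training onto a GPU, written here in PyTorch; the network architecture, sizes, and optimizer settings are illustrative assumptions and are not taken from the paper.

# Minimal sketch (not the authors' implementation): a small GAN trained on GPU.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

latent_dim, data_dim = 64, 784  # e.g. flattened 28x28 images (assumed)

# Simple MLP generator and discriminator, placed on the GPU
generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, data_dim), nn.Tanh(),
).to(device)

discriminator = nn.Sequential(
    nn.Linear(data_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
).to(device)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4, betas=(0.5, 0.999))
bce = nn.BCEWithLogitsLoss()

def train_step(real_batch):
    real = real_batch.to(device, non_blocking=True)  # move data to GPU
    b = real.size(0)
    ones = torch.ones(b, 1, device=device)
    zeros = torch.zeros(b, 1, device=device)

    # Discriminator step: distinguish real from generated samples
    z = torch.randn(b, latent_dim, device=device)
    fake = generator(z).detach()
    loss_d = bce(discriminator(real), ones) + bce(discriminator(fake), zeros)
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: try to fool the discriminator
    z = torch.randn(b, latent_dim, device=device)
    loss_g = bce(discriminator(generator(z)), ones)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()

Because both networks and all tensors live on the GPU, the forward and backward passes of the two-player game run as parallel matrix operations rather than on the CPU, which is the general mechanism behind the training-time reduction the paper reports.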

Paper Details

Date Published: 27 November 2019
PDF: 8 pages
Proc. SPIE 11321, 2019 International Conference on Image and Video Processing, and Artificial Intelligence, 113211R (27 November 2019); doi: 10.1117/12.2539238
Author Affiliations:
Haibo Chen, China Unicom System Integration Ltd. Corp. (China)
Tao Jia, China Unicom System Integration Ltd. Corp. (China)
Jing Tang, China Unicom System Integration Ltd. Corp. (China)


Published in SPIE Proceedings Vol. 11321:
2019 International Conference on Image and Video Processing, and Artificial Intelligence
Ruidan Su, Editor(s)

© SPIE. Terms of Use