Publications from our Researchers

Several of our current PhD candidates and fellow researchers at the Data Science Institute have published, or are in the process of publishing, papers presenting their research.

Citation

BibTeX format

@article{Mo2020,
  author  = {Mo, Y and Wang, S and Dai, C and Zhou, R and Teng, Z and Bai, W and Guo, Y},
  title   = {Efficient Deep Representation Learning by Adaptive Latent Space Sampling},
  journal = {arXiv preprint arXiv:2004.02757},
  year    = {2020},
  url     = {http://arxiv.org/abs/2004.02757v2},
}

RIS format (EndNote, RefMan)

TY  - JOUR
AB  - Supervised deep learning requires a large amount of training samples with annotations (e.g. label class for classification task, pixel- or voxel-wised label map for segmentation tasks), which are expensive and time-consuming to obtain. During the training of a deep neural network, the annotated samples are fed into the network in a mini-batch way, where they are often regarded of equal importance. However, some of the samples may become less informative during training, as the magnitude of the gradient start to vanish for these samples. In the meantime, other samples of higher utility or hardness may be more demanded for the training process to proceed and require more exploitation. To address the challenges of expensive annotations and loss of sample informativeness, here we propose a novel training framework which adaptively selects informative samples that are fed to the training process. The adaptive selection or sampling is performed based on a hardness-aware strategy in the latent space constructed by a generative model. To evaluate the proposed training framework, we perform experiments on three different datasets, including MNIST and CIFAR-10 for image classification task and a medical image dataset IVUS for biophysical simulation task. On all three datasets, the proposed framework outperforms a random sampling method, which demonstrates the effectiveness of proposed framework.
AU  - Mo, Y
AU  - Wang, S
AU  - Dai, C
AU  - Zhou, R
AU  - Teng, Z
AU  - Bai, W
AU  - Guo, Y
PY  - 2020
TI  - Efficient Deep Representation Learning by Adaptive Latent Space Sampling
UR  - http://arxiv.org/abs/2004.02757v2
ER  -
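
The core idea of the paper, drawing mini-batches with probability weighted by each sample's current hardness rather than uniformly, can be illustrated with a short sketch. The Python below is not the authors' implementation: the latent codes, loss values, and the hardness_scores and sample_batch helpers are illustrative assumptions, showing only how loss-weighted batch selection might look in principle.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical latent codes for N training samples, standing in for the
# embeddings a generative model (e.g. a VAE encoder) would produce.
latent_codes = rng.normal(size=(1000, 32))

def hardness_scores(losses: np.ndarray) -> np.ndarray:
    """Turn per-sample losses into sampling probabilities.

    Harder samples (higher loss) are drawn more often; a small floor keeps
    every sample reachable so easy samples are never discarded entirely.
    """
    scores = losses - losses.min() + 1e-3
    return scores / scores.sum()

def sample_batch(losses: np.ndarray, batch_size: int = 64) -> np.ndarray:
    """Select a mini-batch of sample indices, weighted by current hardness."""
    probs = hardness_scores(losses)
    return rng.choice(len(losses), size=batch_size, replace=False, p=probs)

# Example: per-sample losses from the previous pass drive the next selection.
losses = rng.gamma(shape=2.0, scale=1.0, size=1000)
batch_indices = sample_batch(losses)
batch_latents = latent_codes[batch_indices]
print(batch_indices[:10])

Weighting draws by hardness concentrates updates on the samples that are still informative, while the probability floor keeps easy samples occasionally in play, which is the contrast the abstract draws against uniform random sampling.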