Showing 1 - 10 of 55 for search: '"CHUAN-SHENG FOO"'
Author:
Brenda Y. Han, Michelle K. Y. Seah, Imogen R. Brooks, Delia H. P. Quek, Dominic R. Huxley, Chuan-Sheng Foo, Li Ting Lee, Heike Wollmann, Huili Guo, Daniel M. Messerschmidt, Ernesto Guccione
Published in:
Nature Communications, Vol 11, Iss 1, Pp 1-14 (2020)
PRDM family members are transcriptional regulators involved in cell identity and fate determination. Here, the authors characterize PRDM10 and show that it functions to ensure global translation efficiency during early embryonic development.
External link:
https://doaj.org/article/e7989e77dc6240d282d3dcdcea55b0b3
Author:
Brenda Y Han, Shuang Wu, Chuan-Sheng Foo, Robert M Horton, Craig N Jenne, Susan R Watson, Belinda Whittle, Chris C Goodnow, Jason G Cyster
Published in:
eLife, Vol 3 (2014)
The generation of naïve T lymphocytes is critical for immune function, yet the mechanisms governing their maturation remain incompletely understood. We have identified a mouse mutant, bloto, that harbors a hypomorphic mutation in the zinc finger protein …
External link:
https://doaj.org/article/ddad725609d84747b5161b6b5274b9bf
Author:
Weizhuang Zhou, Yu En Chan, Chuan Sheng Foo, Jingxian Zhang, Jing Xian Teo, Sonia Davila, Weiting Huang, Jonathan Yap, Stuart Cook, Patrick Tan, Calvin Woon-Loong Chin, Khung Keong Yeo, Weng Khong Lim, Pavitra Krishnaswamy
Published in:
Journal of Medical Internet Research, Vol 24, Iss 7, p e34669 (2022)
Background: Consumer-grade wearable devices enable detailed recordings of heart rate and step counts in free-living conditions. Recent studies have shown that summary statistics from these wearable recordings have potential uses for longitudinal monitoring …
External link:
https://doaj.org/article/53779a435fe84936a61d6c49417d853d
Published in:
ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
Published in:
IEEE Transactions on Image Processing. 31:3494-3508
Background clutter poses challenges to defocus blur detection. Existing approaches often produce artifact predictions in cluttered background areas and relatively low-confidence predictions in boundary areas. In this work, we tackle the above issues …
Published in:
IEEE Transactions on Neural Networks and Learning Systems. :1-12
Knowledge distillation is a learning paradigm for boosting resource-efficient graph neural networks (GNNs) using more expressive yet cumbersome teacher models. Past work on distillation for GNNs proposed the Local Structure Preserving loss (LSP), which …
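The snippet above names the Local Structure Preserving (LSP) loss used for GNN distillation; as a rough illustration only, here is a minimal, hypothetical sketch in PyTorch. The function name lsp_loss, the negative-squared-distance similarity kernel, and the per-edge averaging are assumptions made for illustration, not the exact formulation from the cited article.

import torch

def lsp_loss(student_emb, teacher_emb, edge_index, eps=1e-8):
    # Hypothetical sketch of an LSP-style loss: for each edge (i, j), a
    # similarity score is computed from the node embeddings; a softmax over
    # the edges leaving each node gives a local neighborhood distribution,
    # and the student is pushed to match the teacher's distribution via a
    # KL-divergence term.
    src, dst = edge_index  # LongTensor of shape (2, num_edges)

    def local_distributions(emb):
        # Negative squared Euclidean distance as the similarity kernel (an assumption).
        sim = -((emb[src] - emb[dst]) ** 2).sum(dim=-1)
        sim = sim - sim.max()  # numerical stability before exponentiation
        exp = sim.exp()
        # Normalize over all edges that share the same source node.
        denom = torch.zeros(emb.size(0), device=emb.device).index_add(0, src, exp)
        return exp / (denom[src] + eps)

    p_teacher = local_distributions(teacher_emb).detach()  # teacher stays frozen
    p_student = local_distributions(student_emb)
    # KL(teacher || student), averaged over edges (a simplification).
    return (p_teacher * (torch.log(p_teacher + eps) - torch.log(p_student + eps))).mean()

# Usage sketch (shapes only): student_emb is (N, d_s), teacher_emb is (N, d_t),
# edge_index is (2, E). The loss compares per-node neighborhood distributions,
# so the student and teacher embedding widths d_s and d_t need not match.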
Author:
Cuong Nguyen, Arun Raja, Le Zhang, Xun Xu, Balagopal Unnikrishnan, Mohamed Ragab, Kangkang Lu, Chuan-Sheng Foo
Published in:
Machine Learning.
Author:
Lile Cai, Ramanpreet Singh Pahwa, Xun Xu, Jie Wang, Richard Chang, Lining Zhang, Chuan-Sheng Foo
Published in:
2022 IEEE International Conference on Image Processing (ICIP).
Author:
Yasin Yazici, Bruno Lecouat, Kim Hui Yap, Stefan Winkler, Georgios Piliouras, Vijay Chandrasekhar, Chuan-Sheng Foo
Published in:
2022 IEEE International Conference on Image Processing (ICIP).
Author:
Mohamed Ragab, Emadeldeen Eldele, Wee Ling Tan, Chuan-Sheng Foo, Zhenghua Chen, Min Wu, Chee-Keong Kwoh, Xiaoli Li
Published in:
ACM Transactions on Knowledge Discovery from Data; Sep 2023, Vol. 17, Issue 8, p1-18, 18p