Showing 1 - 10 of 225 for search: '"Chou, Yu‐Cheng"'
Touchstone Benchmark: Are We on the Right Way for Evaluating AI Algorithms for Medical Segmentation?
Author:
Bassi, Pedro R. A. S., Li, Wenxuan, Tang, Yucheng, Isensee, Fabian, Wang, Zifu, Chen, Jieneng, Chou, Yu-Cheng, Kirchhoff, Yannick, Rokuss, Maximilian, Huang, Ziyan, Ye, Jin, He, Junjun, Wald, Tassilo, Ulrich, Constantin, Baumgartner, Michael, Roy, Saikat, Maier-Hein, Klaus H., Jaeger, Paul, Ye, Yiwen, Xie, Yutong, Zhang, Jianpeng, Chen, Ziyang, Xia, Yong, Xing, Zhaohu, Zhu, Lei, Sadegheih, Yousef, Bozorgpour, Afshin, Kumari, Pratibha, Azad, Reza, Merhof, Dorit, Shi, Pengcheng, Ma, Ting, Du, Yuxin, Bai, Fan, Huang, Tiejun, Zhao, Bo, Wang, Haonan, Li, Xiaomeng, Gu, Hanxue, Dong, Haoyu, Yang, Jichen, Mazurowski, Maciej A., Gupta, Saumya, Wu, Linshan, Zhuang, Jiaxin, Chen, Hao, Roth, Holger, Xu, Daguang, Blaschko, Matthew B., Decherchi, Sergio, Cavalli, Andrea, Yuille, Alan L., Zhou, Zongwei
How can we test AI performance? This question seems trivial, but it isn't. Standard benchmarks often have problems such as in-distribution and small-size test sets, oversimplified metrics, unfair comparisons, and short-term outcome pressure. As a con…
External link:
http://arxiv.org/abs/2411.03670
As massive medical data become available with an increasing number of scans, expanding classes, and varying sources, prevalent training paradigms -- where AI is trained with multiple passes over fixed, finite datasets -- face significant challenges.
External link:
http://arxiv.org/abs/2407.04687
Published in:
Machine Intelligence Research (2024)
Creating large-scale and well-annotated datasets to train AI algorithms is crucial for automated tumor detection and localization. However, with limited resources, it is challenging to determine the best type of annotations when annotating massive am…
External link:
http://arxiv.org/abs/2310.15098
Early detection and localization of pancreatic cancer can increase the 5-year survival rate for patients from 8.5% to 20%. Artificial intelligence (AI) can potentially assist radiologists in detecting pancreatic tumors at an early stage. Training AI…
External link:
http://arxiv.org/abs/2308.03008
Published in:
Machine Intelligence Research, 20, 92-108 (2023)
This paper introduces DGNet, a novel deep framework that exploits object gradient supervision for camouflaged object detection (COD). It decouples the task into two connected branches, i.e., a context encoder and a texture encoder. The essential connection i…
External link:
http://arxiv.org/abs/2205.12853
Author:
Ji, Ge-Peng, Xiao, Guobao, Chou, Yu-Cheng, Fan, Deng-Ping, Zhao, Kai, Chen, Geng, Van Gool, Luc
Published in:
Machine Intelligence Research, vol. 19, no. 6, pp. 531-549, 2022
We present the first comprehensive video polyp segmentation (VPS) study in the deep learning era. Over the years, progress in VPS has been hindered by the lack of large-scale, fine-grained segmentation annotations. To address th…
External link:
http://arxiv.org/abs/2203.14291
Author:
Chou, Yu-Cheng.
Thesis (Ph. D.)--University of California, Davis, 2009.
Title based on PDF title page (viewed 06/18/2010). Degree granted in Mechanical and Aeronautical Engineering.
External link:
http://proquest.umi.com/pqdweb?did=1983628981&sid=1&Fmt=2&clientId=48051&RQT=309&VName=PQD
Published in:
In Dental Materials, August 2024, 40(8):1208-1215
Owing to the difficulty of mining spatial-temporal cues, existing approaches for video salient object detection (VSOD) are limited in understanding complex and noisy scenarios and often fail to infer prominent objects. To alleviate such sh…
External link:
http://arxiv.org/abs/2105.10110
Existing video polyp segmentation (VPS) models typically employ convolutional neural networks (CNNs) to extract features. However, due to their limited receptive fields, CNNs cannot fully exploit the global temporal and spatial information in succes…
External link:
http://arxiv.org/abs/2105.08468