Showing 1 - 10 of 116 for the search: '"Clément, COLIN"'
In many condensed matter systems, long range order emerges at low temperatures as thermal fluctuations subside. In the presence of competing interactions or quenched disorder, however, some systems can show unusual configurations that become more …
External link:
http://arxiv.org/abs/2411.10445
Author:
Qi, Mengnan, Huang, Yufan, Wang, Maoquan, Yao, Yongqiang, Liu, Zihan, Gu, Bin, Clement, Colin, Sundaresan, Neel
Automatic Program translation has enormous application value and hence has been attracting significant interest from AI researchers. However, we observe that current program translation models still make elementary syntax errors, particularly when …
External link:
http://arxiv.org/abs/2310.14209
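The abstract above notes that current program translation models still make elementary syntax errors. As a purely illustrative aside (not the paper's method), a minimal Python sketch of how such errors can be screened automatically, using the standard ast module to reject candidate translations that do not parse:

import ast

def has_syntax_error(candidate: str) -> bool:
    """Return True if a candidate Python translation fails to parse."""
    try:
        ast.parse(candidate)
        return False
    except SyntaxError:
        return True

# A model output with a missing colon is rejected before it is ever run.
broken = "def add(a, b)\n    return a + b"
print(has_syntax_error(broken))  # True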
Author:
Huang, Yufan, Qi, Mengnan, Yao, Yongqiang, Wang, Maoquan, Gu, Bin, Clement, Colin, Sundaresan, Neel
Software version migration and program translation are an important and costly part of the lifecycle of large codebases. Traditional machine translation relies on parallel corpora for supervised translation, which is not feasible for program translation …
External link:
http://arxiv.org/abs/2310.11476
Code coverage is a widely used metric for quantifying the extent to which program elements, such as statements or branches, are executed during testing. Calculating code coverage is resource-intensive, requiring code building and execution with additional …
External link:
http://arxiv.org/abs/2307.13383
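The entry above defines code coverage as the share of statements or branches exercised during testing and notes that computing it requires building and running the code with extra instrumentation. A minimal sketch of statement-coverage measurement with the third-party coverage package; the module being exercised is a hypothetical placeholder, not anything from the paper:

# pip install coverage
import coverage

cov = coverage.Coverage()
cov.start()

import mymodule          # hypothetical module under test
mymodule.run_tests()     # hypothetical entry point that exercises it

cov.stop()
cov.save()
cov.report()             # prints per-file statement coverage percentages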
Author:
Huang, Junjie, Wang, Chenglong, Zhang, Jipeng, Yan, Cong, Cui, Haotian, Inala, Jeevana Priya, Clement, Colin, Duan, Nan, Gao, Jianfeng
Code generation models can benefit data scientists' productivity by automatically generating code from context and text descriptions. An important measure of the modeling progress is whether a model can generate code that can correctly execute to solve …
External link:
http://arxiv.org/abs/2211.09374
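The abstract above measures modeling progress by whether generated code executes correctly. A toy sketch of execution-based evaluation in that spirit: run a candidate program in a fresh namespace and check it against expected outputs (the helper, function name, and test case below are made up for illustration, not the paper's benchmark):

def passes(candidate_src, func_name, test_cases):
    # Execute the generated code and check it against (args, expected) pairs.
    namespace = {}
    try:
        exec(candidate_src, namespace)
        fn = namespace[func_name]
        return all(fn(*args) == expected for args, expected in test_cases)
    except Exception:
        return False  # any exception during execution counts as a failure

generated = "def mean(xs):\n    return sum(xs) / len(xs)"
print(passes(generated, "mean", [(([1, 2, 3],), 2)]))  # True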
Author:
Zlotchevski, Andrei, Drain, Dawn, Svyatkovskiy, Alexey, Clement, Colin, Sundaresan, Neel, Tufano, Michele
Large Transformer models achieved the state-of-the-art status for Natural Language Understanding tasks and are increasingly becoming the baseline model architecture for modeling source code. Transformers are usually pre-trained on large unsupervised …
External link:
http://arxiv.org/abs/2208.13928
Improving software performance is an important yet challenging part of the software development cycle. Today, the majority of performance inefficiencies are identified and patched by performance experts. Recent advancements in deep learning approaches …
External link:
http://arxiv.org/abs/2206.13619
Author:
Moghaddam, Roshanak Zilouchian, Garg, Spandan, Clement, Colin B., Mohylevskyy, Yevhen, Sundaresan, Neel
Continuous evolution in modern software often causes documentation, tutorials, and examples to be out of sync with changing interfaces and frameworks. Relying on outdated documentation and examples can lead programs to fail or be less efficient or even …
External link:
http://arxiv.org/abs/2204.12648
Author:
Kharkar, Anant, Moghaddam, Roshanak Zilouchian, Jin, Matthew, Liu, Xiaoyu, Shi, Xin, Clement, Colin, Sundaresan, Neel
Due to increasingly complex software design and rapid iterative development, code defects and security vulnerabilities are prevalent in modern software. In response, programmers rely on static analysis tools to regularly scan their codebases and find …
External link:
http://arxiv.org/abs/2203.09907
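As a toy illustration of the kind of defect pattern a static analysis tool can flag without running the code (real analyzers, including whatever tools the paper builds on, are far more sophisticated), a short Python sketch using the standard ast module to report mutable default arguments:

import ast

SOURCE = '''
def append_item(item, bucket=[]):
    bucket.append(item)
    return bucket
'''

# Walk the syntax tree and flag mutable literals used as default arguments.
tree = ast.parse(SOURCE)
for node in ast.walk(tree):
    if isinstance(node, ast.FunctionDef):
        for default in node.args.defaults:
            if isinstance(default, (ast.List, ast.Dict, ast.Set)):
                print(f"line {node.lineno}: mutable default argument in '{node.name}'")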
We study the feasibility of a Data Science assistant powered by a sequence-to-sequence transformer by training a new model JuPyT5 on all publicly available Jupyter Notebook GitHub repositories and developing a new metric: Data Science Problems (DSP).
External link:
http://arxiv.org/abs/2201.12901
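JuPyT5 is described above as a sequence-to-sequence transformer trained on Jupyter notebooks; its weights are not assumed to be publicly available here. A minimal sketch of prompting a generic seq2seq code model through the Hugging Face transformers API, with Salesforce/codet5-base as a stand-in checkpoint and a purely illustrative notebook-style prompt:

# pip install transformers torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "Salesforce/codet5-base"   # stand-in; not the paper's JuPyT5 weights

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

# Markdown-style description of the desired next cell, as an assistant prompt.
prompt = "# Compute the mean of the 'price' column of the dataframe df\n"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))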