Showing 1 - 10
of 36
for the search: '"Hirofumi Wakaki"'
Published in:
Intelligent Decision Technologies ISBN: 9789811559242
KES-IDT
In generalized ridge (GR) regression, since the GR estimator (GRE) depends on ridge parameters, it is important to select those parameters appropriately. Ridge parameters selected by minimizing the generalized \(C_p\) (\(GC_p\)) criterion can be obta…
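A minimal numerical sketch of a GR estimator in its canonical (SVD) form, where each singular direction of the design matrix gets its own ridge parameter. This is illustrative only: the toy data and variable names are our own, and the \(GC_p\) minimization the abstract refers to is not shown.

```python
import numpy as np

def generalized_ridge(X, y, thetas):
    """Generalized ridge estimator in canonical form: singular
    direction i of X is shrunk by d_i / (d_i**2 + theta_i)."""
    U, d, Vt = np.linalg.svd(X, full_matrices=False)
    shrink = d / (d**2 + thetas)
    return Vt.T @ (shrink * (U.T @ y))

# With all ridge parameters zero, the GRE reduces to ordinary least squares.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)
b_ols = generalized_ridge(X, y, np.zeros(3))   # OLS fit
b_gr = generalized_ridge(X, y, np.full(3, 5.0))  # shrunken fit
```

Positive ridge parameters strictly shrink each canonical coordinate, so the GR fit always has a smaller coefficient norm than the OLS fit.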
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::d6d7f084d0eeafb1c2ce2ff75835f4a0
https://doi.org/10.1007/978-981-15-5925-9_23
Author:
Hirofumi Wakaki, Ya. Fujikoshi
Published in:
Theory of Probability & Its Applications. 62:157-172
Let $\lambda$ be the LR criterion for testing an additional information hypothesis on a subvector of a $p$-variate random vector $x$ and a subvector of a $q$-variate random vector $y$, based on a sample of size $N = n + 1$. Using the fact that the null d…
Author:
Hirofumi Wakaki, Tomoyuki Nakagawa
Published in:
Journal of the Japan Statistical Society. 47:145-165
Author:
Tomoyuki Nakagawa (nakagawa.stat@gmail.com), Hirofumi Wakaki
Published in:
Journal of the Japan Statistical Society. 2017, Vol. 47 Issue 2, p145-165. 21p.
Author:
Hirofumi Wakaki, Yasunori Fujikoshi
Published in:
Teoriya Veroyatnostei i ee Primeneniya. 62:194-211
Author:
Yu Inatsu, Hirofumi Wakaki (d144576@hiroshima-u.ac.jp)
Published in:
Journal of the Japan Statistical Society. 2016, Vol. 46 Issue 1, p51-79. 29p.
Published in:
Hiroshima Math. J. 47, no. 1 (2017), 43-62
In this paper we obtain a higher-order asymptotically unbiased estimator of the expected probability of misclassification (EPMC) of the linear discriminant function when both the dimension and the sample size are large. Moreover, we evaluate the mean sq…
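For background on what is being estimated, here is a sketch of the naive plug-in estimate $\Phi(-D/2)$ of the misclassification probability of the linear discriminant function, where $D$ is the sample Mahalanobis distance between the two groups. This is not the higher-order corrected estimator of the paper; the data and names are illustrative.

```python
import numpy as np
from math import erf, sqrt

def std_normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def plugin_epmc(X1, X2):
    """Naive plug-in estimate Phi(-D/2) of the misclassification
    probability, with D the sample Mahalanobis distance between
    the two training samples under a pooled covariance estimate."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    n1, n2 = len(X1), len(X2)
    # Pooled sample covariance matrix.
    S = ((n1 - 1) * np.cov(X1.T) + (n2 - 1) * np.cov(X2.T)) / (n1 + n2 - 2)
    diff = m1 - m2
    D2 = diff @ np.linalg.solve(S, diff)
    return std_normal_cdf(-sqrt(D2) / 2.0)

rng = np.random.default_rng(3)
X1 = rng.normal(loc=0.0, size=(100, 3))
X2 = rng.normal(loc=1.0, size=(100, 3))
epmc_hat = plugin_epmc(X1, X2)  # roughly Phi(-sqrt(3)/2) for this setup
```

The plug-in estimate is known to be biased in finite samples, especially when the dimension grows with the sample size, which is precisely the regime the abstract addresses.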
Published in:
Scandinavian Journal of Statistics. 41:535-555
In real-data analysis, choosing the best subset of variables in a regression model is an important problem. Akaike's information criterion (AIC) is often used to select variables in many fields. When the sample size is not large, the AIC h…
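A brute-force illustration of AIC-based subset selection in a normal linear regression. This is a sketch on our own toy data; the finite-sample bias correction the abstract is concerned with is not implemented here.

```python
import itertools
import numpy as np

def aic_linear(X, y):
    """AIC of a Gaussian linear model (up to an additive constant):
    n*log(sigma2_ML) + 2*(number of parameters)."""
    n, k = X.shape
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    sigma2 = (resid @ resid) / n  # ML estimate of the error variance
    return n * np.log(sigma2) + 2 * (k + 1)

def best_subset_aic(X, y):
    """Exhaustively score every non-empty column subset by AIC."""
    p = X.shape[1]
    best = None
    for r in range(1, p + 1):
        for cols in itertools.combinations(range(p), r):
            score = aic_linear(X[:, cols], y)
            if best is None or score < best[0]:
                best = (score, cols)
    return best[1]

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + 0.5 * rng.normal(size=200)
chosen = best_subset_aic(X, y)  # the truly active columns 0 and 2 survive
```

Exhaustive search is exponential in the number of candidate variables, which is why criterion accuracy, the paper's concern, matters so much in small samples.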
Published in:
Journal of the Japan Statistical Society. 43:57-78
Principal component analysis (PCA) is one method for reducing the dimension of the explanatory variables, although the principal components are derived using all of the explanatory variables. Several authors have proposed a modified PCA (MPCA), whic…
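For contrast with the MPCA the abstract describes, ordinary principal component regression, with components computed from all explanatory variables, can be sketched as follows; the data and names are illustrative.

```python
import numpy as np

def pcr(X, y, n_components):
    """Principal component regression: regress y on the leading
    principal components of the centered explanatory variables,
    then map the fit back to the original variable scale."""
    Xc = X - X.mean(axis=0)
    # Principal axes via SVD of the centered data matrix.
    U, d, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:n_components].T           # loadings, shape (p, k)
    Z = Xc @ W                        # component scores, shape (n, k)
    gamma = np.linalg.lstsq(Z, y - y.mean(), rcond=None)[0]
    return W @ gamma                  # coefficients on the original scale

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.0, 0.0, 0.0, -1.0, 0.0]) + 0.1 * rng.normal(size=100)
beta_k2 = pcr(X, y, n_components=2)   # reduced-dimension fit
beta_all = pcr(X, y, n_components=5)  # equals OLS on centered data
```

With all components retained, PCR reproduces ordinary least squares; the MPCA idea is to build the components themselves from a selected subset of variables rather than all of them.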
Author:
Hirofumi Wakaki, Hiroaki Shimizu
Published in:
Journal of Multivariate Analysis. 102(6):1080-1089
Let $S$ be a $p \times p$ random matrix having a Wishart distribution $W_p(n, n^{-1}\Sigma)$. For testing a general covariance structure $\Sigma = \Sigma(\xi)$, we consider a class of test statistics $T_h = n\rho_h(S, \Sigma(\hat{\xi}))$, where $\rho_h(\Sigma_1, \Sigma_2) = \sum_{i=1}^{p} h(\lambda_i)$ is a distance measu…