Showing 1 - 10 of 49 for search: '"SALEM, FATHI M."'
LSTM, or Long Short-Term Memory, networks are a specific type of Recurrent Neural Network (RNN) that is very effective in dealing with long sequence data and learning long-term dependencies. In this work, we perform sentiment analysis on a GOP Debate Tw…
External link:
http://arxiv.org/abs/2005.03993
Author:
Akandeh, Atra, Salem, Fathi M.
We have shown previously that our parameter-reduced variants of Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNN) are comparable in performance to the standard LSTM RNN on the MNIST dataset. In this study, we show that this is also the ca…
External link:
http://arxiv.org/abs/1901.06401
Author:
Kent, Daniel, Salem, Fathi M.
The Long Short-Term Memory (LSTM) layer is an important advancement in the field of neural networks and machine learning, allowing for effective training and impressive inference performance. LSTM-based neural networks have been successfully employed…
External link:
http://arxiv.org/abs/1901.00525
Author:
Salem, Fathi M.
Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNNs) rely on gating signals, each driven by a function of a weighted sum of at least 3 components: (i) one of an adaptive weight matrix multiplied by the incoming external input vector sequenc…
External link:
http://arxiv.org/abs/1812.11391
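The gating-signal structure described in the abstract above can be illustrated with a minimal NumPy sketch (dimensions and variable names are illustrative, not the paper's notation): a gate is a squashed weighted sum of an input term, a recurrent term, and a bias.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n_in, n_hid = 4, 3

# The three typical components of an LSTM gating signal:
W = rng.standard_normal((n_hid, n_in))   # (i) adaptive weight matrix applied to the external input
U = rng.standard_normal((n_hid, n_hid))  # (ii) adaptive matrix applied to the previous hidden state
b = np.zeros(n_hid)                      # (iii) adaptive bias

x_t = rng.standard_normal(n_in)          # incoming external input at time t
h_prev = np.zeros(n_hid)                 # previous hidden/activation state

# Gate = sigmoid of the weighted sum of the three components; values lie in (0, 1)
gate = sigmoid(W @ x_t + U @ h_prev + b)
print(gate.shape)  # (3,)
```

The parameter-reduced variants studied in this line of work drop one or more of these three components from the sum; the exact variants are specified in the paper.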
Author:
Akandeh, Atra, Salem, Fathi M.
This is part II of a three-part work. Here, we present a second set of five inter-related variants of simplified Long Short-Term Memory (LSTM) recurrent neural networks, obtained by further reducing adaptive parameters. Two of these models have been introduced i…
External link:
http://arxiv.org/abs/1707.04623
Author:
Dey, Rahul, Salem, Fathi M.
The paper evaluates three variants of the Gated Recurrent Unit (GRU) in recurrent neural networks (RNN) by reducing parameters in the update and reset gates. We evaluate the three variant GRU models on MNIST and IMDB datasets and show that these GRU-…
External link:
http://arxiv.org/abs/1701.05923
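For reference, a standard GRU cell computes the update and reset gates that the abstract above refers to from both the input and the previous state; the parameter-reduced variants drop terms from these gate equations (the exact variants are defined in the paper; this NumPy sketch shows only the standard cell):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x, h, params):
    """One step of a standard GRU cell."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde              # gated interpolation

rng = np.random.default_rng(1)
n_in, n_hid = 4, 3
# Illustrative random parameters: input matrices, recurrent matrices, biases
params = [rng.standard_normal((n_hid, n_in)) if i % 3 == 0
          else rng.standard_normal((n_hid, n_hid)) if i % 3 == 1
          else np.zeros(n_hid)
          for i in range(9)]

h = np.zeros(n_hid)
for _ in range(5):                                  # run a short random input sequence
    h = gru_step(rng.standard_normal(n_in), h, params)
print(h.shape)  # (3,)
```

Removing, say, the input term `Wz @ x` or the bias from a gate cuts the adaptive parameter count while preserving the gating mechanism, which is the kind of reduction the paper evaluates.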
Author:
Heck, Joel, Salem, Fathi M.
Recurrent neural networks with various types of hidden units have been used to solve a diverse range of problems involving sequence data. Two of the most recent proposals, gated recurrent units (GRU) and minimal gated units (MGU), have shown comparab…
External link:
http://arxiv.org/abs/1701.03452
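The minimal gated unit (MGU) mentioned in the abstract above collapses the GRU's two gates into a single forget gate. A NumPy sketch of one MGU step, assuming the commonly cited MGU equations (my own illustrative rendering, not the authors' code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mgu_step(x, h, Wf, Uf, bf, Wh, Uh, bh):
    """One step of a minimal gated unit: a single forget gate f
    plays the roles of both the GRU update and reset gates."""
    f = sigmoid(Wf @ x + Uf @ h + bf)               # forget gate
    h_tilde = np.tanh(Wh @ x + Uh @ (f * h) + bh)   # candidate state
    return (1.0 - f) * h + f * h_tilde              # gated interpolation

rng = np.random.default_rng(2)
n_in, n_hid = 4, 3
Wf, Wh = rng.standard_normal((2, n_hid, n_in))      # input weight matrices
Uf, Uh = rng.standard_normal((2, n_hid, n_hid))     # recurrent weight matrices
bf = bh = np.zeros(n_hid)                           # biases

h = np.zeros(n_hid)
for _ in range(5):                                  # run a short random input sequence
    h = mgu_step(rng.standard_normal(n_in), h, Wf, Uf, bf, Wh, Uh, bh)
print(h.shape)  # (3,)
```

With one gate instead of two, the MGU needs roughly two thirds of a GRU's adaptive parameters, which is why the two are natural candidates for the comparison the paper performs.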
Author:
Lu, Yuzhen, Salem, Fathi M.
The standard LSTM recurrent neural network, while very powerful in long-range dependency sequence applications, has a highly complex structure and a relatively large number of (adaptive) parameters. In this work, we present an empirical comparison between the standar…
External link:
http://arxiv.org/abs/1701.03441
Author:
Salem, Fathi M.
We present a model of a basic recurrent neural network (or bRNN) that includes a separate linear term with a slightly "stable" fixed matrix to guarantee bounded solutions and fast dynamic response. We formulate a state space viewpoint and adapt the c…
External link:
http://arxiv.org/abs/1612.09022
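The defining idea in the abstract above, a separate linear term with a fixed, slightly "stable" matrix, can be illustrated generically (this is a sketch of the stability mechanism, not the paper's exact bRNN model): a fixed matrix with spectral radius below one keeps the linear part contractive, so the state stays bounded over arbitrarily long sequences.

```python
import numpy as np

rng = np.random.default_rng(3)
n_in, n_hid = 4, 3

A = 0.9 * np.eye(n_hid)                  # fixed "stable" matrix: spectral radius 0.9 < 1
W = rng.standard_normal((n_hid, n_in))   # adaptive input weights (illustrative)
b = np.zeros(n_hid)                      # adaptive bias (illustrative)

h = np.zeros(n_hid)
for t in range(100):                     # long sequence; state never blows up
    x_t = rng.standard_normal(n_in)
    # separate linear stable term A @ h plus a bounded nonlinear adaptive term
    h = A @ h + np.tanh(W @ x_t + b)

# |tanh| < 1, so componentwise |h| < sum_k 0.9**k = 10 regardless of sequence length
print(np.max(np.abs(h)) < 10.0)  # True
```

The geometric-series bound in the final comment is the boundedness guarantee in miniature: the contractive linear term dominates the recursion, so the nonlinear part can never accumulate without limit.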
Author:
Albataineh, Zaid, Salem, Fathi M.
This paper addresses the high dimensionality problem in blind source separation (BSS), where the number of sources is greater than two. Two pairwise iterative schemes are proposed to tackle this high dimensionality problem. The two pairwise schemes r…
External link:
http://arxiv.org/abs/1604.04669