Apologies for cross-posting.
Neural Networks Special Issue: Neural Network Learning in Big Data
Big data is much more than the storage of and access to data. Analytics plays an important role in making sense of that data and exploiting its value, but learning from big data has become a significant challenge and requires the development of new types of algorithms. Most machine learning algorithms encounter theoretical difficulties in scaling up to big data, and all types of machine learning algorithms face the additional challenges of high dimensionality, velocity, and variety. The neural network field has historically focused on algorithms that learn in an online, incremental mode without requiring in-memory access to huge amounts of data. The brain, arguably the best and most elegant big data processor, is the inspiration for neural network learning methods. Neural network learning is not only well suited to streaming data (as in the Industrial Internet or the Internet of Things), but can also be applied to stored big data. For stored big data, neural network algorithms can learn from all of the data rather than from samples of it; the same holds for streaming data, where not all of the data is actually stored. In general, online, incremental learning algorithms are less sensitive to the size of the data. Neural network algorithms, in particular, can take advantage of massively parallel (brain-like) computation using very simple processors in a way that other machine learning technologies cannot. Specialized neuromorphic hardware, originally intended for large-scale brain simulations, is becoming available to implement these algorithms in a massively parallel fashion. Neural network algorithms can therefore deliver very fast and efficient real-time learning in hardware, which could be particularly useful for streaming data in the Industrial Internet. Neural network technologies can thus become significant components of big data analytics platforms, and this special issue will begin that journey.
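As a concrete illustration of the online, incremental style of learning described above, the following minimal sketch updates a linear model one example at a time with stochastic gradient descent, so the full dataset never has to be held in memory. The synthetic stream, learning rate, and model are illustrative placeholders only, not part of any specific method solicited here.

    # Minimal sketch of online, incremental learning: a linear model
    # updated one example at a time, as a data stream arrives.
    import numpy as np

    def stream(n_samples=10_000, n_features=8, seed=0):
        """Simulate a data stream; in practice this could be a sensor feed."""
        rng = np.random.default_rng(seed)
        true_w = rng.normal(size=n_features)
        for _ in range(n_samples):
            x = rng.normal(size=n_features)
            y = x @ true_w + 0.1 * rng.normal()
            yield x, y

    def online_sgd(data, n_features=8, lr=0.01):
        """Incrementally fit weights from a stream of (x, y) pairs."""
        w = np.zeros(n_features)
        for x, y in data:
            error = x @ w - y      # prediction error on the current example
            w -= lr * error * x    # gradient step using only this example
        return w

    if __name__ == "__main__":
        w = online_sgd(stream())
        print("learned weights:", np.round(w, 3))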
For this special issue of Neural Networks, we invite papers that address the challenges of learning from big data. In particular, we are interested in papers on efficient and innovative algorithmic approaches to analyzing big data (e.g., deep networks, nature-inspired and brain-inspired algorithms), implementations on different computing platforms (e.g., neuromorphic hardware, GPUs, clouds, clusters), and applications of online learning to real-world big data problems (e.g., health care, transportation, and electric power and energy management).
RECOMMENDED TOPICS:
Topics of interest include, but are not limited to:
1. Autonomous, online, incremental learning – theory, algorithms, and applications in big data
2. High-dimensional data, feature selection, feature transformation – theory, algorithms, and applications for big data
3. Scalable neural network algorithms for big data
4. Neural network learning algorithms for high-velocity streaming data
5. Deep neural network learning
6. Neuromorphic hardware for scalable neural network learning
7. Big data analytics using neural networks in healthcare/medical applications
8. Big data analytics using neural networks in electric power and energy systems
9. Big data analytics using neural networks in large sensor networks
10. Big data and neural network learning in computational biology and bioinformatics
SUBMISSION PROCEDURE:
Prospective authors should visit http://ees.elsevier.com/neunet/ for information on paper submission. During the submission process, there will be steps to designate the submission to this special issue. However, please also indicate on the first page of the manuscript that it is intended for the Special Issue: Neural Network Learning in Big Data. Manuscripts will be peer reviewed according to Neural Networks guidelines.
Manuscript submission due: December 15, 2014
First review completed: March 1, 2015
Revised manuscript due: April 1, 2015
Second review completed, final decisions to authors: April 15, 2015
Final manuscript due: April 30, 2015
GUEST EDITORS:
Asim Roy, Arizona State University, USA (asim.roy@asu.edu) (lead guest editor)
Kumar Venayagamoorthy, Clemson University, USA (gkumar@ieee.org)
Nikola Kasabov, Auckland University of Technology, New Zealand (nkasabov@aut.ac.nz)
Irwin King, Chinese University of Hong Kong, China (irwinking@gmail.com)