Artificial Neural Networks (ANNs) have become one of the most popular Machine Learning model architectures; www.marketsandmarkets.com projects a 20.5% Compound Annual Growth Rate (CAGR) for the market over the coming years.
Factors contributing to ANN growth include:
Big Data - ANNs can train on very large input data sets, and the amount of data available is growing rapidly
Research and Development - the number of published ANN research papers is growing rapidly, as are new applications
Model Variations - there are a large number of ANN variations available, including: Convolutional Neural Networks, Deep Learning, Generative Adversarial Networks, Long Short-Term Memory, Recurrent Neural Networks, Reinforcement Learning, Transformer Neural Networks
Flexibility - ANNs can be configured through a number of modeling-process hyperparameters, including the data features, network graph architecture, training learning rate, and model training time, to control factors such as the Bias-Variance Tradeoff
Hardware Processing Acceleration - technologies such as Graphics Processing Units, Tensor Processing Units, and other AI Accelerators can be used to reduce model training times
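The hyperparameter flexibility described above can be illustrated with a short, hypothetical sketch; this example uses scikit-learn's MLPClassifier and synthetic data (both our assumptions, not from the source), mapping each keyword argument to one of the hyperparameter categories listed.

```python
# Minimal sketch (assumed libraries: scikit-learn) showing how ANN
# hyperparameters map onto configurable knobs of a simple model.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for a real training data set.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = MLPClassifier(
    hidden_layer_sizes=(32, 16),  # network graph architecture
    learning_rate_init=0.001,     # training learning rate
    max_iter=300,                 # cap on training time (iterations)
    random_state=0,
)
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```

Adjusting the architecture, learning rate, or iteration cap trades training cost against fit quality, which is one practical way the Bias-Variance Tradeoff is managed in an ANN.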
In effect, ANNs have become a virtual ‘Swiss Army Knife’ of Machine Learning.