Dynamic Embeddings, Batch Normalisation and Dropout added to ELDR AI
We've been busy adding extra features to ELDR AI ahead of the v1.0 release at the end of January. We have successfully integrated dynamic Embeddings, Batch Normalisation and Dropout into the relevant layers throughout the AI Engine, making it even more adaptable, robust and powerful.
Embeddings are vector representations of single values. In the case of Artificial Neural Networks (ANNs), they allow categorical data (e.g. user ID, product ID, group ID) to be used alongside ordinary numerical data, and they also allow text/words to be fed into an ANN.
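To make this concrete, here is a minimal sketch in plain Python of what an embedding layer boils down to: a learnable lookup table with one small vector per category. The names and sizes are illustrative, not ELDR AI's actual code.

```python
import random

EMBEDDING_DIM = 4  # illustrative; real models often use tens or hundreds of dimensions

def make_embedding_table(num_categories, dim, seed=0):
    rng = random.Random(seed)
    # One randomly initialised vector per category ID; in a real network
    # these values are updated by backpropagation during training.
    return [[rng.uniform(-0.1, 0.1) for _ in range(dim)]
            for _ in range(num_categories)]

def embed(table, category_id):
    # The embedding "layer" is simply an index into the table.
    return table[category_id]

user_embeddings = make_embedding_table(num_categories=100, dim=EMBEDDING_DIM)
vector = embed(user_embeddings, 42)  # dense vector standing in for user 42
```

The returned vector can then be concatenated with the ordinary numerical inputs and passed through the rest of the network.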
Embeddings also have an interesting by-product: during the learning process, they naturally position similar categories close together in the vector space, meaning they can be used to power recommendations.
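A hedged sketch of how that works in practice: once embeddings are trained, "similar" categories can be found by comparing their vectors, for example with cosine similarity. The embedding values below are made up for illustration; they are not taken from a trained ELDR AI model.

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means identical
    # direction, values near -1.0 mean opposite.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "trained" embeddings: products 0 and 1 ended up close together,
# product 2 ended up far away.
product_embeddings = {
    0: [0.9, 0.1, 0.0],
    1: [0.8, 0.2, 0.1],
    2: [-0.7, 0.6, -0.3],
}

def recommend(product_id, embeddings, top_n=1):
    # Rank every other product by similarity to the query product.
    scores = [(other, cosine_similarity(embeddings[product_id], vec))
              for other, vec in embeddings.items() if other != product_id]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return [other for other, _ in scores[:top_n]]

print(recommend(0, product_embeddings))  # → [1]
```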
During testing, we noticed that even though we fed the ANN well-normalised data at the start, values drifted further and further from a stable range as they passed through the network (weights were becoming highly variable). Although the ANN eventually learned, this variance made training harder and longer, and therefore inefficient in some cases.
Batch Normalisation at every layer keeps values in check at each node, preventing this wide variance and ultimately speeding up learning.
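The core of the technique can be sketched in a few lines of plain Python (a simplified, single-feature version, not ELDR AI's implementation): normalise the values in a batch to zero mean and unit variance, then apply a learnable scale (gamma) and shift (beta).

```python
import math

def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalise one feature across the batch: subtract the batch mean,
    # divide by the batch standard deviation (eps avoids division by
    # zero), then scale by gamma and shift by beta. In a real network
    # gamma and beta are learned per feature during training.
    mean = sum(batch) / len(batch)
    var = sum((x - mean) ** 2 for x in batch) / len(batch)
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in batch]

activations = [2.0, 120.0, -40.0, 64.0]  # widely spread pre-activations
normalised = batch_norm(activations)     # now centred around 0 with unit spread
```

However widely the raw activations spread, the normalised values stay in a consistent range layer after layer, which is exactly what keeps training stable.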
Dropout involves randomly switching off nodes/neurons during learning. Because the network cannot rely on any single neuron always being present, it is forced to learn more redundant, robust representations, which reduces overfitting.
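As a final sketch, here is the common "inverted dropout" formulation in plain Python (again an illustration of the general technique, not ELDR AI's code): each activation is zeroed with probability p during training, and the survivors are scaled up so the expected output stays the same; at inference time values pass through unchanged.

```python
import random

def dropout(values, p=0.5, training=True, seed=None):
    # During training, zero each value with probability p and scale the
    # survivors by 1/(1-p) so the expected activation is unchanged.
    # At inference time (training=False), pass values through untouched.
    if not training or p == 0.0:
        return list(values)
    rng = random.Random(seed)
    keep = 1.0 - p
    return [v / keep if rng.random() < keep else 0.0 for v in values]

activations = [1.0, 2.0, 3.0, 4.0]
trained = dropout(activations, p=0.5, seed=0)   # some values zeroed, rest doubled
inference = dropout(activations, training=False)  # unchanged
```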