Keras, the right entry to the Deep Learning world
Use Cases and Deployment Scope
Keras is being used together with TensorFlow in our new Data & AIML department, where many business use cases are implemented on top of it. We use Keras as the high-level API responsible for model design, model training, model evaluation, and model inference/prediction. It is part of our final Model As Service product, deployed in our production environment, allowing various business applications to consume its prediction output and make important decisions as early as possible.
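The lifecycle described above (design, train, evaluate, predict) can be sketched in a few lines of Keras; the layer sizes and the synthetic data below are illustrative placeholders, not our production configuration.

```python
import numpy as np
import tensorflow as tf

# Design: a small binary classifier built from standard layer blocks.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Train and evaluate on synthetic data.
x = np.random.rand(128, 8).astype("float32")
y = (x.sum(axis=1) > 4).astype("float32")
model.fit(x, y, epochs=2, batch_size=32, verbose=0)
loss, acc = model.evaluate(x, y, verbose=0)

# Inference: the prediction output consumed by downstream applications.
probs = model.predict(x[:4], verbose=0)
```

The appeal for us is that each of the four stages is one short, readable call, which is what makes standard models so quick to stand up.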
Pros
- As a high-level API it is clean and neat, allowing data scientists to develop standard deep learning models very quickly, using mature, existing algorithm building blocks
- It integrates seamlessly with a couple of deep learning frameworks; we only use TensorFlow, and since TF 2.0 Keras has become its first-class native API
Cons
- The Keras model itself is not thread-safe, and the documentation on how multithreading works at prediction time is neither clear nor sufficient. We still need to fall back on the old TensorFlow (session + graph) approach to support our Model As Service in high-concurrency scenarios under a limited memory constraint.
- Some APIs and default implementations still have a lot of room for improvement. For example, the checkpoint callback only saves the "best" model and natively cannot save the other metadata of that "best" model; we currently have to extend it ourselves to meet our project requirements.
- For some advanced topics, such as distributed training, the documentation is not clear and real code examples are very hard to find.
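On the thread-safety point: one common workaround is to serialize access to the model behind a lock. The sketch below is framework-agnostic, and `fake_predict` is a hypothetical stand-in for a real (non-thread-safe) `model.predict`; it illustrates the pattern, not our exact production code.

```python
import threading
from concurrent.futures import ThreadPoolExecutor

class ThreadSafePredictor:
    """Wraps a non-thread-safe predict callable so that only one
    request is inside the model at a time."""

    def __init__(self, predict_fn):
        self._predict_fn = predict_fn
        self._lock = threading.Lock()

    def predict(self, x):
        with self._lock:  # serialize concurrent callers
            return self._predict_fn(x)

def fake_predict(x):
    # Hypothetical stand-in for model.predict: returns a trivial "score".
    return sum(x) / len(x)

predictor = ThreadSafePredictor(fake_predict)
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(predictor.predict,
                            [[i, i + 1] for i in range(100)]))
```

The lock trades throughput for safety; the obvious alternative, one model copy per worker thread, is exactly what a tight memory budget rules out, which is why the session + graph route ends up being necessary.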
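The checkpoint extension mentioned above boils down to a callback that tracks the best monitored value and records all the associated metadata alongside it. The class below mimics the `on_epoch_end` hook of a Keras callback in plain Python (no TensorFlow dependency) to show the idea; the class name and field layout are our own illustration, not a Keras API.

```python
import json

class BestModelMetadataTracker:
    """Like ModelCheckpoint with save_best_only=True, but besides noting
    the best monitored value it keeps ALL logs of the best epoch so they
    can be persisted next to the weights."""

    def __init__(self, monitor="val_loss"):
        self.monitor = monitor
        self.best_value = float("inf")  # assumes lower-is-better metric
        self.best_metadata = None

    def on_epoch_end(self, epoch, logs):
        value = logs[self.monitor]
        if value < self.best_value:
            self.best_value = value
            # Keep everything, not just the monitored metric.
            self.best_metadata = {"epoch": epoch, **logs}

tracker = BestModelMetadataTracker()
tracker.on_epoch_end(0, {"val_loss": 0.9, "val_accuracy": 0.60})
tracker.on_epoch_end(1, {"val_loss": 0.4, "val_accuracy": 0.82})
tracker.on_epoch_end(2, {"val_loss": 0.7, "val_accuracy": 0.75})

# tracker.best_metadata now describes epoch 1, the "best" one,
# ready to be written out next to the checkpoint file.
metadata_json = json.dumps(tracker.best_metadata)
```

In a real project the same logic would live in a `tf.keras.callbacks.Callback` subclass, writing the JSON next to the checkpoint file that `ModelCheckpoint` produces.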
Likelihood to Recommend
Keras is a near-perfect fit if the aim is to build a standard deep learning model and materialize it to serve a real business use case. It is less suitable if the purpose is research, where a lot of non-standard experimentation and customization is required; in that case, go directly to the low-level TensorFlow API or to PyTorch.
