Caffe, just good for your first taste
January 17, 2019


Anonymous | TrustRadius Reviewer
Score 4 out of 10
Vetted Review
Verified User

Overall Satisfaction with Caffe Deep Learning Framework

We chose Caffe only for experimental purposes, to try out some DL frameworks: it is one of the earliest DL frameworks and is dedicated to vision, image recognition, and classification. We wanted to see how it might be used in our commercial invoice image auto-classification use case. As soon as TensorFlow was introduced, we stopped using Caffe.
  • Caffe is good for traditional image-based CNNs, as this was its original purpose.
  • Caffe's model definition relies on static configuration files, which are really painful. Maintaining big configuration files with so many parameters and per-layer details can be a real challenge (see the sketch after this list).
  • Besides image and vision tasks (CNNs), Caffe has gradually added support for some other NN architectures, but it doesn't play well in the recurrent domain, so variety is a problem.
  • Deploying Caffe to production is not easy. Judging by its community support and project development, it is almost fading out of the market.
  • The learning curve is quite steep. Although TensorFlow is not easy to master either, the reward for learning Caffe is much smaller than what TensorFlow can offer.
  • Since we stopped using Caffe before it could reach the production phase, there is no clear ROI we can define.
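To illustrate the point about static configuration files, here is a rough sketch of what a single convolution layer plus its activation looks like in a Caffe prototxt file (the layer names and parameter values are purely illustrative, not taken from our project). A realistic network repeats blocks like this for every layer, which is why the files grow so large and hard to maintain.

  # one convolution layer: every hyperparameter is spelled out by hand
  layer {
    name: "conv1"
    type: "Convolution"
    bottom: "data"
    top: "conv1"
    convolution_param {
      num_output: 64
      kernel_size: 3
      stride: 1
      weight_filler { type: "xavier" }
    }
  }
  # even the activation is a separate, explicitly wired layer
  layer {
    name: "relu1"
    type: "ReLU"
    bottom: "conv1"
    top: "conv1"
  }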
TensorFlow is a fairly low-level API best suited to developers who like to control the details, while Keras provides a high-level API for users who want to boost their project or experiment by reusing existing architectures, models, and accumulated best practices. Caffe is like neither of them, which leaves it in an awkward position for the user.
Caffe is only appropriate for beginners who don't want to write any code and just want to use existing models for image recognition, or to get a taste of so-called Deep Learning.