Get Flowing with TensorFlow
Updated April 29, 2021

Anonymous | TrustRadius Reviewer
Score 5 out of 10
Vetted Review
Verified User

Overall Satisfaction with TensorFlow

TensorFlow is used as a development platform for deep learning algorithms, in particular for:
1. Recommendations: selecting the best templates to recommend to users via email across the many countries the company operates in, with over 100 languages supported,
2. User feedback classification: when users provide feedback, natural language processing algorithms implemented in TensorFlow and Keras are used to classify issues so that stakeholders can identify the major issues with a product/product release (a minimal sketch of this kind of classifier follows this list),
3. Learning-to-rank for search: there is some development on improving search results by switching from a gradient boosting algorithm to deep learning, and TensorFlow provides that capability, and
4. Computer vision: some experimentation performed on object detection and image classification.
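
As a rough illustration of the feedback-classification use case (item 2), here is a minimal Keras sketch. The vocabulary size, layer widths, and number of issue categories are made-up placeholders rather than our production configuration, and the inputs are assumed to be already tokenized into integer IDs.

```python
import tensorflow as tf

# Hypothetical sizes -- placeholders, not the production configuration.
VOCAB_SIZE = 20_000        # tokens in the feedback vocabulary
NUM_ISSUE_CATEGORIES = 8   # issue buckets stakeholders care about

model = tf.keras.Sequential([
    # Inputs: already-tokenized integer ID sequences, shape [batch, seq_len].
    tf.keras.layers.Embedding(VOCAB_SIZE, 64),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_ISSUE_CATEGORIES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# token_ids: int tensor [batch, seq_len]; issue_labels: int tensor [batch]
# model.fit(token_ids, issue_labels, epochs=5)
```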
  • TensorFlow is fairly easy to use, with adequate tutorials to get any user started quickly.
  • Tooling around TensorFlow, such as TensorBoard, is a gold standard: it has made the training and debugging process so much easier compared to most other deep learning platforms (a short sketch of wiring TensorBoard into a training run appears after these bullet points).
  • Community support for TensorFlow is very good. If there is a problem, there usually is an answer by just a little Googling. Also the documentation for TensorFlow is often top notch.
  • Prior to TensorFlow 2.0, setting up data ingestion for TensorFlow could be a huge pain, so much so that TensorFlow Lite and alternatives such as Keras made it more palatable. Things are changing with TensorFlow 2.0, though.
  • Some error messages from TensorFlow can be quite difficult to understand. For instance, a recent error using the dot product layer in TensorFlow 2.0 made it seem like there was a problem with data ingestion, but downgrading to TensorFlow 1.14.0 made the problem disappear.
  • Tooling with Bazel (our choice of build tool) in our monorepo is a bit of a nightmare, partly because Bazel has poor Python support. We were able to integrate PyTorch with Bazel easily, but not TensorFlow.
  • Would love to have better bindings for the JVM rather than just Python; many companies have a JVM-based stack, and better JVM bindings would make integration much easier.
  • TensorFlow has helped to improve recommendations and search at Canva, providing millions of users with better search results and recommendations compared to non-deep-learning approaches. This has helped increase our activation and monthly active user count.
  • TensorFlow has helped us sort a variety of user feedback using deep learning based text classification, giving product designers the feedback to understand where the pain points of the product are. This has contributed to improving our products.
  • It is now being used to help with user segmentation and with predicting users at risk of churn. Eventually, this should help improve our revenue and reduce churn rates.
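
To make the TensorBoard and data-ingestion points above concrete, here is a minimal TF 2.x sketch of how the two fit together; the in-memory data, model, and log directory are placeholder values rather than anything from our actual pipelines.

```python
import tensorflow as tf

# Placeholder in-memory data standing in for a real feature pipeline.
features = tf.random.normal((1000, 32))
labels = tf.random.uniform((1000,), maxval=2, dtype=tf.int32)

# tf.data makes ingestion far less painful than the older queue-based APIs.
dataset = (tf.data.Dataset.from_tensor_slices((features, labels))
           .shuffle(1000)
           .batch(64)
           .prefetch(tf.data.AUTOTUNE))

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# A single callback sends loss/metric curves and the graph to TensorBoard;
# inspect them with: tensorboard --logdir logs/demo
tb_callback = tf.keras.callbacks.TensorBoard(log_dir="logs/demo")
model.fit(dataset, epochs=3, callbacks=[tb_callback])
```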
Can't seem to choose any deep learning platforms in the field above, so I'll list them here:
1. Apache MXNet: this has been used for one of our main algorithms for search as an end-to-end pipeline. We chose it because of the Scala bindings, which make it easier to integrate with our JVM backend. MXNet seems comparable to TensorFlow, although community support is not as good, and there are issues with memory leaks that are being worked on. TensorFlow is generally easier to use, but MXNet isn't too far behind.
2. Keras: still a favorite. Often I use this when paired with TensorFlow. TensorFlow 2.0 will make it even easier.
3. PyTorch: only used it a little, so it's hard to provide a good opinion.
4. DL4J: used it initially in an early-days project because it has good JVM support. It is harder to use, not because of poor API design, but because community support is lacking and features don't arrive as fast as they do for TensorFlow.
Community support for TensorFlow is great. There's a huge community that truly loves the platform and there are many examples of development in TensorFlow. Often, when a good new technique is published, there will be a TensorFlow implementation not long after. This makes it quick to apply the latest techniques from academia straight to production-grade systems. Tooling around TensorFlow is also good. TensorBoard has been such a useful tool that I can't imagine how hard it would be to debug a deep neural network gone wrong without it.

Do you think TensorFlow delivers good value for the price?

Yes

Are you happy with TensorFlow's feature set?

Yes

Did TensorFlow live up to sales and marketing promises?

Yes

Did implementation of TensorFlow go as expected?

Yes

Would you buy TensorFlow again?

Yes

TensorFlow is great for most deep learning purposes. This is especially true in two domains:
1. Computer vision: image classification, object detection, and image generation via generative adversarial networks, and
2. Natural language processing: text classification and generation.

The good community support often means that a lot of off-the-shelf models can be used to prove a concept or test an idea quickly; the sketch below shows how little code it takes to load a pretrained classifier. That, and Google's promotion of Colab, means that ideas can be shared quite freely. Training, visualizing, and debugging models is very easy in TensorFlow compared to other platforms (especially compared to the good old Caffe days).
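
As an illustration of the off-the-shelf point, a pretrained image classifier from tf.keras.applications can be pulled down and queried in a handful of lines, in Colab or locally; the choice of MobileNetV2 and the random placeholder image are just for demonstration.

```python
import tensorflow as tf

# Any tf.keras.applications model would do; MobileNetV2 is small and downloads quickly.
model = tf.keras.applications.MobileNetV2(weights="imagenet")

# Placeholder input: one random 224x224 RGB image in the [0, 255] range.
image = tf.random.uniform((1, 224, 224, 3), maxval=255.0)
image = tf.keras.applications.mobilenet_v2.preprocess_input(image)

preds = model(image)
print(tf.keras.applications.mobilenet_v2.decode_predictions(preds.numpy(), top=3))
```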

In terms of productionizing, it's a bit of a mixed bag. In our case, most of our feature building is performed via Apache Spark, which means converting Parquet (columnar-optimized) files into a TensorFlow-friendly format, i.e., protobuf-based TFRecords; a rough sketch of that conversion is below. The lack of good JVM bindings means that our projects end up being a mix of Python and Scala, which makes it hard to reuse some of the tooling and support we wrote in Scala. This is where MXNet shines (though its Scala API could do with more work).
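
A minimal sketch of that Parquet-to-TFRecord step, using pandas and tf.train.Example. The file paths and column names ("features", "label") are hypothetical; at Spark scale the conversion would be distributed rather than a single-machine loop like this.

```python
import pandas as pd
import tensorflow as tf

# Hypothetical paths/columns -- the real feature files come out of Spark.
df = pd.read_parquet("features.parquet")

def to_example(row):
    # Pack one row (dense float features + integer label) into a tf.train.Example proto.
    return tf.train.Example(features=tf.train.Features(feature={
        "features": tf.train.Feature(
            float_list=tf.train.FloatList(value=list(row["features"]))),
        "label": tf.train.Feature(
            int64_list=tf.train.Int64List(value=[int(row["label"])])),
    }))

with tf.io.TFRecordWriter("features.tfrecord") as writer:
    for _, row in df.iterrows():
        writer.write(to_example(row).SerializeToString())
```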