Reliable data storage and access is a major challenge in AI applications, and IBM ESS is well positioned to solve it. If you are building an AI service and already use the IBM ecosystem for your AI compute requirements, ESS can serve as an excellent software-based storage solution. Integration with other platforms such as EMR/Databricks is still a challenge and should be improved.
Well suited for mainframe CKD-based applications. There is unmatched synergy with IBM zSeries compared to the competition, for example zHPF, SuperPAV, zHyperLink, cache management, and replication technologies. For distributed environments, the fact that Copy Services Manager comes with the box is a great move on IBM's part, providing replication capability regardless of the application. I think the DS8k is less suited for smaller distributed environments, where it overlaps with the V9000.
Remote Copy -- The DS8k has decades of mature, proven code built to handle the most demanding mainframe and open systems environments, and it can be combined with automation solutions like GDPS or Copy Services Manager
Reliability and Availability -- This is the platform that gets used for the most demanding environments, even when FICON attachment isn't required
Continued enhancement -- IBM continues to invest resources in the DS8k, and the platform has advanced alongside IBM Z and Power technology
Mainframe synergy -- Poughkeepsie communicates and works very closely with the DS8k developers
IBM ESS is optimized for AI and Big Data use cases, while S3 is a general-purpose storage solution. EMR and Databricks offer lakehouse/data warehousing solutions for distributed computing, but they are optimized mainly for big data pipelines rather than AI use cases, especially inference, where you need to load model artifacts very quickly.