Even though it wasn’t the first time Apple and IBM had announced a partnership in the enterprise space, as a long-time tech industry observer, there’s still part of me that finds it surprising to see an Apple executive speak at an IBM event.
Such was the case at last week’s IBM Think conference in Las Vegas, where the two announced that IBM’s Watson Services would be offered as an extension to Apple’s CoreML machine learning software. Essentially, enterprises creating custom mobile applications for iPhones (and iPads) can now get access to IBM’s Watson AI tools in their iOS business applications.
On the surface, it’s easy to say that this is just an extension of some of the work IBM and Apple announced several years back to bring some of IBM’s industry-specific vertical applications to the iPad. In some ways, it is.
But in many other ways, this announcement is arguably more important, and will generate more long-term impact, than whatever new products, software and/or services Apple announces at its education event in Chicago.
The reasons are several. First, the likely focus of much of this work will be the iPhone, which has a larger and more important presence in businesses throughout the world than the iPad. Depending on whom you ask, iPhones hold nearly 50% share of business smartphones in the US, for example, which is several points higher than their share of the overall US smartphone market.
More important than that, however, is the new dynamic between Apple’s machine learning software and the capabilities offered by IBM. At a basic level, you could argue that there may be future battles between Siri and Watson. Given all the difficulties Apple has had with Siri, versus the generally much more positive reaction to Watson, that could prove to be a significant challenge for Apple.
The details of the agreement specify that Watson Services for CoreML will allow applications created for iOS to leverage pre-trained machine learning models/algorithms created with IBM’s tools as a new option. As part of CoreML, Apple already offers machine learning models and capabilities of its own, as well as tools to convert models from popular neural network/machine learning frameworks, such as Caffe and TensorFlow, into CoreML format. The connection with IBM brings a higher level of integration with external machine learning tools than Apple has offered in the past.
Initially, the effort is focused on the visual recognition tools that IBM has made available through its Watson services. Specifically, developers will be able to use Watson Visual Recognition to add computer vision-style capabilities to their existing apps. So, for example, you could point your iPhone’s camera at an object and have the application recognize it and provide details about specific characteristics, such as determining a part number, recognizing whether a piece of fruit is ripe, etc. What’s interesting is that Apple already has a Vision framework for doing similar types of things, but this new agreement essentially lets you swap in the IBM version to leverage its capabilities instead.
IBM also has voice-based recognition tools as part of Watson Services that could theoretically substitute for Apple’s Foundation Natural Language Processing tools that sit at the heart of Siri. That’s how we could end up with some situations of Siri vs. Watson in future commercial business apps. (To be clear, these efforts are only for custom business applications and are not at all a general replacement for Apple’s own services, which will continue to focus on Siri for voice-driven interactions in consumer applications.) The current announcement specifically avoids mentioning voice-based applications, but knowing that ongoing machine learning efforts between Apple and IBM are expected to grow, it’s not too hard to speculate.
If you’re wondering why Apple would agree to create this potential internal software rivalry, the answer is simple: legacy. Despite earlier efforts between the two companies to drive the creation and adoption of custom iOS business applications, the process has moved along slowly, in large part because so much of the software that enterprises already have is in older “legacy” formats that are difficult to port to new environments. By working with IBM more closely, Apple is counting on making the move from these older applications and data sets to newer AI-style machine learning apps significantly easier.
Another interesting aspect of the new Apple IBM announcement is the IBM Cloud Developer Console for Apple, a simple, web-based interface that lets Apple developers start experimenting with Watson services and other cloud-based services offered by IBM. Using these tools, for example, you can build and train your own models in Watson, and even create an ongoing training loop that lets the on-phone models get smarter over time. In fact, what’s unique about the arrangement is that it lets companies bridge between Apple’s privacy-focused policy of on-device inferencing (meaning incoming data is processed on the phone without being sent to the cloud) and IBM’s focus on enterprise data security in the cloud.
Another potentially interesting side note is that, because IBM just announced a deal with Nvidia to extend the amount of GPU-driven AI training and inferencing IBM does in its cloud, we could see future iOS business apps benefiting directly from Nvidia chips, as those apps connect to IBM’s Nvidia GPU-equipped cloud servers.
More than anything, what the news highlights is that in the evolution of more sophisticated tools for enterprise applications, it’s going to take many different partners to create successful mobile business applications. Gone are the days of individual companies being able to do everything on their own. Even companies as large as Apple and IBM need to leverage various skill sets, work through the legacy of existing business applications, and provide access to programming and other support services from multiple partners in order to really succeed in business—even if it does make for some friendly competition.
Bob O’Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting and market research firm. You can follow him on Twitter @bobodtech. This article was originally published on Tech.pinions.