Google’s open-sourcing move lays the groundwork for products built on the company’s renowned infrastructure.
Google recently open sourced its TensorFlow library for artificial intelligence. But why? There are already a fair number of open source projects available for the various AI tasks one might want to tackle, as well as companies offering services and products in the area. One explanation is altruism, along with some indirect benefit from the advancement of research. That is possible, but an explanation more fitting for a profit-motivated company is that the open-sourcing move was made to build a market for future product releases.
The hardest ingredient to acquire in making a market is the participants, and Google hopes to bring them together by setting standards with its open source platform. Open source will attract more people than offering TensorFlow as a product, in large part because it is free. Since Google is widely recognized, particularly for its software, it should have little trouble reaching critical mass, and from there the sheer amount of usage will propel the project forward. As with any open source project with sufficiently rigorous standards for contributions, more users means more potential contributors, as well as a larger community that can be leveraged for support on technical issues. It also means more pairs of eyes to uncover problems in the software and provide fixes. Together, these make for a well-built piece of software that can become a standard.
Setting the standard is great, but what does this mean in practice? Just as Google set the standard for search engines and as a result captures mountains of advertising revenue, it may be trying to do the same for AI. The prediction being made here is that widespread adoption of Google’s new open source software will, in the future, provide the company with a market that will more readily buy its AI-related products. This could come simply from greater recognition leading to greater acceptance of its paid offerings, but better still would be uptake of new Google products that interface with the TensorFlow library, driven by its already widespread use.
This is not too hard to imagine: Google released its software, but it has retained many of the bits that make the code run fast on its hardware. As a Wired article attests, “To be sure, Google isn’t giving away all its secrets. […] it’s not sharing access to the remarkably advanced hardware infrastructure that drives this engine (that would certainly come with a price tag).” Access to that hardware could be offered as a service, for example machines specially built for TensorFlow operations. This would help Google compete with Amazon’s AWS, as such special-purpose infrastructure could be cheaper and more powerful for its use cases than the generic computing provided by current services. As the large companies such as Amazon, Microsoft, and Google build ever more attractive packages for their cloud computing services, their offered computing power will become cheaper and available under increasingly flexible terms (e.g. paying for actual computation done, rather than buying allotted hours on the machines). The next step may well be special-purpose infrastructure that caters better to specific customer needs, in which case Google is poised to make a killing.