The next big thing can often seem like a thing too big. When it comes to adoption of new technologies such as big data, machine learning and AI, it's easy for the enterprise to become daunted.
A common first step is to engage external experts with their shiny new toys to package up a solution.
However, don't underestimate the value of the knowledge stored within your existing teams and trusted partners. They are the ones who understand the problem you're trying to solve, what the end solution should look like, the data you already own, and how to pull the best insights out of your data haystack.
Data and knowledge are the intellectual property, not the models and tools. "In fact, Google and others give away the models, but they keep the data."
Today, as more enterprises pursue machine and deep learning explorations, hype rages around new AI tools and the wizardry of data scientists. But both have potential (and related) shortcomings that can damage enterprise AI strategies, particularly in the early stages. How organizations should go about getting into AI was a core topic of discussion for a panel of AI practitioners at Tabor Communication's recent Advanced Scale Forum. Over the course of the conversation, a divide emerged: on one side are new AI tools and data scientists with expertise in building AI models; on the other is the data an organization already possesses, combined with in-house staff knowledge of that data and of the organization's business. The former's bright-shiny-new-object value can be inflated; the latter's value can be overlooked.