Company Blog

Enabling a Machine Learning Transformation Within Your Organization


As technology grows smarter and more advanced, organizations are looking to change the way they make decisions. Many are turning to machine learning and AI so they can make decisions more quickly and efficiently than ever before. But what does AI really mean? In our experience, AI aims to let companies make better, automatic decisions based on data. We'd propose that AI sits at the top of a four-tiered architecture: AI at the top, built largely on machine learning models; below the ML layer, data, which acts as the foundation for building, training, and deploying any model; and finally, the fundamental compute infrastructure on which everything runs.
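The four tiers above can be sketched as a simple ordered structure. This is purely illustrative; the layer names are paraphrased from the description in this post, not a real API:

```python
# Illustrative sketch of the four-tiered stack described above; layer names
# are paraphrased from the post, not a real product or API.
STACK = [
    "AI",               # top: automatic, data-driven decisions
    "machine learning", # the models the AI layer is built on
    "data",             # foundation for building, training, deploying models
    "infrastructure",   # the compute everything ultimately runs on
]

def layers_below(layer):
    """Return the tiers a given layer depends on, top to bottom."""
    return STACK[STACK.index(layer) + 1:]
```

The ordering captures the dependency: for example, `layers_below("machine learning")` yields the data and infrastructure tiers that any model ultimately rests on.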

[Figure: the four-tiered technical stack, with AI on top of machine learning, data, and infrastructure]

For decades, organizations have journeyed through this stack to drive better business outcomes, from capturing new capabilities and scale with mainframes to leveraging cloud infrastructure today. More recently, teams have begun a machine learning transformation, enabled largely by unprecedented amounts of useful data and relatively low-cost compute infrastructure. And the market is telling us that the development of machine learning models has become central to companies' strategies and business outcomes.

The ML transformation does not come without challenges, however. Roadblocks arise when people, departments, processes, and technologies within the organization are not aligned toward the goal of enabling machine learning. It is crucial that the entire organization, not just a few people or departments, works toward the creation and development of machine learning models. Notably, the ML transformation may expose misalignments not previously seen, due in large part to the relative novelty of these approaches and the sheer scale of the people and systems involved.

Additionally, new processes and strategies must be put in place to create and deploy machine learning models sustainably. The ML journey is just beginning, and what seems "certain" today may seem like folly tomorrow. Many companies are interested in incorporating AI and machine learning into their business, but often start by treating the ML transformation as an internal experiment. There are myriad decisions and paths to take along the way, and that vast array of options points to an important fact: as much as possible, teams should build agnostic, cloud-ready, scalable approaches that incorporate change incrementally; rigid systems that assume "it will be this way forever" are simply a non-starter in the ML transformation journey. And companies must move quickly from "experiment" mode to "impacting our business" mode, which means full-cycle deployment and lifecycle management for these assets.
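One way to make "full-cycle deployment and lifecycle management" concrete is to treat each model as an asset with explicit, recorded stage transitions, so an experiment cannot silently become a production dependency. The following is a minimal sketch under our own assumptions; the stage names and `ModelAsset` class are hypothetical, not a specific product's API:

```python
# Hypothetical sketch of lifecycle management for a model asset: every stage
# transition is explicit and recorded, giving an audit trail from experiment
# to production. Stage names are illustrative assumptions.
from dataclasses import dataclass, field

STAGES = ["experiment", "validated", "deployed", "monitored", "retired"]

@dataclass
class ModelAsset:
    name: str
    stage: str = "experiment"
    history: list = field(default_factory=list)

    def promote(self):
        """Advance the model to the next lifecycle stage, recording the old one."""
        i = STAGES.index(self.stage)
        if i + 1 >= len(STAGES):
            raise ValueError(f"{self.name} is already {self.stage}")
        self.history.append(self.stage)
        self.stage = STAGES[i + 1]

model = ModelAsset("churn_model")
model.promote()  # experiment -> validated
model.promote()  # validated -> deployed
```

The point of the sketch is the discipline, not the code: forcing each promotion through a single, auditable step is what lets a team manage many models at scale rather than tracking them ad hoc.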

The reality of a machine learning transformation is that it takes additional resources, and a new mindset, to make it work. New processes must be put in place, which in turn brings new technologies and applications. As companies strive to build an AI-enabled business, it is smart to consider how best to build, buy, and partner with providers of services, technology, and people.
