AI Transformation with ModelOps

AI Needs to Break Free from “Frozen” Processes

4 Minute Read
By Scott Rose

There is no disputing that artificial intelligence (AI) has had a massive impact on a broad range of human activities, an impact that has been widely publicized.

Accounts like this one from WIRED magazine are impressive. But then frustration creeps in, because I know AI could have an even greater impact – scaled to a wider range of applications – if it were not held back by manual, inefficient processes!

I visualize the current state of affairs as an iceberg, with only a small portion visible above water and a much larger portion hidden underneath.

Beneath the impact that analytical models deliver once deployed at scale lies a huge amount of effort by all the stakeholders in model creation, deployment, monitoring, and governance. First, they must develop truly useful models, then operationalize them, and afterward make sure those models continue to deliver value.

You might think what’s needed is an icebreaker – a way to crash through process inefficiencies. And that’s what many companies opt for initially: they use brute force to get models into production and keep them updated.

But this approach won’t scale. What’s needed is a redesign of the entire process — exactly what the discipline we call ModelOps provides.

Frustration all along the line

Today, a line-of-business manager with a thorny problem first must convince the organization’s analytics team that existing reports and recommendation engines do not adequately address the problem. Then data science experts are engaged to develop, train, and test a new or updated analytical model.

But data scientists are not experts at deploying such models within operational systems and may need to call on DevOps specialists. Even if a data science workbench facilitates deployment, a company’s IT organization still needs to find the resources to integrate the model’s code into its computing environment.
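To make the gap described above concrete, here is a minimal sketch of what “deployment” often means in practice: wrapping a trained model in a small scoring service that IT can run inside its environment. Everything in it – the model.pkl file name, the /score route, the payload shape – is a hypothetical example rather than a prescribed pattern; a production deployment adds authentication, logging, scaling, and monitoring, which is exactly where the back-and-forth between teams begins.

```python
# Hypothetical sketch: exposing a trained model as a scoring endpoint.
# File names, route, and payload shape are assumptions for illustration only.
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# The data scientist hands over a serialized model; IT has to host it somewhere.
with open("model.pkl", "rb") as f:
    model = pickle.load(f)


@app.route("/score", methods=["POST"])
def score():
    # Expect a JSON body like {"features": [1.0, 2.5, 0.3]}
    payload = request.get_json(force=True)
    prediction = model.predict([payload["features"]])
    return jsonify({"prediction": prediction.tolist()})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

Even this toy service raises the questions that consume most of the effort: where does the model artifact live, who retrains and re-ships it, and who notices when its predictions start to drift?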

Setting aside the likelihood that the data science, DevOps, and IT teams each have a backlog of projects, the back-and-forth between all parties is time-consuming and, typically, frustrating for everyone. And who is responsible for monitoring the lifecycle of this and other analytical models? That’s critical – and, in many organizations, pretty much up for grabs.

Here are some situations I often observe:

  • The line-of-business manager has to re-budget and rebalance resources to operationalize the AI model lifecycle and meet C-level commitments to leverage AI for top-line growth and bottom-line efficiencies. What the data scientists developed delivered clear business value, but it was built and trained in a way that the enterprise’s execution environment could not support – nor was it meant to be, initially. As the number of deployed models grows, governing them through spreadsheets or simple repositories just won’t work.
  • The analytics team is also concerned about model governance. If model development is scattered across business units, the organization will find it difficult to audit models’ performance and trace their lineage; models created with different tools or in different languages complicate this task further (see the sketch after this list for the kind of record such an audit trail requires). On the other hand, if governance is centralized, the team may risk lock-in to a single data science platform, which can be costly and cut off access to broader innovation in the market.
  • Data scientists resist using a single tool to develop models. Their expertise leads them to favor certain tools or languages for certain types of problems. They also know that newer languages hold great promise for model creation and want to be able to use them. While some data science workbenches do support model deployment, they do not support operationalizing the full lifecycle of the model at enterprise scale.
  • The DevOps team can’t help with governance either. The tools they work with are designed to identify and fix software defects, not errors in PMML, Python, or the other languages used in model creation. Maintaining the model lifecycle across the enterprise is simply not in their skill set.
  • IT would welcome a way to bring greater efficiency to the process of deploying models in the array of applications that business units rely upon. But IT very likely has responsibility for managing a complex, heterogeneous computing environment and the CIO is understandably worried about incurring greater technical debt.
  • The rest of the C-suite, however, sees the competitive advantage rivals are obtaining through AI and is being pressured by investors and the board to scale the use of AI across a wider array of internal and market-facing applications and processes.
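To ground the governance point above, here is a minimal sketch of the kind of metadata a model inventory needs to capture for every deployed model, regardless of the tool or language it was built in. The ModelRecord class, its field names, and the needs_review check are illustrative assumptions for this post, not the schema of any particular product; a real ModelOps platform tracks far more, but even this small record goes beyond what a spreadsheet can reliably maintain.

```python
# Illustrative sketch only: a minimal, tool-agnostic record for a model inventory.
# The class, field names, and review rule are assumptions for this example.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List, Optional


@dataclass
class ModelRecord:
    """One entry in a model inventory, capturing lineage and runtime health."""
    model_id: str                    # stable identifier across versions
    version: str                     # e.g. "2.3.1"
    owner: str                       # accountable business or data science team
    language: str                    # "python", "pmml", "r", ...
    training_data_ref: str           # pointer to the dataset snapshot used for training
    deployed_to: List[str] = field(default_factory=list)     # production endpoints
    approvals: List[str] = field(default_factory=list)       # sign-offs for the audit trail
    metrics: Dict[str, float] = field(default_factory=dict)  # latest monitored metrics
    last_validated: Optional[datetime] = None


def needs_review(record: ModelRecord, max_days_since_validation: int = 90) -> bool:
    """Flag models whose validation is stale – a simple governance check."""
    if record.last_validated is None:
        return True
    age_in_days = (datetime.utcnow() - record.last_validated).days
    return age_in_days > max_days_since_validation
```

In practice, records like this have to be created and updated automatically as part of the deployment pipeline; relying on people to keep them current by hand is exactly the kind of manual process that keeps AI from scaling.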

We are living in “the age of AI,” and its effects are all around us, from wearable devices that track our heart rates and sleep patterns to the recommendation engines behind our favorite shopping sites and streaming services. AI is being employed in virtually every industry, from agriculture to medicine to software development. AI, for example, helps financial institutions spot money laundering and other potentially fraudulent activity more quickly. It has helped US veterans transition to civilian life, improved monitoring of at-risk newborns, and saved lives by making weather forecasting far more accurate. But AI can do much more, and Operational Enterprise AI – the ultimate goal of ModelOps – is the most challenging automation and governance problem the enterprise has ever faced.
