How to operate like a disruptor


Containers for analytics

In the IT world, containers are the latest major development for deploying software in cloud environments. They fit well with the way IT likes to test and run software: they let teams deploy software faster, manage upgrades easily, and combine different software packages with little friction. You can try things out without forcing a hard upgrade.

For software providers, containers also make it easier to build packaged deployments, integrate with other packages, and add more automation, such as a self-optimizing system for customer workloads or other requirements.

Ultimately, containers will help democratize the use of advanced analytics and lower the barrier to trying new software products. They make it easy to get software at the pace you want to consume it, and you can experiment with new algorithms or techniques without much upfront risk or expense.
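For illustration, here is a minimal sketch of that kind of low-risk trial, using the Docker SDK for Python (pip install docker); the image name, tag and port are hypothetical placeholders, not a real vendor's product.

    # Minimal sketch: try a containerized analytics service without touching
    # your existing installation. Requires the Docker SDK for Python and a
    # running Docker daemon. Image name, tag and port are hypothetical.
    import docker

    client = docker.from_env()

    # Pull the new release as an isolated image; your current version is untouched.
    client.images.pull("analytics-vendor/scoring", tag="2.0")

    # Run it as a disposable container, mapping its API port to localhost.
    container = client.containers.run(
        "analytics-vendor/scoring:2.0",
        detach=True,
        ports={"8080/tcp": 8080},  # container port -> host port
        name="scoring-trial",
    )

    # ... experiment against http://localhost:8080 ...

    # If the trial disappoints, discard the container; no hard upgrade happened.
    container.stop()
    container.remove()

Because the trial runs in an isolated container, discarding it leaves your existing installation exactly as it was.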

For example, imagine that you have implemented a machine learning algorithm for next-best-offer recommendations. When a new algorithm becomes available for the same task, you can automatically download it and funnel 5% of your traffic through it. Then you let the machine tell you whether it works better than the previous algorithm. If so, the system begins routing 10% of visitors through the new algorithm, then 25%, and so on until the old algorithm is completely replaced. If, on the other hand, the results do not improve, you discard the new algorithm and return to the previous one.
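Below is a minimal sketch of that ramp-up logic, assuming champion and challenger scoring functions, an iterator of visitor batches, and an acceptance_rate metric of your choosing; none of these names come from a specific product.

    import random

    RAMP = [0.05, 0.10, 0.25, 0.50, 1.00]  # challenger's traffic share per step

    def rollout(champion, challenger, traffic, acceptance_rate):
        """Ramp the challenger up step by step; roll back if results don't improve.

        champion/challenger are scoring functions, traffic is an iterator of
        visitor batches, and acceptance_rate measures how well one arm's
        offers performed (e.g. conversion rate). All names are illustrative.
        """
        for share in RAMP:
            offers = {"champion": [], "challenger": []}
            for visitor in next(traffic):
                arm = "challenger" if random.random() < share else "champion"
                model = challenger if arm == "challenger" else champion
                offers[arm].append(model(visitor))
            # Only widen the challenger's share while it keeps winning.
            if acceptance_rate(offers["challenger"]) <= acceptance_rate(offers["champion"]):
                return champion  # results not improved: revert to the old algorithm
        return challenger        # the challenger won at every step: full replacement

In practice you would use a proper statistical test rather than a raw comparison of rates, but the control flow is the same.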

Disruptors constantly test new algorithms and roll out new programs this way. It is how they keep improving in small, incremental steps, at scale. With containers, you can do the same.

ModelOps for machine learning

How do you cycle machine learning models from the data science team to the IT production team? Do you deploy and update them regularly? Do you watch for models to degrade and intervene? Are you putting all your best models into production?

ModelOps is a process you can use to move models from the lab through validation and testing into production as quickly as possible, while ensuring quality results. It helps you manage and scale models to meet demand, and continuously monitor them to spot and correct early signs of degradation.

Using this ModelOps process will help you put more models into production and sustain their results:

  • Access data: Explore and access data from a trusted and secure source.
  • Develop new models: Create models with deployment and monitoring in mind.
  • Register models: Preserve data lineage and traceability information.
  • Deploy models: Improve deployment rates with close collaboration between data scientists and IT.
  • Monitor models: Regularly track performance, then retrain or replace models as needed (see the sketch below).

This process is designed to be sensitive to data fluctuations, model bias and model degradation. It shortens the time to deploy new models and ensures regular updates.
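As one illustration of the monitoring step above, here is a minimal sketch that compares a model's live accuracy on recent traffic with its validation baseline and flags it when performance slips; the baseline and tolerance values are assumptions for the example.

    from statistics import mean

    # Assumed values for the example: the baseline comes from validation,
    # the tolerance is a business choice.
    BASELINE_ACCURACY = 0.87
    TOLERANCE = 0.05

    def check_window(predictions, actuals):
        """Score one window of recent traffic and decide whether to intervene."""
        live_accuracy = mean(int(p == a) for p, a in zip(predictions, actuals))
        if live_accuracy < BASELINE_ACCURACY - TOLERANCE:
            return "retrain_or_replace"  # early sign of degradation: intervene
        return "keep_serving"

In production you would run a check like this on a schedule over rolling windows and feed the outcome back into your model registry.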

A ModelOps solution can help you compete with the disruptors who have perfected this process while deploying and monitoring models by the thousands.

Work like a disruptor

The three technology developments discussed above change the way we manage data and execute analytics projects. If you are already analytically mature, you can also use these new technologies to become technologically mature. Use these tips as a first step in your journey:

  1. Choose an area where you have data management issues and try to tackle them with intelligent data preparation.
  2. Look at where you have complex software stacks and large code footprints, and see whether containers can help simplify your analytics infrastructure.
  3. Examine your current data science work and see where you could benefit from a structured process for managing, deploying and updating models at scale.

On the other hand, if you are technologically mature with a strong IT infrastructure, but not analytically mature, you may also benefit from these technologies:

  1. Consider what external data sources you could bring in with intelligent data preparation, or what advanced algorithms you could develop if your data were smarter.
  2. Look at using containers for analytics, not just for your operational systems. The same best practices you have learned for automation and portability in those systems also help your analytics efforts.
  3. Determine where you can make the biggest impact with machine learning models managed as an enterprise asset, and start experimenting with advanced analytics while retaining the strengths of your software development practices.

Wherever you are on the continuum from traditional to disruptive, you can benefit from exploring the technologies described here.

Moving up the scale of technological maturity can be a challenge, but I’ve seen it happen. I’ve seen markets shift and opportunities open up as traditional companies paired their analytics capabilities with new technological skills. You could be next.


