
AutoML: How Do You Measure Return On Investment?

By Walter Paliska

So your company has decided to invest in an Automated Machine Learning (AutoML) platform. Excellent - AutoML promises to help accelerate and automate much of your data science process. At first blush, the return on investment (ROI) for your technology purchase would seem simple to measure: count how many data science projects your team could produce on average before your platform purchase, and then measure again afterward. If your results are anything like what our clients have seen, you will likely measure ROI in terms of time: many of our clients find that they can deliver data science projects 10X to as much as 32X faster than they could manually. As impressive as those numbers are, there are other, more powerful ways of measuring ROI that will be even more meaningful and valuable to your business. Leaders should think beyond cost savings and look at developing sustainable competitive…
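To make the before-and-after comparison concrete, here is a minimal sketch of the arithmetic. The project durations below are made-up illustrations, not figures from dotData or its clients:

```python
# Hypothetical before/after comparison of data science delivery speed.
manual_days_per_project = 120   # assumed average delivery time before AutoML
automl_days_per_project = 10    # assumed average delivery time after AutoML

speedup = manual_days_per_project / automl_days_per_project
print(f"Delivery speedup: {speedup:.0f}x")  # 12x, within the cited 10x-32x range

# The same comparison as yearly throughput for a team of fixed size.
working_days_per_year = 250
manual_projects = working_days_per_year / manual_days_per_project
automl_projects = working_days_per_year / automl_days_per_project
print(f"Projects per year: {manual_projects:.1f} -> {automl_projects:.1f}")
```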

dotData’s AI-FastStart™ Program Helps BI Teams Adopt AI/ML with AutoML 2.0

By Walter Paliska

Today dotData is thrilled to announce dotData AI-FastStart™, our new exclusive program aimed at helping Business Intelligence (BI) professionals adopt AI and Machine Learning (ML) powered BI solutions - regardless of the organization’s expertise or infrastructure readiness. With AI-FastStart™, BI teams can quickly move from zero to a fully operational AI/ML experience in ninety (90) days or less. AI-FastStart™ was born as a direct response to a rapidly changing BI & analytics world. AI/ML has become a critical technology investment, but most organizations still struggle to scale their AI/ML practices. BI+AI (a.k.a. citizen data scientists) is no longer a “nice to have” but must become the new approach to scaling AI/ML across the organization. dotData AI-FastStart™ makes AI/ML adoption simple, easy, and fast. The program was designed around four core principles: the right platform, education, fast time-to-value, and ease of deployment…

Data Science Operationalization: What the heck is it?

By Walter Paliska

Data Science Operationalization, Defined. Data science operationalization, in concept, is simple enough: take Machine Learning (ML) or Artificial Intelligence (AI) models and move them into production (or operational) environments. In the words of Gartner Sr. Analyst Peter Krensky, data science operationalization is the “...application and maintenance of predictive and prescriptive models...” In practice, however, operationalizing ML and AI models can be a complicated and often overwhelming challenge. More broadly, one of the biggest challenges of operationalization is that AI and ML models must be integrated with systems that contain live data that changes quickly. For example, if your model is designed to predict customer churn, your data science operationalization process needs to be integrated with your CRM system so that it keeps predicting churn effectively as your data volumes grow. What makes data science operationalization so hard? There are four critical aspects of data science operationalization that make it challenging to implement…
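As a rough illustration of what “moving a model into production” can look like, the sketch below shows a hypothetical recurring batch job that scores churn against fresh CRM data. The database URL, table names, columns, and model file are assumptions for illustration, not part of any specific dotData workflow:

```python
# Minimal sketch of an operationalized churn model: a scheduled batch job that
# pulls fresh CRM data, scores it, and writes predictions back for downstream use.
# All connection strings, table names, and columns here are hypothetical.
import pandas as pd
import joblib
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@crm-db/warehouse")  # assumed CRM replica
model = joblib.load("churn_model.pkl")  # model handed off by the data science team

def score_churn() -> None:
    # Pull the latest customer snapshot; in production this runs on a schedule,
    # so the model always sees current data, not the static training extract.
    customers = pd.read_sql("SELECT * FROM customer_features_latest", engine)
    features = customers.drop(columns=["customer_id"])
    customers["churn_probability"] = model.predict_proba(features)[:, 1]
    # Write scores where the CRM (or a BI dashboard) can consume them.
    customers[["customer_id", "churn_probability"]].to_sql(
        "churn_scores", engine, if_exists="replace", index=False
    )

if __name__ == "__main__":
    score_churn()
```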

What IS Feature Engineering?

By Walter Paliska

What Is Feature Engineering? (And Why Do We Need to Automate It?) The past few years have seen a rapid rise in the adoption of Artificial Intelligence (AI) and Machine Learning (ML) for a multitude of commercial use cases. Beyond the “cute” factor of AI that can pick a cat out of a photo array, AI and Machine Learning are being deployed to model and predict lending risk, understand and manage customer churn, provide product recommendations, support programmatic advertising, and much more. The challenge for the business community is that the underlying practice at the heart of AI and Machine Learning - data science - is rooted in a complex world of statistical analysis, data manipulation, programming, and more. Most businesses don’t have enough data scientists - a fact illustrated by 2018 research from LinkedIn showing a shortfall of over 150,000 people…
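For readers new to the term, the following is a minimal, hand-rolled example of feature engineering in pandas: turning raw transaction rows into customer-level features a model could learn from. The columns and aggregations are hypothetical:

```python
# A tiny illustration of manual feature engineering: raw transaction rows become
# customer-level features. Column names and values are made up for illustration.
import pandas as pd

transactions = pd.DataFrame({
    "customer_id": [1, 1, 1, 2, 2],
    "amount":      [120.0, 35.5, 60.0, 500.0, 20.0],
    "days_ago":    [3, 40, 95, 10, 200],
})

# Hand-crafted features: spend totals, frequency, and recency per customer.
features = transactions.groupby("customer_id").agg(
    total_spend=("amount", "sum"),
    avg_spend=("amount", "mean"),
    num_purchases=("amount", "count"),
    days_since_last=("days_ago", "min"),
).reset_index()

# Each feature encodes a hypothesis about what drives the outcome (e.g. churn);
# automated feature engineering searches for combinations like these at scale.
print(features)
```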

How to Operationalize Data Science in the Enterprise: The Five Challenges to Address

By Walter Paliska

The end-to-end process for launching a data science project is daunting - and many enterprise projects never make it to production. The process is similar in most organizations and consists of data collection, last-mile ETL, feature engineering, and machine learning. However, while the process is understood by most teams, the actual execution is very complex and involves a high level of operational risk. We recently published a complete guide to operationalizing data science. In this guide, we identified five complex issues that must be addressed for a business to derive value from operationalizing data science. Highlights from the paper: Issue 1: Quality. There are two groups in the data science process who are not aligned operationally: 1) data engineers build data pipelines with SQL or GUI-based tools, and 2) data scientists build machine-learning scoring pipelines using Python or R. Software engineers must often reimplement much of the work from these two groups before production…
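The sketch below maps the four steps named above (data collection, last-mile ETL, feature engineering, and machine learning) onto one simplified Python pipeline. File names, columns, and the model choice are illustrative assumptions, not the guide’s implementation:

```python
# Simplified end-to-end data science pipeline matching the four named steps.
# All data sources, column names, and the model are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def collect_data() -> pd.DataFrame:
    # 1. Data collection: normally pulled from warehouses, CRM systems, or logs.
    return pd.read_csv("raw_customers.csv")  # assumed extract

def last_mile_etl(df: pd.DataFrame) -> pd.DataFrame:
    # 2. Last-mile ETL: cleaning and shaping done close to the modeling step.
    return df.dropna(subset=["tenure_months", "monthly_spend", "churned"])

def engineer_features(df: pd.DataFrame) -> pd.DataFrame:
    # 3. Feature engineering: derive model inputs from raw columns.
    df = df.copy()
    df["spend_per_month_of_tenure"] = df["monthly_spend"] / (df["tenure_months"] + 1)
    return df

def train_model(df: pd.DataFrame) -> LogisticRegression:
    # 4. Machine learning: fit and validate a scoring model.
    X = df[["tenure_months", "monthly_spend", "spend_per_month_of_tenure"]]
    y = df["churned"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("holdout accuracy:", model.score(X_test, y_test))
    return model

if __name__ == "__main__":
    train_model(engineer_features(last_mile_etl(collect_data())))
```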