
Machine learning is not the pain point for organizations!

The entire chain from data to intelligence to solutions has been evolving in both academia and business. Big data, with its unprecedented volume, velocity, and variety, has emerged as the oil for modeling engines. Modeling, in turn, has transformed from classical statistics to machine learning, and deep neural networks keep breaking new ground in lowering error rates. Amidst all this excitement, when we look at the industry, the pain points still sit at the two ends of the data chain: not being data ready, and not being able to measure or ascertain the value of data-based intelligence on the bottom line. This, unfortunately, is the state even for many businesses that depend heavily on data for their core operational decisions.

Let’s take the availability of data first. Many organizations still hold a lot of unharnessed data in the form of unstructured text or voice recordings. Even the data that is harnessed suffers from quality issues. Many data architecture implementations are too rigid, needing heavy maintenance and upfront work to add new data sources. In a fast-paced world of ever-newer data sources and changing data protection regulations, these traditional ETL-driven DWH implementations move too slowly for the likes of analysts. Data warehouses still have their place for ad hoc reporting and business intelligence, but more fluid environments like data lakes are definitely better suited to scaling analytics.
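The contrast between a rigid, schema-first pipeline and the schema-on-read approach of a data lake can be sketched in a few lines of Python; the records and field names here are entirely hypothetical, invented just to illustrate the point:

```python
import json

# Hypothetical raw event feed, stored as-is in a data lake.
# The second record carries a field the original schema never anticipated.
raw_events = [
    '{"customer": "A1", "channel": "web", "spend": 120.0}',
    '{"customer": "B7", "channel": "app", "spend": 45.5, "referrer": "email"}',
]

# A rigid ETL-driven warehouse load would reject the second record
# (unknown column) or require upfront schema migration work.
# Schema-on-read simply keeps the raw record and lets the analyst
# apply structure at query time, tolerating missing fields.
events = [json.loads(line) for line in raw_events]
referrers = [e.get("referrer", "unknown") for e in events]
print(referrers)  # ['unknown', 'email']
```

The point is not that warehouses are wrong, but that the schema decision moves from ingestion time to analysis time, so adding a new source is cheap.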

As for the other end of the data chain: many organizations are unable to measure the business value of these data and modeling initiatives. Rigorous, disciplined controlled testing in operations can help measure that value. One building block needed to make this happen is solid tracking of operational costs at the customer or account level. Combined with iterative experimental testing, this lets organizations not just measure but optimize the value of data-driven models. For the best value extraction, this needs to be taken on at the enterprise level.
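A controlled test of the kind described above can be sketched as follows; the uplift, cost figures, and customer counts are entirely hypothetical, simulated only to show the mechanics of combining randomized treatment with customer-level cost tracking:

```python
import random
import statistics

# Hypothetical sketch: customers are randomly split into a treatment group
# (decisions driven by the model) and a control group (business as usual).
# Per-customer margin = revenue minus the operational cost tracked for
# that customer, so the comparison lands on the bottom line directly.
random.seed(42)

def customer_margin(treated: bool) -> float:
    revenue = random.gauss(100, 15) + (8 if treated else 0)  # assumed uplift
    op_cost = random.gauss(60, 5)  # customer-level operational cost tracking
    return revenue - op_cost

treatment = [customer_margin(True) for _ in range(5000)]
control = [customer_margin(False) for _ in range(5000)]

uplift = statistics.mean(treatment) - statistics.mean(control)
print(f"Estimated uplift per customer: {uplift:.2f}")
```

Run iteratively, with each experiment feeding the next model change, this is what turns "we deployed a model" into a measured, optimizable bottom-line number.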

Algorithms will continue to improve, but there is no substitute for hard work on data discovery and discipline around value optimization.
