Recently, the McKinsey Global Institute predicted that AI would generate at least $3.5 trillion in economic impact by 2022, based on its valuation of a set of 120 AI use cases across 12 industries. Based on my analysis, these use cases focus primarily on problems requiring prescriptive analytics, where the AI must recommend what to do. Prescriptive analytics is the culmination of a progression from Descriptive to Diagnostic to Predictive analytics, in which each stage synthesizes many instances of the prior form to solve a harder problem.
If you’ve found this article, then you’re likely looking for ways to help your kids (or roommates) know that you’re in a Zoom meeting and it’s not a good time to come in and bother you. I have a separate article that goes into the reasons I built this solution, and how effective it has been at improving the work → home life relationship between me and my kids. The goal of this article is to walk you through the process so you can set it up yourself. The basic setup is shown in the hero…
It’s a problem that everyone has had to contend with throughout the pandemic: you’re making the big pitch in a Zoom meeting, and your daughter or son comes bursting through the door to run up, give you a big hug, and tell you that they love you (or that they need you to get the milk down, or that their sister is mean). This throws you completely off your game as you shoo them out of the room, and it takes five minutes to regain your composure. …
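A minimal sketch of the kind of automation a setup like this can rely on, assuming a macOS machine running Zoom and a network-controllable light. The helper-process name (`CptHost`) and the light's URL are assumptions for illustration, not details from the article; adapt them to your own hardware.

```python
# Hedged sketch: detect whether a Zoom meeting is live and flip a "busy" light.
# Assumptions (not from the article): macOS `ps` flags, Zoom's in-meeting helper
# process being named "CptHost", and a hypothetical HTTP-controllable smart light.
import subprocess
import urllib.request

MEETING_PROCESS = "CptHost"                 # Zoom's in-meeting helper (assumption)
LIGHT_URL = "http://light.local/api/state"  # hypothetical smart-light endpoint


def meeting_active(process_listing: str) -> bool:
    """True if the Zoom meeting helper shows up in a process listing."""
    return MEETING_PROCESS in process_listing.splitlines()


def current_processes() -> str:
    """Return the names of running processes, one per line (macOS)."""
    return subprocess.run(
        ["ps", "-axco", "command"], capture_output=True, text=True
    ).stdout


def set_light(on: bool) -> None:
    """Ask the (hypothetical) light to turn on or off."""
    urllib.request.urlopen(f"{LIGHT_URL}?on={'1' if on else '0'}", timeout=5)


# Usage, e.g. from a scheduled job every minute:
#   set_light(meeting_active(current_processes()))
```

Polling a process listing is deliberately low-tech: it needs no Zoom API keys and works regardless of which meeting you join.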
It only takes a small problem to shake someone’s trust in data, but it takes a lot of deliberate effort to make them realize it was just one problem, not a larger issue. The difference in impact is the “Blast Radius” of the problem, and even the most mature data organizations can do a better job minimizing it.
Despite the fact that nearly 98% of organizations are making major investments in becoming data-driven, data quality still costs the average organization $15M per year in bad decisions according to Gartner, and impacts 90% of companies according to…
At last year’s AWS re:Invent conference, I attended a roundtable session with a group of thought leaders focused on the effects of User Experience design on AI systems. It was the most profound experience I had that week in Vegas, and I’m already looking forward to getting back together with that group next year. The discussion covered everything from the practical (how to design for failure-to-recognize scenarios in Natural Language Understanding products like chatbots) to the philosophical (what level of surveillance we should accept to get the benefits of ambient computing). …
In chemistry, a catalyst is a substance that accelerates a chemical reaction without itself being consumed. Without a catalyst, the reaction will still happen eventually, but adding one makes it happen (often dramatically) faster. Even better, the catalyst emerges unchanged and can be re-used again and again. In the picture on the right, you see a massive-scale version of the classic Elephant’s Toothpaste demonstration, which shows how the addition of a catalyst can lead to an explosive speed-up of a reaction that would otherwise proceed very slowly. …
In a recent discussion with a colleague, I learned about an interesting paradox. Despite the massive rise in the amount of data generated, captured, stored, and analyzed (IDC claims we will have 175 trillion gigabytes in 5 years), and the multi-trillion-dollar analytics valuations from Gartner and McKinsey, every year business leaders report that their organizations are less and less data-driven. Executives consistently cite people and process issues as the primary blockers, with only a small percentage citing technology. This raises the question: what is stopping organizations from using technology to enable the people and process changes that would make them more data-driven?
The costs of poor data quality are so high that many have trouble believing the stats. Gartner estimated that the average organization takes a $15M hit due to poor data quality every year. For some organizations, it can even be fatal. I’m often reminded of a story told by my Data Science Innovation Summit co-presenter, Dan Enthoven from Domino Data Labs, about a high-frequency trading firm, Knight Capital, which deployed a faulty update to its trading algorithm without testing its effect. …
A few months ago I wrote a post on the coming rise of DataOps, in which I predicted that the world of Data Governance will see some of the same shakeups that IT Operations experienced during the rise of DevOps. In this post, I’ll share some practical tips for how your organization can get started with the practices that underlie the DataOps movement. The goal is to let your organization derive value while it adopts the new roles and processes.
Your organization should think of the development of analytics pipelines as a 3-stage…
Emerging Tech leader at Pariveda Solutions | Interested in how people & machines learn, and how to bring them together.