Data science is classically used to monitor social networks. Twitter and TikTok, for instance, are goldmines of emerging trends and precise heartbeat monitors of product and topic popularity, for those able to mine their huge and more-or-less-but-not-very structured data stores.
From the simplest to the almost-sentient, the algorithms used for that monitoring all start by querying their target social network, then submit the gigantic list of media they obtain – text, images, sounds, videos – to their particular brand of analysis.
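To make that pattern concrete, here is a minimal sketch in Python. The fetch_posts helper and the sample posts are invented placeholders, since the query step depends entirely on the network's API; the analysis step is deliberately reduced to a word count.

```python
from collections import Counter

# Stand-in for the query step; a real pipeline would page through the
# network's search API here. The sample posts are invented.
def fetch_posts(query: str) -> list[str]:
    return [
        "new crop-top styles everywhere this spring",
        "crop-top or oversized? spring fashion can't decide",
        "is Rust the next programming-language overlord?",
    ]

# Deliberately trivial "analysis": count word frequencies across posts.
def trending_terms(query: str, top: int = 5) -> list[tuple[str, int]]:
    posts = fetch_posts(query)
    words = (w.strip(".,?!").lower() for p in posts for w in p.split())
    return Counter(words).most_common(top)

print(trending_terms("fashion"))
```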
Back to Earth, the same can be done with your databases. In that case, spotting the next killer crop-top style or programming-language overlord might not be on the menu, but rather more internally focused topics: for instance, processing the quality of service measured through customer surveys into actionable items. And by “actionable” I mean that our processing identifies the teams or individuals who need to undertake the actions, and gives them a clear-cut picture of what they could do. Another example would be monitoring the business impact of strategic decisions and projects.
This post is about the latter example. The management of a company wanted to measure the impact of their market’s evolution and of the projects they had undertaken, as the market leader, to follow along with and guide that evolution. Furthermore, they wanted to be able to derive actions from this measurement and correct or adjust their project portfolio.
The issue they were facing was that their systems had not been designed for those new or modified market segments. They were adequate for the old ones, but the changes went beyond updating a product list: sometimes the old products could now be sold in new environments or on new markets.
A system overhaul did not make sense. It would take too long, and there was no way of specifying it until those changes could be considered finished and the market had settled into a new, somewhat stable configuration. The changes were still momentous, and the projects expensive. This needed monitoring: quantitative monitoring.
While brainstorming the issue, we realized that some level of information was in fact already present in the company’s ERP system: all individual contracts were logged, together with a title summarizing the service and the customer’s project. A keyword-based analysis began to make sense, and ad hoc tables were built in Excel to prove the concept.
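To give an idea of what that proof of concept looked like, here is a sketch of the keyword flagging in Python. The keyword-to-segment table and the contract titles are invented for illustration, not the company’s actual data:

```python
# Invented keyword-to-segment table and contract titles, mimicking the
# ad hoc Excel tables; none of this is the company's actual data.
KEYWORDS = {"solar": "new energy", "retrofit": "renovation"}

contracts = [
    (1001, "Solar panel maintenance, plant A"),
    (1002, "Office retrofit, phase 2"),
    (1003, "Legacy product resale"),
]

# Flag a contract with every market segment whose keyword appears
# in its title.
def tag_contract(title: str) -> set[str]:
    words = {w.strip(",.").lower() for w in title.split()}
    return {segment for kw, segment in KEYWORDS.items() if kw in words}

for contract_id, title in contracts:
    print(contract_id, tag_contract(title) or {"unclassified"})
```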
At this point, l’Atelier des Données could take over the task of streamlining the whole process. I worked with a business contact at the company to refine the keywords and to find additional supporting elements in the ERP where possible, such as market tags and product codes. Then I worked alone to make the identification of those keywords smart: misspellings, plurals, abbreviations, synonyms… had to be caught as well. In fact, I would have started the same way with Twitter as I did with that company’s ERP. Then I consolidated a list of extractions that could be produced easily, if not completely automatically, to feed the algorithm.
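As a sketch of the idea (the vocabulary below is invented, and the production system may well use a different technique): exact look-ups handle abbreviations and synonyms, while the standard library’s difflib catches misspellings and plurals through string similarity.

```python
import difflib

# Hypothetical vocabulary: each canonical keyword maps to known variants
# (abbreviations, synonyms). Fuzzy matching then catches misspellings
# and plurals that the variant lists miss.
VARIANTS = {
    "photovoltaic": {"pv", "solar"},
    "maintenance": {"servicing", "upkeep"},
}

def canonical(word: str, cutoff: float = 0.85) -> str | None:
    w = word.lower()
    # Exact hits: the keyword itself, or a listed abbreviation/synonym.
    for kw, alts in VARIANTS.items():
        if w == kw or w in alts:
            return kw
    # Fall back to string similarity for misspellings and plurals,
    # e.g. "maintenence" or "photovoltaics".
    hits = difflib.get_close_matches(w, VARIANTS.keys(), n=1, cutoff=cutoff)
    return hits[0] if hits else None

for token in ["PV", "maintenence", "photovoltaics", "banana"]:
    print(token, "->", canonical(token))
```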
With all that groundwork done, it was only a matter of implementing the processing in a clean, reproducible and automatable manner, and of designing a secure web dashboard to communicate the results seamlessly to the company’s management.
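For illustration, the skeleton of such a dashboard could look like the following Streamlit sketch; this is one possible stack among many, and the view and figures are invented:

```python
# Run with: streamlit run dashboard.py
# One possible stack among many; the view and figures are invented.
import pandas as pd
import streamlit as st

st.title("Market evolution: monthly view")

# In production this table would come from the monthly keyword
# extraction; these numbers are placeholders.
df = pd.DataFrame({
    "month": ["2023-01", "2023-02", "2023-03"],
    "new-market contracts": [12, 18, 25],
})

st.line_chart(df.set_index("month"))
```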
That is the way we did it, and the dashboard has been in use for about a year. It has been used to detect underperforming projects and put them back on track, to strengthen the sense of accomplishment of those whose projects proved successful, and to reassure the other company stakeholders, including the Board of Directors, that the investment was both wise and under control.
Project information
- Delivery (data crunching and dashboard): 3 weeks
- Datasets:
  - Database of all contracts
  - List of keywords from business workshops
- Main features:
  - Text analysis
  - Online interactive dashboard
  - Monthly update