I am lucky enough to meet customers from a wide variety of sectors on a weekly basis. This gives me insight into what they are trying to achieve and the challenges IT faces in enabling them.
In a recent blog entitled “Leveraging data to continue to drive customer centricity” we looked at the importance of leveraging data. When addressing needs across sectors, it is striking how tightly linked the majority of challenges are, and how the cross-pollination of ideas and initiatives can be used to create wider change, increased cross-sector collaboration and deeper insight.
Across all sectors it is clear that there is a focus on driving initiatives around data and analytics, underpinned by new roles such as Chief Data Officer, Director of Big Data and Head of Prescriptive Analytics. Whilst many organisations have strategies in place, many are still trying to define a path, and the common questions we hear are:
- How big is Big Data, and what is the difference between Big Data and Analytics?
- Where should we start?
- What are the key success factors required to get going, and how do we plan investment and build ROI?
- What type of resource do we need, and what access to data science skills will be required?
Where we see success, a common set of principles is applied:
1. Understand your outcomes: what are you trying to achieve, and in what part of the analytics spectrum? First, ask what you need to know and over what time horizon:
- Descriptive Analytics – What happened in the past?
- Diagnostic Analytics – Why did it happen?
- Predictive Analytics – What will happen?
- Prescriptive Analytics – What will we do?
2. What do we want to present or use from the outcome:
- New applications: multimedia, web, mobile, etc.
- Visualisations, dashboards, etc.
- New business tools or decision making process
- New business process workflows
3. What data do I have, and what data do I want to start adding: from traditional data sources through to social, geographic, seismic, public records, open records and more.
4. In what context does the data need to be accessed?
- Batch – Happy for processing and analytics to run in the background; traditionally more focussed on Descriptive and Diagnostic use cases
- Near Time – Getting access to information in short time cycles, allowing visualisation of data close to when a situation, event, change or request has occurred or been actioned
- Real Time – Allowing decisions to be made in real time, driving increased responsiveness to change
And most importantly:
5. Start small, prove the value and drive outcomes. Don’t try to do too much all at once; the best projects succeed by driving cyclical value outcomes, use case after use case. Find a focus area, assign a small, focussed budget, build a sprint and prove out the value of the use case. Always pick something where the outcome will be tangible and visible to the business, and one that you can link to a return. Acknowledge that not all sprints will succeed, so keep the budget tight and the scope focussed, and learn from the projects.
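To make the analytics spectrum in principle 1 concrete, here is a minimal sketch in Python. The monthly sales figures, promotion flags and naive trend model are all hypothetical illustrations, not a real method; each step maps to one of the four analytics types.

```python
from statistics import mean

# Hypothetical toy data: six months of sales and whether a promotion ran.
sales = [100, 110, 125, 120, 140, 155]
promo = [0, 1, 1, 0, 1, 1]  # 1 = promotion ran that month

# Descriptive – what happened in the past?
avg_sales = mean(sales)  # average monthly sales: 125.0

# Diagnostic – why did it happen? Compare promotion vs non-promotion months.
promo_avg = mean(s for s, p in zip(sales, promo) if p)        # 132.5
no_promo_avg = mean(s for s, p in zip(sales, promo) if not p)  # 110.0

# Predictive – what will happen? A naive linear trend extrapolation.
monthly_growth = (sales[-1] - sales[0]) / (len(sales) - 1)  # 11.0
forecast = sales[-1] + monthly_growth                        # 166.0

# Prescriptive – what will we do? Recommend an action from the findings.
action = "run promotion" if promo_avg > no_promo_avg else "hold budget"
```

In practice each step would use richer data and proper models, but the progression – from describing the past to recommending an action – is the same.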
As you move through this process, you will ascertain what the long-term picture will look like, and whether you have pure-play Analytics challenges or need to consider how you also store, hold and drive value from Big Data volumes. The beauty for organisations today is that the core platforms to underpin Big Data are now accessible in the Cloud, underpinned by highly scalable and commercially effective pricing models; as such, the focus can initially be on the outcome required, and the platforms will fall into place to support execution and leverage.
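The batch, near-time and real-time access contexts described in principle 4 can also be sketched with a few lines of Python. The event stream, window size and flagging threshold below are purely illustrative assumptions.

```python
# Hypothetical stream of transaction values arriving over time.
events = [5, 12, 7, 30, 9, 3, 25, 8]

# Batch – process the full history in the background.
batch_total = sum(events)  # 99

# Near time – micro-batches (here, windows of 4 events), so results
# can be visualised shortly after the events occur.
window = 4
near_time_totals = [sum(events[i:i + window])
                    for i in range(0, len(events), window)]  # [54, 45]

# Real time – a decision per event as it arrives, e.g. flagging any
# transaction over an assumed threshold of 20.
flagged = [e for e in events if e > 20]  # [30, 25]
```

The trade-off is latency versus cost and complexity: batch is the cheapest and slowest, real time the most responsive and most demanding of the platform.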
Latest posts by JonathanBridges (see all)
- Where next in the land of Advanced Analytics and Big Data? - June 29, 2015