
A Data Scientist’s Guide to an Efficient Project Lifecycle

Projects often overrun their planned completion date by a factor of three or more, usually because of shifting goals, inefficient data collection, and time spent exploring too many solution paths. A closer look typically reveals that the delay could have been avoided with more disciplined decision making. In short, there are three principles which, if followed closely, can cut total project time considerably without sacrificing the quality of the end result:
1. Fail Fast
2. 40/70 Rule
3. Fermi Estimation
Before reviewing the three principles, however, it is important to define the overall objective in its relevant context. How you frame the problem and objective strongly influences where you set the bar and which performance metrics you use, so it is essential that your problem, goals, and metrics are closely aligned.

Failing Fast

It is important to judge the viability of your problem-solving approach quickly and accurately. Failing fast means testing an idea early and using that initial result to decide whether it is worth pursuing. For example, set an initial goal, try approach X, and use the outcome to gauge the feasibility of the solution; if it falls short, move on to other approaches.
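The idea can be sketched as a feasibility gate: before investing in a complex model, check whether a pilot approach beats a trivial baseline by a meaningful margin. The dataset, threshold, and accuracy figures below are hypothetical, purely for illustration.

```python
# Fail-fast sketch: compare a pilot model's accuracy against a trivial
# majority-class baseline and decide early whether to continue.
from collections import Counter

def majority_baseline_accuracy(labels):
    """Accuracy of always predicting the most common label."""
    most_common_count = Counter(labels).most_common(1)[0][1]
    return most_common_count / len(labels)

def fail_fast(labels, pilot_accuracy, min_lift=0.05):
    """Continue only if the pilot beats the baseline by at least min_lift."""
    baseline = majority_baseline_accuracy(labels)
    if pilot_accuracy >= baseline + min_lift:
        return "continue"
    return "try another approach"

# Hypothetical example: 60% of labels are "spam", so the baseline is 0.60;
# a pilot scoring 0.62 offers too little lift to justify further investment.
labels = ["spam"] * 60 + ["ham"] * 40
print(fail_fast(labels, pilot_accuracy=0.62))  # try another approach
```

The `min_lift` margin is the key tunable: it encodes how much improvement over a trivial solution justifies continuing down the current path.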

The 40/70 Rule

To mitigate risk as far as possible, people often try to gather as much information as they can before making a decision. This may make them feel better, but it is no guarantee of a good outcome in the long run. Colin Powell theorized that leaders should make decisions with no less than 40% and no more than 70% of the available information. Deciding with less than 40% of the information can be considered hasty, whereas waiting until you have more than 70% means you have spent too much time collecting information before deciding.
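The rule amounts to a simple decision gate on how much of the desired information you have in hand. The thresholds below are the 40% and 70% figures from the rule itself; the function name is illustrative.

```python
# A minimal sketch of the 40/70 rule: classify a decision point by the
# fraction of desired information already gathered.
def decision_readiness(info_fraction):
    """info_fraction: share of relevant information in hand, between 0 and 1."""
    if info_fraction < 0.40:
        return "too hasty: keep gathering information"
    if info_fraction <= 0.70:
        return "decide now"
    return "over-analysis: you have waited too long"

print(decision_readiness(0.55))  # decide now
```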

Fermi Estimation

Fermi estimation means using rough approximations to reach a "good enough" answer to a complicated problem without spending too much time or other precious resources. Solid data science training can equip you with the knowledge and skills to make such estimates, and that kind of efficiency helps you solve problems while keeping the bigger picture in mind.
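A worked example makes the technique concrete. All numbers below are rough, order-of-magnitude assumptions invented for illustration, not measured values: the question is how long one annotator would need to label a million images.

```python
# Fermi estimation sketch: back-of-the-envelope labeling effort.
images = 1_000_000               # dataset size (assumed)
seconds_per_image = 5            # rough labeling speed (assumed)
work_seconds_per_day = 6 * 3600  # ~6 productive hours per day (assumed)

total_seconds = images * seconds_per_image
work_days = total_seconds / work_seconds_per_day
print(f"~{work_days:.0f} working days for one annotator")  # ~231 working days
```

The answer (roughly a year of full-time work) is accurate only to an order of magnitude, but that is enough to decide, in minutes, whether to hire a labeling team or look for a weakly supervised approach instead.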


Kevin Jacobs

I'm Kevin, a Data Scientist, PhD student in NLP and Law and blog writer for Data Blogger.