Wednesday, July 10, 2013

Bigdata – The latest art in decision making – Part 3 - “Bigdata’s role in predicting the US Presidential Election 2012”


The mind-boggling reality of Exabytes (1 EB = 1000 Petabytes = 1 billion Gigabytes), Zettabytes (1 ZB = 1000 Exabytes = 1 billion Terabytes) and, soon to come, Yottabytes (1 YB = 1000 Zettabytes = 1 trillion Terabytes) is well beyond the grasp of our intuition. An Economist video reports that the quantity of global data is forecast to reach a staggering 7,910 Exabytes by 2015, over 60 times greater than in 2005. Twitter alone generates over 230 million tweets each day, equivalent to 46 megabits of data per second.

In the near future, people will live in a world of sensors and software in which their every move is instantly digitized and added to the flood of public data. This is where a statistician (also referred to as a quant) will use his or her skills to forecast almost anything you need. As my topic suggests, big data has indeed changed the traditional style of decision making.

I wish to share with you the results of the 2012 US Presidential Election, which many of you would have followed on the internet, and how big data influenced the prediction of those results. Nate Silver, a political blogger, correctly forecast the outcome in all 50 states plus the District of Columbia. He is neither a pundit nor a former politician, yet he pulled it off.

How did he do it?

With big data in hand, Silver uses the polls of other firms as a data stream that he analyzes and models. His methodology is highly developed, but it boils down to a handful of things. First, he looks at all available polls, except those with patently flawed methodologies. Second, he weights these polls on factors affecting their accuracy. Third, he fits regression curves and trend lines that bring the various polls together. Finally, he runs simulations on key parameters in his models. From all of this, he calculates several results, including two crucial ones – a predicted outcome and the odds of that prediction coming true.
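The pipeline described above – weight the polls, combine them, then simulate – can be sketched in a few lines. The poll shares, sample sizes, and pollster weights below are invented for illustration; this is not Silver's actual data or model, only the general shape of the approach.

```python
import random

# Hypothetical polls for one state: (candidate A share, sample size, pollster weight)
polls = [(0.51, 800, 0.9), (0.49, 1200, 1.0), (0.52, 600, 0.7)]

# Steps 1-3: combine the polls into one weighted average,
# where the weight reflects sample size and pollster accuracy
total_w = sum(n * w for _, n, w in polls)
mean = sum(p * n * w for p, n, w in polls) / total_w

# Step 4: Monte Carlo simulation around the weighted mean yields
# both a predicted outcome and the odds of that prediction coming true
random.seed(42)
TRIALS = 100_000
wins = sum(random.gauss(mean, 0.02) > 0.5 for _ in range(TRIALS))
print(f"predicted share: {mean:.3f}, win probability: {wins / TRIALS:.2f}")
```

The key idea is that no single poll is trusted on its own; each is one noisy observation, and the simulation converts the combined estimate into a probability rather than a bare prediction.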

Bigdata redefines expertise
 
Gurus and gut-feel won’t cut it in a world of Big Data, not in politics or marketing. Big Data requires statisticians (quants) who can wrestle it to the ground. In a world flooded with data, good, solid quantitative analytics will be table stakes for success.

Bigdata – The latest art in decision making – Part 2 - "How Bigdata changes your life"


Listed below are some of the major domains where big data's potential can be fully realized (Figure 1). There will be a transformative impact across the broad spectrum of everyday tasks and activities, both complex and common. These domains offer opportunities to improve efficiency and effectiveness, enabling organizations to do more with less while producing higher-quality outputs.

Figure 1

Data can also be leveraged to improve products as they are used. For example, a mobile phone that has learned its user's habits and preferences, and that holds applications and data tailored to that user's needs, will be more valuable than a new device that is not customized.
The use of big data can enable improved health outcomes, lower prices due to transparency, and a better match between products and consumer needs. The use of real-time traffic information to inform navigation will create a quantifiable consumer surplus through savings on time spent traveling and on fuel consumed.

Bigdata – The latest art in decision making


 What is Bigdata?

Big Data is a large pool of data that is brought together and analyzed to distinguish patterns and make better decisions. These analyzed patterns become the basis of competition and growth for individual firms, enhancing productivity and creating significant value for the quality of products and services.

The True Nature of Big Data

Big Data represents a revolutionary step forward from traditional data analysis, characterized by its four main elements: variety, volume, velocity and value.

The variety of data comes in two flavors: structured and unstructured. Structured data enters a data warehouse already tagged and is easily sorted. The vast majority of today’s data, however, is unstructured, and fed by sources such as Facebook, Twitter, and video content. It’s random, difficult to analyze, and enormous.

The sheer volume of Big Data overwhelms the normal data warehouse. For example, Facebook reports that its users register 2.7 billion likes and comments per day. For many, this magnitude of data is intimidating: they can’t keep up with it, much less sort it, analyze it, and extract value from it.

All of that data can be challenging to manage when flooding in at a velocity that, for many players, far outpaces their processing ability. In order for Big Data to be a game changer, it needs to be analyzed at a rate that matches the blistering speed at which information enters data warehouses. In microseconds, decisions must be made as to whether a particular bit of data deserves to be captured, and whether it has relevance when combined with other data.
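That per-record capture decision can be sketched as a simple stream filter. The keyword-match relevance test below is a made-up placeholder for whatever scoring model a real pipeline would apply before deciding whether a record is worth keeping.

```python
# Minimal sketch of a stream filter: decide, record by record, whether
# a bit of data deserves to be captured before the stream moves on.
# The relevance test (a keyword match) stands in for a real scoring model.
KEYWORDS = {"election", "outage", "recall"}

def should_capture(record: str) -> bool:
    words = set(record.lower().split())
    return bool(words & KEYWORDS)

stream = [
    "Power outage reported downtown",
    "Lunch was great today",
    "Election polls open at 7am",
]
captured = [r for r in stream if should_capture(r)]
print(captured)  # keeps the outage and election records, drops the rest
```

In a production system this predicate would be evaluated inside a streaming framework rather than over an in-memory list, but the shape of the decision is the same: filter first, store and analyze second.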

The truthfulness and quality of data is the most important frontier for fueling new insights and ideas, but focus is required to improve the ability to make the right decisions. Value (with veracity and variability sometimes cited as related elements) tells us how quickly data can be analyzed and acted upon to provide business benefit.

Uses for big data

The vision for big data is that organizations will be able to harness relevant data and use it to make the best decisions. Technologies today not only support the collection and storage of large amounts of data, they provide the ability to understand and take advantage of its full value, which helps organizations run more efficiently and profitably. For instance, with big data and big data analytics, it is possible to:

  • Mine customer data for insights that drive new strategies for customer acquisition, retention, campaign optimization and next best offers.
  • Recalculate entire risk portfolios in minutes and understand future possibilities to mitigate risk.
  • Generate retail coupons at the point of sale based on the customer's current and past purchases, ensuring a higher redemption rate.
  • Send tailored recommendations to mobile devices at just the right time, while customers are in the right location to take advantage of offers.
  • Analyze data from social media to detect new market trends and changes in demand.
  • Use clickstream analysis and data mining to detect fraudulent behavior.
  • Determine root causes of failures, issues and defects by investigating user sessions, network logs and machine sensors.
and more...
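One of the bullets above, point-of-sale coupons driven by current and past purchases, can be illustrated with a toy rule: if the combined purchase history shows repeated interest in one category, issue a coupon for it. The categories, counts, and threshold here are invented for illustration.

```python
from collections import Counter

def coupon_for(past_purchases, current_basket, threshold=3):
    """Return a coupon category if combined purchase counts reach the threshold."""
    counts = Counter(past_purchases) + Counter(current_basket)
    category, n = counts.most_common(1)[0]
    return category if n >= threshold else None

past = ["coffee", "coffee", "bread"]
basket = ["coffee", "milk"]
print(coupon_for(past, basket))  # "coffee" appears 3 times, so a coupon is issued
```

A real system would replace the raw count with a predictive model of redemption probability, but the flow is the same: combine historical and point-of-sale data, then act on it while the customer is still at the register.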

Wednesday, July 3, 2013

Outcome based Training Model – The paradigm shift

Training is an integral part of every organization, regardless of size, and it decides the fate of the ROI of the business involved. In today's scenario, we have a lot of Generation X and Generation Y engineers, whose training programs should actually accelerate their technical and behavioral competency.

Outdated or Outcome
We are still doing the same things the same way, while market demand has changed and there is a lot of uncertainty in business outcomes. As business owners and investors, it becomes our responsibility to deploy capital in the most efficient manner to achieve our business outcomes. Training being a key factor for business growth, we should think of an "outcome-based" method rather than the outdated one, in order to do the same thing in a more effective and efficient way.

Outcome-based thinking
The moment you shift to an “outcome-based” mode, you can build your business faster than before by thinking like an investor and forcing your employees to think the same. Outcome-based thinking encourages teams both internally and externally to question everything, structure curiosity into research, think openly, and create innovative solutions.

Outcome-based Training model
I am proposing a revolutionary new model for training a trainee. The "outcome-based" training model brings in the flavor of the earn-and-learn concept: the time and effort trainees spend on their learning contributes a percentage of growth to your business.

The image below explains the conceptual differences between the traditional and outcome-based training models.


The process flow of the model
The outcome-based training model brings increased satisfaction in learning, since it provides the freedom of both requirements-based and experience-based learning. The image below explains the process flow of the training model.
The outcome-based training model promises a high level of training for any type of organization, as it facilitates the achievement of the outcomes, characterized by its appropriateness to each learner's development level and by active, experience-based learning. Moreover, knowing that this system is going to be used would also give the learner the freedom to study the content in a way that helps in learning it. The outcome-based model must involve not just the management and the training department, but everybody in the organization, including employees from different groups, for successful implementation.