Browsing by Author "Soyer, Emre"
Now showing 1 - 9 of 9
Article, Metadata only
Combining big data and lean startup methods for business model evolution (Springer, 2017-12)
Seggie, S. H.; Soyer, Emre; Pauwels, K. H.; Business Administration
The continued survival of firms depends on successful innovation. Yet legacy firms are struggling to adapt their business models and innovate in the face of greater competition from both local and global startups. The authors propose that firms build on the lean startup methodology to adapt their business models while leveraging the resource advantages they hold as legacy corporations. The paper provides an integrated process for corporate innovation learning that combines the lean startup methodology with big data. By themselves, the volume, variety, and velocity of big data may trigger confirmation bias, communication problems, and illusions of control; the lean startup methodology has the potential to alleviate these complications. Specifically, firms should evolve their business models through fast verification of managerial hypotheses, innovation accounting, and the build-measure-learn loop. Such advice is especially valid for environments with high levels of technological and demand uncertainty.

Article, Metadata only
Communicating forecasts: the simplicity of simulated experience (Elsevier, 2015-08)
Hogarth, R. M.; Soyer, Emre; Business Administration
It is unclear whether decision makers who receive forecasts expressed as probability distributions over outcomes understand the implications of this form of communication. We suggest a solution based on the fact that people estimate the frequency of data accurately in environments characterized by plentiful, unbiased feedback. Thus, forecasters should provide decision makers with simulation models that allow them to experience the frequencies of potential outcomes. Before implementing this suggestion, however, it is important to assess whether people can make appropriate probabilistic inferences based on such simulated experience. In an experimental program, we find that both statistically sophisticated and naïve individuals relate easily to this presentation mode, prefer it to analytic descriptions, and improve their probabilistic inferences with it. We conclude that asking decision makers to use simulations actively is potentially a powerful – and simplifying – way to improve the practice of forecasting.
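The simulated-experience idea in the entry above can be made concrete in a few lines of code. The sketch below is only an illustration of the approach, not material from the paper: the forecast distribution, its parameters, and the sales target are all hypothetical.

```python
import random

# Hypothetical forecast: next quarter's sales, modeled as a normal
# distribution. Mean, standard deviation, and target are illustrative only.
MEAN_SALES = 100_000
SD_SALES = 25_000
TARGET = 120_000

def simulate_outcomes(n_draws: int = 1000, seed: int = 42) -> list[float]:
    """Draw simulated sales figures the decision maker can inspect one by one."""
    rng = random.Random(seed)
    return [rng.gauss(MEAN_SALES, SD_SALES) for _ in range(n_draws)]

outcomes = simulate_outcomes()
hits = sum(1 for x in outcomes if x >= TARGET)

# Rather than stating "P(sales >= target) is about 0.21", let the decision
# maker experience the frequency of hits across simulated quarters:
print(f"In {len(outcomes)} simulated quarters, sales reached the target "
      f"{hits} times ({hits / len(outcomes):.0%}).")
```

The design choice mirrors the paper's argument: outcome frequencies ("about 210 of 1,000 simulated quarters") are read more reliably than stated probability distributions, so the decision maker samples outcomes instead of parsing an analytic summary.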
Article, Metadata only
Fooled by experience (Harvard Business Publishing, 2015-05)
Soyer, Emre; Hogarth, R. M.; Business Administration
We interpret the past – what we've experienced and what we've been told – to chart a course for the future. It seems like a reasonable approach, but it can be a mistake: we view the past through filters that distort reality. One filter is the business environment, which focuses on outcomes rather than the processes that lead to them and celebrates successes while ignoring failures, making it hard for us to learn from mistakes. Another is our circle of advisers, who may censor the information they share with us. A third filter is our own limited reasoning abilities.

Article, Metadata only
The golden rule of forecasting: objections, refinements, and enhancements (Elsevier, 2015-08)
Soyer, Emre; Hogarth, R. M.; Business Administration
In providing a "golden rule" for forecasting, Armstrong, Green, and Graefe (this issue) raise aspirations that reliable forecasting is possible. They advocate a conservative approach that mainly involves extrapolating from the present. We comment on three issues that relate to their proposed Golden Rule: its scope of application, the importance of highly improbable events, and the challenges of communicating forecasts.

Article, Metadata only
Learning from experience in nonlinear environments: Evidence from a competition scenario (Elsevier, 2015-09)
Soyer, Emre; Hogarth, R. M.; Business Administration
We test people's ability to learn to estimate a criterion (the probability of success in a competition scenario) that requires aggregating information in a nonlinear manner. The learning environments faced by experimental participants are kind in that they are characterized by immediate, accurate feedback involving either naturalistic outcomes (information on winning and/or ranking) or the normatively correct probabilities. We find no evidence of learning from the former and modest learning from the latter, except that a group of participants endowed with a memory aid performed substantially better. However, when the task is restructured so that information should be aggregated in a linear fashion, participants learn to make more accurate assessments. Our experiments highlight the important role played by prior beliefs in learning tasks, the default status of linear aggregation in many inferential judgments, and the difficulty of learning in nonlinear environments even in the presence of veridical feedback.
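Why is a competition criterion like the one above nonlinear? A hypothetical model (not the paper's experimental task) makes the point: if a contestant beats any single rival with probability p, the chance of winning against n independent rivals is p raised to the power n, which shrinks multiplicatively rather than by a fixed amount per added rival, so a linear intuition drifts further from the truth as n grows.

```python
# Hypothetical competition model, not the task used in the paper:
# a contestant beats any single rival with probability p, and winning
# the competition requires beating all n rivals independently.

def win_probability(p: float, n: int) -> float:
    """True criterion: p ** n, nonlinear in the number of rivals n."""
    return p ** n

# A linear rule of thumb ("each extra rival costs the same fixed amount
# of chance") misestimates the criterion more and more as n grows:
for n in (1, 2, 5, 10):
    linear_guess = max(0.0, 0.8 - 0.05 * (n - 1))  # arbitrary linear rule
    print(f"n={n:2d}  true={win_probability(0.8, n):.3f}  linear={linear_guess:.3f}")
```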
Editorial, Metadata only
A picture's worth a thousand numbers (Harvard Business Publishing, 2013-06)
Hogarth, R. M.; Soyer, Emre; Business Administration
The article examines research on humans' difficulty in understanding probability and the value of graphic representations in improving that understanding. Topics include research by "Harvard Business Review" into statistical computer simulations, the hypothesis that probability theory had been helping humans analyze information for years before it was formally defined, and how graphs and simulations could improve businesses' decision-making processes.

Article, Open Access
Providing information for decision making: Contrasting description and simulation (Elsevier, 2015-09)
Hogarth, R. M.; Soyer, Emre; Business Administration
Providing information for decision making should be like telling a story. You need to know, first, what you want to say; second, whom you are addressing; and third, how to match the message and audience. However, data presentations frequently fail to follow these simple principles. To illustrate, we focus on presentations of the probabilistic information that accompanies forecasts. We emphasize that the providers of such information often fail to realize that their audiences lack the statistical intuitions necessary to understand the implications of probabilistic reasoning. We therefore characterize some of these failings before conceptualizing different ways of informing people about the uncertainties of forecasts. We discuss and compare three types of methods: description, simulation, and mixtures of description and simulation. We conclude by identifying gaps in our knowledge about how best to communicate probabilistic information for decision making and suggest directions for future research.

Article, Metadata only
The two settings of kind and wicked learning environments (Association for Psychological Science, 2015-10)
Hogarth, R. M.; Lejarraga, T.; Soyer, Emre; Business Administration
Inference involves two settings: in the first, information is acquired (learning); in the second, it is applied (predictions or choices). Kind learning environments involve close matches between the informational elements in the two settings and are a necessary condition for accurate inferences. Wicked learning environments involve mismatches. This conceptual framework facilitates identifying sources of inferential errors and can be used, among other things, to suggest how to target corrective procedures. For example, structuring learning environments to be kind improves probabilistic judgments. Potentially, it could also enable economic agents to exhibit maximizing behavior. (An illustrative simulation of this distinction appears after the final entry below.)

Article, Metadata only
Using simulated experience to make sense of big data (Massachusetts Institute of Technology, 2015)
Hogarth, R. M.; Soyer, Emre; Business Administration
Simulated experience can help companies communicate the results of data analysis to decision makers. Analysts' conclusions have been found to differ from what decision makers take away from them, and complex statistical information has at times proved misleading.
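The kind/wicked framework in "The two settings of kind and wicked learning environments" above, like the filters described in "Fooled by experience", lends itself to a small simulation. The sketch below is an illustrative toy example, not the authors' materials: a learner estimates a base rate of venture success from feedback that is either complete (kind) or censored so that most failures go unobserved (wicked). All rates in it are hypothetical.

```python
import random

rng = random.Random(0)
TRUE_SUCCESS_RATE = 0.2    # hypothetical base rate of venture success
FAILURE_REPORT_RATE = 0.1  # wicked setting: only 10% of failures surface

# Outcomes for 1,000 ventures (True = success).
outcomes = [rng.random() < TRUE_SUCCESS_RATE for _ in range(1_000)]

# Kind environment: every outcome reaches the learner.
kind_sample = outcomes

# Wicked environment: successes are always visible, failures rarely are.
wicked_sample = [o for o in outcomes if o or rng.random() < FAILURE_REPORT_RATE]

def estimate(sample):
    """Naive frequency estimate from whatever the learner observed."""
    return sum(sample) / len(sample)

print(f"true success rate: {TRUE_SUCCESS_RATE:.2f}")
print(f"kind estimate:     {estimate(kind_sample):.2f}")   # close to the truth
print(f"wicked estimate:   {estimate(wicked_sample):.2f}")  # heavily inflated
```

The mismatch between the setting where the information is acquired (censored feedback) and the setting where it is applied (judging a new venture's odds) is what makes the environment wicked; restoring complete feedback restores the match, and with it the accuracy of the estimate.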