Improved accuracy by extremizing the group forecast

The Good Judgment research team is preregistering new forecasters for the next tournament year of the Aggregative Contingent Estimation (ACE) Program.

The Good Judgment research team is based at the University of Pennsylvania and the University of California, Berkeley. The project is led by psychologists Philip Tetlock, author of the award-winning Expert Political Judgment; Barbara Mellers, an expert on judgment and decision-making; and Don Moore, an expert on overconfidence.

The team is one of five teams competing in the Aggregative Contingent Estimation (ACE) Program, sponsored by IARPA (the U.S. Intelligence Advanced Research Projects Activity). The ACE Program aims “to dramatically enhance the accuracy, precision, and timeliness of forecasts for a broad range of event types, through the development of advanced techniques that elicit, weight, and combine the judgments of many intelligence analysts.”

The team's collective forecasts combine the insights of hundreds of forecasters using statistical algorithms that, ideally, extract the most accurate signal from the noise of conflicting predictions. Analyses of data from the first tournament season suggest that prediction accuracy can be boosted by “transforming” or “extremizing” the group forecast, that is, pushing the aggregated probability further away from 0.5 and toward 0 or 1.
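To make the idea concrete, here is a minimal sketch in Python of one common extremization scheme (a power transformation of the averaged probability). It is not the team's actual algorithm; the function name, the unweighted average, and the exponent value are illustrative assumptions.

```python
import numpy as np

def extremize(probabilities, a=2.5):
    """Aggregate individual probability forecasts and push the result
    away from 0.5 ("extremize" it).

    probabilities : individual forecasts, each in (0, 1)
    a             : extremizing exponent; a > 1 sharpens the forecast.
                    The value 2.5 is illustrative, not from the source.
    """
    # Simple unweighted average, clipped away from 0 and 1 for safety.
    p = np.clip(np.mean(probabilities), 1e-6, 1 - 1e-6)
    # Power transformation: pushes p toward 0 or 1 while keeping it a probability.
    return p**a / (p**a + (1 - p)**a)

# Example: five forecasters who individually lean toward "yes".
# The extremized group forecast is noticeably closer to 1 than the raw mean of 0.65.
print(extremize([0.6, 0.7, 0.65, 0.7, 0.6]))
```

The intuition behind such transformations is that individual forecasters tend to hedge toward 0.5, so when many of them independently lean the same way, the aggregate can justifiably be made more confident than any single input.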

The Good Judgment research team is preregistering new forecasters for the next tournament year, expected to begin in June 2013. Those who register now will be placed on the waiting list and can expect to hear back in late April or early May 2013. New forecasters will be admitted in order of their position on the waiting list.

You can keep in touch with the Good Judgment Project, see related news, and follow project status updates on their blog.