According to Forrester, the average company's data warehouse today is somewhere between one and ten terabytes in size. So what happens to analytics over the next decade, as the norm moves toward the petabyte range? How will this hunger for precise analysis, combined with a flood of raw new data, set the stage for powerful, advanced analytics outcomes?
New architectures for data and logic processing are ushering in a game-changing era of advanced analytics. These new approaches support massive data sets to produce powerful insights and analysis, yet at unprecedented price-performance. As we enter 2010, enterprises are incorporating more diverse forms of data into their business intelligence (BI) activities. They're also diversifying the types of analysis they expect from these investments.
At the same time, more kinds and sizes of companies and government agencies are seeking to deliver ever more data-driven analysis for their employees, partners, users, and citizens. It boils down to giving more communities of participants what they need to excel at whatever they're doing. By putting analytics into the hands of more decision makers, huge productivity wins across entire economies become far more likely.
But such improvements won't happen if the data can't effectively reach the application's logic, if the systems can't handle the massive processing scale involved, or if the total costs and complexity are too high.