Big data poses a difficult challenge for traditional business intelligence (BI) tools. Most were not designed to handle terabytes to petabytes of data, or to run against both legacy systems and the growing number of purpose-built databases and file systems that don’t use ANSI-standard SQL. Given the size and complexity of these environments, most BI tools can’t easily deliver an integrated view of back-end data.
At the same time, organizations expect self-service tools that enable a wide range of business users to query, prepare, visualize, and analyze data without assistance from an information technology (IT) professional. They increasingly want the freshest data delivered in real time so they can quickly respond to problems and exploit new opportunities. And they expect these tools to provide granular security and access controls that prevent users from seeing unauthorized data or publishing reports and dashboards at will to anyone in the enterprise.
Written by Wayne Eckerson and Phil Bowermaster of Eckerson Group, this report defines the 10 most important characteristics of a big data analytics tool. Companies should evaluate their current or proposed reporting and analysis solutions against these 10 characteristics to ensure they are getting a product designed from the ground up to handle big data that arrives at speed from a variety of systems and sources, including the internet of things.