Indeed, there are a few different dimensions to consider when planning to scale your BI environment. For instance,...
there is the "mixed workload" factor -- query processing versus ETL. Adding processing power and memory is the most effective way to address incremental data growth. It's important to realize, however, that there's a point at which the physical table structures -- not to mention the physics of disk I/O -- prevent queries from scaling linearly (i.e., the assumption that supporting x percent more data simply requires x percent more storage and CPU). Because ETL is much more I/O-intensive than the standard query processing of most BI environments, simply adding more storage and CPU can be impractical. Bottom line: you can't just throw more horsepower at more data.
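To make that linearity assumption concrete, here is a minimal sketch in Python contrasting the naive "x percent more data, x percent more hardware" rule with an I/O-bound ETL window that grows faster than the data. The baseline figures and the io_penalty exponent are purely illustrative assumptions, not measurements from any real system.

```python
# A minimal sketch of the naive linear capacity-planning rule described
# above, and why it breaks down for I/O-bound ETL work. All numbers
# (baselines, the io_penalty exponent) are illustrative assumptions.

def linear_capacity(base_storage_tb: float, base_cpu_cores: int,
                    data_growth_pct: float) -> tuple[float, float]:
    """Naive rule: x percent more data -> x percent more storage and CPU."""
    factor = 1 + data_growth_pct / 100
    return base_storage_tb * factor, base_cpu_cores * factor

def etl_window_hours(base_hours: float, data_growth_pct: float,
                     io_penalty: float = 1.3) -> float:
    """Illustrative model: ETL time grows super-linearly with data volume
    because it is I/O-bound (io_penalty > 1 is an assumption)."""
    factor = 1 + data_growth_pct / 100
    return base_hours * factor ** io_penalty

if __name__ == "__main__":
    # Plan for 50 percent data growth from a 10 TB / 32-core baseline.
    storage, cpu = linear_capacity(10.0, 32, 50)
    print(f"Linear plan: {storage:.1f} TB, {cpu:.0f} cores")
    # The nightly ETL window outgrows the proportional hardware add.
    print(f"ETL window: {etl_window_hours(4.0, 50):.1f} h "
          f"(linear scaling would predict {4.0 * 1.5:.1f} h)")
```

Even in this toy model, the batch load window outruns proportional hardware growth long before query throughput does, which is the practical reason ETL forces a different scaling strategy than simply adding storage and CPU.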