Finding a pattern: "The essence of intelligence analysis is you don't know what you'll need to be looking for tomorrow morning" - a rather interesting article on the increased (perceived) need for good algorithms for finding and understanding relevant information. But the challenge still remains: how do you predict something that has not happened from something that is happening? As more and more data becomes available for analysis, does that make it easier to build connections - or just too easy? If I buy a donut and a Coke every morning before there is a bus bombing in Israel - does that make me part of the event? Or even a predictor of it?
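To make that worry concrete, here is a minimal sketch with entirely made-up numbers (the habit counts, days, and threshold are my assumptions, not the article's): track enough unrelated daily habits, and some of them will line up with any set of events purely by chance.

```python
# Minimal sketch, entirely synthetic data: with enough unrelated
# "signals", some will coincide with any event purely by chance.
import random

random.seed(1)
DAYS = 365
N_HABITS = 10_000  # unrelated daily habits being tracked

# Twelve arbitrary "event" days in the year.
event_days = set(random.sample(range(DAYS), 12))

false_predictors = 0
for _ in range(N_HABITS):
    # Each habit independently occurs on ~12 random days a year.
    habit_days = set(random.sample(range(DAYS), 12))
    # Call a habit a "predictor" if it occurred on at least 3 of the
    # 12 event days - pure coincidence by construction.
    if len(habit_days & event_days) >= 3:
        false_predictors += 1

# Typically prints a count in the dozens: dozens of donut-and-Coke
# "predictors" out of 10,000, none of them meaning anything.
print(false_predictors, "habits look like predictors by chance alone")
```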
"could grow to 120 terabytes" (same article) - how many words and numbers can you fit with that?
"Certain algorithms that might work with a couple megabytes of data don't work when you have gigabytes' worth of information" - and that is part of the problem ... the more data used, the greater the chance of "butterfly effect" - seemingly linked but actually random or chaotic connections, beyond our "control". And that could lead to a lot more "wild goose chases" than good preventive work.