I remember, back in the dark ages, when I ONLY had dial-up to the Web, that I saw DSL access as nirvana.
Gosh, when I get to that level of SPEED, watch out!
Well, I've had IT (DSL) for some time now and guess what? I'm still waiting on downloads, just like in the Old Days. The connection IS faster; it's just that the downloads are humongous. Some of the files I'm getting now are so big (how big are they?) that they have to be broken up into smaller pieces.
Okay, a lot ARE ISO images, for Linux distros of course, but everything is bursting at the seams. And a goodly portion of these goodies are going to go into our ever-growing hand-held devices. That's a lot of crap for a small bag, if you'll pardon my French.
I have a better IDEA.
If all of this software is pretty much standardized, or better yet, open-sourced, why don't we just download the stuff that changes? I would guess that 70% of web traffic is repeated data. Why?
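"Download only the stuff that changes" is basically block-level delta transfer. Here's a toy sketch of the idea in Python — my own illustration, not any real protocol (real tools like rsync use rolling checksums and much bigger blocks):

```python
import hashlib

BLOCK = 4  # tiny block size for the demo; real tools use kilobytes


def block_hashes(data: bytes) -> list:
    """Hash each fixed-size block so both sides can compare cheaply."""
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]


def delta_download(old: bytes, new: bytes):
    """Rebuild 'new' locally, fetching only blocks whose hashes differ."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    out = bytearray()
    transferred = 0
    for i, h in enumerate(new_h):
        block = new[i * BLOCK:(i + 1) * BLOCK]
        if i < len(old_h) and old_h[i] == h:
            out += old[i * BLOCK:(i + 1) * BLOCK]  # reuse what we have
        else:
            out += block                           # "download" this block
            transferred += len(block)
    return bytes(out), transferred


# Hypothetical point release: only the version string changed.
old = b"Linux distro v1.0 ......"
new = b"Linux distro v1.1 ......"
rebuilt, sent = delta_download(old, new)
```

With matching blocks skipped, only the few bytes around the version bump cross the wire instead of the whole file.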
I'll give you an example: TEXT. How many times do you think the word "the" gets transmitted? A number should replace "the". Maybe 1. That's one-third the size — we saved 66%.
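Replacing repeated words with numbers is plain dictionary coding. A minimal sketch in Python (toy example, assuming sender and receiver share the word table):

```python
def compress(text: str):
    """Replace each word with a small integer index into a shared table."""
    table = {}
    codes = []
    for word in text.split():
        # setdefault assigns the next free index the first time a word appears
        codes.append(table.setdefault(word, len(table)))
    vocab = sorted(table, key=table.get)  # index -> word, for the receiver
    return codes, vocab


def decompress(codes, vocab):
    """Receiver rebuilds the text from indices plus the shared table."""
    return " ".join(vocab[c] for c in codes)


text = "the cat sat on the mat and the dog saw the cat"
codes, vocab = compress(text)
```

The word "the" is stored once in the table and transmitted afterward as a single small number, which is the whole trick behind the 66% claim above.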
Example 2, GRAPHICS: Remember Christensen's animations in the early days of computing? He suggested the same thing for animated drawings (I'm not talking 3D movies here). On a screen with X by Y dimensions, only a small portion of the screen changes from frame to frame. Don't redraw the whole screen; only redraw the pixels that change. Good idea. Made for really snappy animations.
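That redraw-only-what-changed trick is frame deltas. A rough sketch in Python, with frames as simple 2D lists (my own toy setup, not Christensen's actual format):

```python
def frame_delta(prev, curr):
    """List only the (x, y, value) pixels that differ between two frames."""
    changes = []
    for y, (row_a, row_b) in enumerate(zip(prev, curr)):
        for x, (a, b) in enumerate(zip(row_a, row_b)):
            if a != b:
                changes.append((x, y, b))
    return changes


def apply_delta(frame, changes):
    """Receiver patches its copy of the last frame instead of redrawing."""
    out = [row[:] for row in frame]
    for x, y, value in changes:
        out[y][x] = value
    return out


prev = [[0] * 8 for _ in range(8)]        # 8x8 blank screen
curr = [row[:] for row in prev]
curr[3][4] = 1                            # one pixel "moves"
delta = frame_delta(prev, curr)
```

Sending one changed pixel instead of 64 is exactly why those animations felt so snappy.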
This is not rocket science. 1 is better than 3. A portion is better than the whole. The traffic savings are huge. And it's a better way to include the handhelds that are begging for lush media. Because of proprietary restraints, I suggest that open-sourcing these ideas and tools is a better way to solve the problem.