Streaming games to the desktop - all you need is a browser and a connection. This model works well for email and simple apps, but surely not for streaming video games?

The way the high-end computer game model works now is that players download a client and then connect to a shared server to play. The graphics and animations are stored locally on a user's machine and rendered by the user's computer. Gaikai (and OnLive) are changing this: now the game servers send rendered frames to the user, so there is no longer any need for users to have souped-up computers. All the processing is done on remote machines - the network is king.

This matters because as we transition to Web 2.0 applications and move toward a cloud-based idea of computing, we can see where things are headed. Gaming is a multi-billion dollar business, with global video game sales surpassing movie industry income. I think our paradigm of technology use is changing - or rather, returning to the thin-client model we had in the '70s and '80s.

Now, I'm not saying there is no need for local processing power. I cut my teeth on an IBM PCjr, TRS-80s, and the TI-99/4A. I learned how to hack on those machines, and I still believe it's important for students to learn how to program. As a trend, though, the action will be in the cloud.

Which brings me to education: how can we teach our kids to create real cloud applications?
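The shift described above can be sketched in a few lines: in the streaming model, the server runs the game logic and does all the rendering, while the client only forwards input and displays whatever pixels arrive. This is a toy illustration, not how Gaikai or OnLive actually work; every function and name here is hypothetical.

```python
# Toy sketch of the server-side-rendering loop: the server does all the
# heavy work and ships finished frames; the thin client just displays them.
# All names are hypothetical stand-ins, not a real streaming API.

def render_frame(game_state):
    # Stand-in for the GPU-heavy rendering that now happens server-side.
    return bytes([game_state["player_x"] % 256] * 16)  # fake pixel data

def server_tick(game_state, player_input):
    # Apply the client's input, update game state, render the frame.
    game_state["player_x"] += player_input
    return render_frame(game_state)  # in reality: compressed video over the network

def thin_client(frames):
    # The client needs no 3D hardware: it just "draws" what arrives.
    return [len(f) for f in frames]  # stand-in for displaying pixels

state = {"player_x": 0}
frames = [server_tick(state, inp) for inp in (1, 2, 3)]
print(thin_client(frames))   # client workload is trivial
print(state["player_x"])     # all game logic stayed on the server
```

The point of the sketch: nothing on the client side touches game logic or rendering, which is exactly why the user's machine no longer needs to be souped up.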