Thursday, July 30, 2009

Yahoo! and Microsoft

Yesterday, Microsoft released GPL code, and we now know that there was nothing altruistic in that. Today, they ally with Yahoo! What now?

...

Web search is a wicked problem, so one typical approach is to build multiple attempted solutions and let them evolve and compete... That was the case with the many search engines.
Now we will have only two major ones: Google and Microsoft. I don't know whether I should rejoice because the evolution has come to an end, or cry because monopoly problems will get in the way of solving the web search problem.

...

Anyway, if Yahoo! ditches the BSDs in favor of Redmond technologies, they go onto my list of companies to avoid as much as possible.

Friday, July 10, 2009

Virus-free OSes and Google Chrome OS

There has been buzz all around about Google Chrome OS. Google announced they would create a new Linux-based OS called Google Chrome OS, and they said "[they would make it] so that users don't have to deal with viruses, malware and security updates".

A lot of articles have reacted to the news, and to the claim. Bruce Schneier was quoted calling it an idiotic claim to promise a virus-free OS. He later explained that this was an answer given over the phone to a journalist, before he had read the announcement in its original wording.

Indeed, Google didn't claim they would produce a virus-free OS, and rightly so. If I am not mistaken, it is always possible to create a virus on a Turing machine or any equivalent system. And, as Schneier notes, quoting Fred Cohen (1986), it is never possible to create a perfect antivirus program.
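
For the sceptical reader, here is a toy sketch of the diagonal argument behind that result (my own illustration, not Cohen's formal proof): if a perfect detector existed, a program could consult it about itself and do the opposite of its verdict.

    def spread():
        pass  # stands in for any self-replicating behaviour

    def contrary_program():
        # is_virus() is the hypothetical perfect detector; this program
        # makes its answer wrong either way, so no such detector can exist.
        if is_virus(contrary_program):
            return      # declared a virus: stay perfectly harmless
        spread()        # declared clean: replicate anyway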

Google's claim is much more subtle and quite interesting. They said that the user would not have to deal with viruses, malware and security updates. That seems quite possible to me, or at least a clear improvement on the current situation.
My guess is that Google wants to silently push everything that's needed from the web straight onto their OS: OS patches, antivirus definition files, and why not manual patches when needed?
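
Google has published nothing about the mechanism, so purely as an illustration, here is a minimal sketch of what such a silent updater could look like; the manifest URL and the helpers are made up:

    import json
    import time
    import urllib.request

    MANIFEST_URL = "https://updates.example.com/manifest.json"  # made up

    def installed_version(name):
        return "0"  # stub: would read the version installed locally

    def apply_silently(component):
        # stub: would download and install with no user interaction
        print("updating", component["name"], "to", component["version"])

    def updater_loop():
        while True:
            # Ask the server for the latest OS patches, AV definitions
            # and application versions, and apply anything newer.
            with urllib.request.urlopen(MANIFEST_URL) as resp:
                manifest = json.load(resp)
            for component in manifest["components"]:
                if component["version"] != installed_version(component["name"]):
                    apply_silently(component)
            time.sleep(3600)  # check again in an hour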

Take the example of how Gmail handles spam. They have a set of rules which they can modify very quickly, and even tweak "by hand" for a singular case. In comparison, at the workstation level (a small sketch after this list makes the contrast concrete):
  • in a typical open source environment, you would need an update command. Even if that's quick, it would require something like:
    # apt-get update && apt-get install last-spam-filter
  • in a typical closed source environment, it would require a manual update.
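
Here is that toy sketch (the URL and rule format are made up): when the filter lives server-side, the rules are fetched at use time rather than installed, so a rule change on the server reaches every user at once.

    import json
    import urllib.request

    RULES_URL = "https://mail.example.com/spam-rules.json"  # made up

    def is_spam(message_text):
        # The rules live on the server: a rule edited "by hand" there is
        # effective for everyone on the very next call, with no apt-get
        # run and no manual update on any workstation.
        with urllib.request.urlopen(RULES_URL) as resp:
            rules = json.load(resp)
        return any(pattern in message_text for pattern in rules["patterns"])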

Here, the rules, updates, patches, and even new versions of the software come through the browser immediately. Even if the system makes no breakthrough in fundamental security, the regular updating of software alone will give an excellent increase in overall security. No more unpatched OS, unpatched browser, unpatched antivirus...

So far as I can tell, that would save companies big heaps of money on operations.

PS: That raises a lot of questions for me, such as:
  • How will MS react? Why didn't MS try to do the same?
  • How can competitors get a foot into the same market? How can any evil competitor prevent Google from getting into it?
  • Won't Google become a new empire of evil?
  • Will Google's business survive DoS attacks?
  • How will Google Chrome OS get onto PCs in the first place: shipped with them, or installed by the users themselves?
  • Where do you set the limit between what Google does remotely and what they don't do?
  • How will governments react? What about privacy of information? What about national spying issues?

Questioning the market shares of web servers

Nothing developed here, just a question: aren't the statistics about the market shares of the various web servers distorted by the use of front-end technologies such as reverse proxies, web accelerators, load balancers, and the like?
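
For instance, a quick sketch (example.org is just a placeholder host): surveys generally count the Server response header, and that header is whatever the front-end machine chooses to send.

    import http.client

    # example.org stands in for any site sitting behind a front end.
    conn = http.client.HTTPConnection("example.org", 80)
    conn.request("HEAD", "/")
    response = conn.getresponse()
    # A reverse proxy, accelerator or load balancer answers here with its
    # own Server header, masking the web server actually running behind it.
    print(response.getheader("Server"))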