Monday, May 31, 2004

A File Integrity Verifier for Windows

A useful, albeit limited tool - Microsoft's File Checksum Integrity Verifier, sort of an extremely crippled and unsupported Tripwire for Windows users. Still, it's better than nothing, particularly if administrators have the good sense to burn the resulting information to write-once removable media. Now that there are more than a few rootkits out in the wild for NT-based platforms, a tool like this is sorely needed.
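
For the curious, the core idea behind such a tool is simple enough to sketch in a few lines of Python: hash every file under a directory, store the results as a baseline, and diff against that baseline later. This is only an illustration of the approach, not of how FCIV itself works; the file names, output format and choice of hash algorithm below are placeholders of my own.

    import hashlib, json, os, sys

    def hash_file(path, algo="sha1"):
        # Hash in fixed-size chunks so large files don't exhaust memory.
        h = hashlib.new(algo)
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def snapshot(root):
        # Map every file under 'root' to its checksum.
        result = {}
        for dirpath, _, names in os.walk(root):
            for name in names:
                full = os.path.join(dirpath, name)
                result[full] = hash_file(full)
        return result

    if __name__ == "__main__":
        # Usage: python verify.py <directory> <baseline.json>
        root, baseline_path = sys.argv[1], sys.argv[2]
        current = snapshot(root)
        if not os.path.exists(baseline_path):
            with open(baseline_path, "w") as f:
                json.dump(current, f, indent=2)
            print("baseline written to", baseline_path)
        else:
            with open(baseline_path) as f:
                baseline = json.load(f)
            for path in sorted(set(baseline) | set(current)):
                if path not in baseline:
                    print("NEW     ", path)
                elif path not in current:
                    print("MISSING ", path)
                elif baseline[path] != current[path]:
                    print("CHANGED ", path)

Run it once to write the baseline, copy the baseline somewhere an intruder can't quietly rewrite it (that write-once CD-R), then run it again later and compare.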

SeDebugPrivilege vs. Debugger Users

It appears that membership in the "Debugger Users" group does not confer the dangerous SeDebugPrivilege on a user; it merely enables use of the Visual Studio .NET debugger (as opposed to CorDbg, which requires no special group membership to use). Members of the "Debugger Users" group can still only debug programs they've launched themselves, unless of course they're also members of the Administrators group. Here's a useful MSDN article which gives further details on the issue.
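
For those wondering what the privilege itself looks like from a programmer's point of view, here's a rough Python/ctypes sketch of how a debugger asks Windows to switch on SeDebugPrivilege in its access token. The crucial point is that AdjustTokenPrivileges can only enable a privilege the token already holds - which Administrators have and mere "Debugger Users" do not. This is my own illustration, not something taken from the MSDN article.

    import ctypes
    from ctypes import wintypes

    SE_PRIVILEGE_ENABLED    = 0x00000002
    TOKEN_ADJUST_PRIVILEGES = 0x0020
    TOKEN_QUERY             = 0x0008
    ERROR_NOT_ALL_ASSIGNED  = 1300

    class LUID(ctypes.Structure):
        _fields_ = [("LowPart", wintypes.DWORD), ("HighPart", wintypes.LONG)]

    class LUID_AND_ATTRIBUTES(ctypes.Structure):
        _fields_ = [("Luid", LUID), ("Attributes", wintypes.DWORD)]

    class TOKEN_PRIVILEGES(ctypes.Structure):
        _fields_ = [("PrivilegeCount", wintypes.DWORD),
                    ("Privileges", LUID_AND_ATTRIBUTES * 1)]

    advapi32 = ctypes.WinDLL("advapi32", use_last_error=True)
    kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
    kernel32.GetCurrentProcess.restype = wintypes.HANDLE

    def try_enable_debug_privilege():
        # Open our own token with enough access to adjust privileges.
        token = wintypes.HANDLE()
        if not advapi32.OpenProcessToken(kernel32.GetCurrentProcess(),
                                         TOKEN_ADJUST_PRIVILEGES | TOKEN_QUERY,
                                         ctypes.byref(token)):
            raise ctypes.WinError(ctypes.get_last_error())

        luid = LUID()
        if not advapi32.LookupPrivilegeValueW(None, "SeDebugPrivilege",
                                              ctypes.byref(luid)):
            raise ctypes.WinError(ctypes.get_last_error())

        tp = TOKEN_PRIVILEGES()
        tp.PrivilegeCount = 1
        tp.Privileges[0].Luid = luid
        tp.Privileges[0].Attributes = SE_PRIVILEGE_ENABLED

        if not advapi32.AdjustTokenPrivileges(token, False, ctypes.byref(tp),
                                              0, None, None):
            raise ctypes.WinError(ctypes.get_last_error())
        # AdjustTokenPrivileges reports "success" even when it enabled
        # nothing, so the real verdict is in the last-error value.
        return ctypes.get_last_error() != ERROR_NOT_ALL_ASSIGNED

    if __name__ == "__main__":
        print("SeDebugPrivilege enabled:", try_enable_debug_privilege())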

Sunday, May 30, 2004

An Attack on the Needham-Schroeder Authentication Protocol

An interesting paper by Gavin Lowe which details an attack on the Needham-Schroeder public-key protocol, a close relative of the shared-key protocol underlying Kerberos. What Lowe outlines is basically a man-in-the-middle attack that relies on relaying nonces between two interleaved runs of the protocol. The message of the paper is as follows:

If the identity of a principal is essential to the meaning of a message, it is prudent to mention the principal's name explicitly in the message.
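
To make the attack concrete, here's a toy Python sketch of the message flow Lowe describes, with public-key encryption faked by tagging each message with the intended recipient's key. The names and structure are mine, chosen purely for illustration; the paper has the real notation.

    # Toy model of Lowe's man-in-the-middle attack on the Needham-Schroeder
    # public-key protocol. "Encryption" is faked by tagging each message with
    # the recipient's key; the point is the message flow, not the crypto.

    def enc(key, *payload):            # stand-in for public-key encryption
        return ("enc", key, payload)

    def dec(key, box):                 # only the key's owner can open the box
        tag, k, payload = box
        assert k == key, "cannot decrypt: wrong key"
        return payload

    Na, Nb = "nonce-A", "nonce-B"

    # A innocently starts a run with the intruder I, who relays to B.
    msg1   = enc("pk_I", Na, "A")               # 1.  A -> I    : {Na, A}_pkI
    relay1 = enc("pk_B", *dec("pk_I", msg1))    # 1'. I(A) -> B : {Na, A}_pkB

    na_seen, claimed_sender = dec("pk_B", relay1)
    msg2 = enc("pk_A", na_seen, Nb)             # 2'. B -> I(A) : {Na, Nb}_pkA
    # I cannot read msg2, but simply forwards it to A untouched (2. I -> A).

    na_back, nb_for_a = dec("pk_A", msg2)
    assert na_back == Na                        # A's freshness check passes
    msg3   = enc("pk_I", nb_for_a)              # 3.  A -> I    : {Nb}_pkI
    relay3 = enc("pk_B", *dec("pk_I", msg3))    # 3'. I(A) -> B : {Nb}_pkB

    nb_back, = dec("pk_B", relay3)
    assert nb_back == Nb                        # B's check passes too
    # B now believes it has authenticated A, yet it was the intruder who
    # drove the whole exchange and who holds both nonces.
    print("intruder impersonated A to B; nonces leaked:", Na, Nb)

    # Lowe's fix follows the moral quoted above: message 2 becomes
    # {Na, Nb, B}_pkA, naming the responder explicitly, so A would notice
    # the reply claims to come from B rather than from I.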

Protecting Against Kerberos Attacks on Windows

WindowsDevCenter is carrying an interesting article on Kerberos security in the Windows context.

The closer I look at Kerberos, the more I'm driven to wonder why the SixApart people didn't look at this as a template for their efforts, both as an example of how federated authentication systems can be implemented and as a guide to what not to do. Did the Trotts bother to read the extensive literature on this topic beforehand, or did they fall victim to the "NIH" syndrome that is so common amongst hackers and code writers?

ADDENDUM: I've just discovered this intriguing post by Krishnan Nair Srijith, author of the OpenPGPComments plugin for MT, in which he subjects the TypeKey protocol to a fair bit of rigorous scrutiny; like me, he also relates the TypeKey proposal to the Kerberos protocol, and the conclusion he comes to is fairly unsurprising:

In simple English, if you look at the steps involved in the process, you will see that as long as anyone can get hold of the URL sent from your browser to blog server after you have entered your username and password, that person can easily impersonate you in all the blogs that accept TypeKey authentication, for a fixed period of time (which can be a long period). Since this step occurs without any encryption involved, anyone with the right tools and in the right place can easily sniff the packets to get the "required" information. After that - *goodbye security*.

As I've said before, TypeKey is merely a way of making MT users feel that "something is being done", and as a practical matter it is actually worse than useless, insofar as it gives readers a false sense of certainty that commenters are who they claim to be.

Monday, May 24, 2004

Graphical Composition in Avalon

A Slashdot entry led me to this site, which has a collection of PowerPoint presentations on Avalon's graphical composition system (see I, II, III and IV). The more I read about Avalon, the more it feels as if Bill Gates and company are attempting to realize my original vision of the PC as the ultimate extensible search/visualization application: put Avalon together with WinFS, and that is precisely what you have.

If the Redmond people can realize what they've set out to try to accomplish here, I have to say that Google's goose will be well and truly cooked, assuming Google intends to stick with just the web search interface. Visualization is the key to better information retrieval in the long run, as we are visually-oriented primates. Of course, I assume Brin and Page realize this much, so I expect to see some more desktop-oriented applications issuing from their firm sooner rather than later.

I'm starting to get a lot more excited about the potential of Longhorn to be a revolutionary breakthrough than I once was; if the vision is realized as currently planned, this will not be a mere OS X clone with some extra garish tweaks to it. This being Microsoft, though, I'll wait and see how the implementation matches up to the vision before I start swooning in admiration - the repeated delays in the Longhorn release schedule are a strong indicator to me that they're having serious difficulties going from the vision thing to implementation.

PS: The discussion at the end of this article between a Microsoft developer and a bunch of commenters about the differences between Avalon and Quartz is also interesting.

Thursday, May 20, 2004

They Don't Make Games Like They Used To

This interview with Tim Sweeney of Epic Games has some eye-watering numbers in it; it's a long, long way from the days when an Amiga 500, with its 512K of RAM and provision for 32 colors on-screen from a palette of 4096, was considered the gaming machine to have.

Tim Sweeney: For the third generation Unreal Engine, we are building two versions of every model in our game. We are building a source model with several million polygons, between 2 and 6 million polygons. We use that model for all the lighting detail on the mesh. Then we go to the in game version which is usually about 10,000 polygons. So we get the lighting detail of the full high polygon mesh baked down into a normal map that gets rendered in game on a low polygon mesh. The normal maps are typically 2k by 2k.


BU: UT2004 required 5.5 gigs of hard drive space to install. This has got to be a strain on a lot of people's systems. So you're talking 2048 x 2048 texture sets, what kind of system and memory is this next game going to take?

TS: Well, we are aiming at the kind of PC that we think will be mainstream in 2006. We will also be able to scale it down. Basically DirectX 9 cards will be minimum spec, so any DirectX 9 card shipping today will be capable of running our game, but probably at reduced detail. If you only have a 256 meg video card you will be running the game one step down, whereas if you have a video card with a gig of memory then you'll be able to see the game at full detail.

1 GB of RAM on a video card? I don't think such monstrous cards even exist at present, nor will they for quite a few years yet! And here I was thinking how l33t I was, all because my video card had an awe-inspiring 128 MB of RAM on it.
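
Still, a bit of back-of-the-envelope arithmetic shows where such an appetite for memory comes from. The quick Python sketch below assumes uncompressed 32-bit textures and ignores compression and mipmapping, so the numbers are deliberately pessimistic:

    # Rough arithmetic: what does a 2048 x 2048 texture cost in video memory?
    # Assumes uncompressed 32-bit RGBA with no mipmaps; shipping engines use
    # compressed formats, so treat these as worst-case figures.
    bytes_per_texture = 2048 * 2048 * 4
    mb_per_texture = bytes_per_texture / (1024 ** 2)   # 16 MB apiece

    print(f"one 2048x2048 RGBA texture: {mb_per_texture:.0f} MB")
    print(f"textures fitting in 256 MB of VRAM: {int(256 / mb_per_texture)}")
    print(f"textures fitting in 1 GB of VRAM:   {int(1024 / mb_per_texture)}")
    # A full mipmap chain adds about a third again, and every material wants
    # a normal map on top of its colour map, so the budget fills up fast.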

One thing though: for all the gigantic numbers being thrown around by game designers these days, I must say that gorgeous graphics aside, the quality and variety of gameplay available nowadays seems to have gone downhill since the days when I was a game-addicted teenager, and I don't think it's really a case of fuddy-duddy nostalgia on my part either. There are emulators out there for the Amiga, the SNES, the Sega Megadrive, and nearly every other popular system right up to the Sega Dreamcast; I've played around with these emulators in the recent past, and I find that the games available to run on them really are easier to get into than most of the current fare, and more rewarding of repeated play to boot.

From where I stand, it's looking more and more as if the entire gaming universe for today's PCs and top-tier consoles has been reduced to two basic varieties of games, the shoot-em-up and the race-car contest. Like Hollywood with its addiction to sequels, and its penchant for substituting ever more impressive special effects for innovative plots wherever possible, the impact of exploding development budgets on the gaming business seems to have been to encourage development houses ever more to play it safe, and go along with the same old hackneyed ideas of yore, tricked out in more and more impressive eye-candy.

Saturday, May 15, 2004

Freedom 0 [dive into mark]

Mark Pilgrim explains why he's moved to WordPress - software freedom means never waking up to discover you're being shanghaied into forking over outrageous sums of money.

Wednesday, May 12, 2004

FreeCache

This new service just goes to show that I was right on the money when I said that the ever-falling price of storage would enable a wave of new services; here's one already, and from the PetaBox providers to boot.

Internet Archive: Petabox

What better illustration of the breakneck speed at which storage capacity is increasing can there be than this? 100 terabytes per rack!

I'm convinced that this trend will have revolutionary implications for the computing industry. It's now pretty clear that Google's 1 GB limit on Gmail accounts isn't by any means as outrageous as it seemed at first blush, even in the extremely unlikely event that most users get anywhere near that limit after several years of usage. I expect a whole flood of internet-centred applications that take advantage of this trend to follow on the heels of Gmail - Scott McNealy and his gang were a little premature in claiming that "the network is the computer", but they were definitely on the right track. Microsoft's Longhorn, for all the eye-candy it promises, seems ever more to be a case of fighting the previous battle - which in this case means aping Apple's OS X.
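
A bit of crude arithmetic shows why a 1 GB mailbox stops looking extravagant at these densities. The sketch below ignores replication and indexing overhead, and the 5% average-fill figure is purely an assumption for illustration:

    # Crude arithmetic relating rack-scale storage to 1 GB mailboxes.
    # Ignores replication, redundancy and indexing overhead entirely.
    rack_capacity_gb = 100 * 1024          # 100 TB per PetaBox rack
    mailbox_quota_gb = 1

    full_mailboxes_per_rack = rack_capacity_gb // mailbox_quota_gb
    print(f"completely full 1 GB mailboxes per rack: {full_mailboxes_per_rack:,}")

    # If the typical account only ever uses a small fraction of its quota,
    # the same rack stretches much further still.
    average_fill = 0.05                    # assume 5% of quota actually used
    print(f"accounts per rack at {average_fill:.0%} average fill: "
          f"{int(full_mailboxes_per_rack / average_fill):,}")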

Tuesday, May 11, 2004

What This Blog is About

I've felt the need for some time for a place where I can express my views on various technology-related matters that have little or nothing to do with the international-affairs focus of Foreign Dispatches. If I have any heavily technical stuff to discuss, here's where I'll be doing so.