Friday, June 24, 2005

Estimating the Airspeed Velocity of an Unladen Swallow

One of the greatest scientific mysteries of the last 50 years may have finally been solved by Jonathan Corum, through analysis of kinematic data on swallows.

And there was much rejoicing!

Friday, June 10, 2005

Apple, IBM, Intel - Missed One!

Like everyone else who hasn't been living under a particularly large rock for the last couple of weeks, I've become aware of a couple of things.

Firstly, Apple has committed to moving its systems from PowerPC chips to chips from Intel over the next two years.
Secondly, the media is terribly excited about this.

I have both PowerPC hardware from Apple and Intel-based hardware from another vendor. It all runs Linux or OS X. I personally don't find this transition to be anything that I have to worry about. But the announcement, and the ensuing media coverage, have left me wondering a few things. Fortunately, I had a flash of realization this morning.

The media has consistently portrayed this as a move away from IBM and toward Intel by Apple. And that's not at all untrue - this does represent a win for Intel, and a vote of no confidence in the future roadmap of the G5 family.

But why, I ask, would Apple announce this now, when Intel's chips don't really offer any performance advantages over the G5? Won't changing from the G5 to the Pentium in fact be a step backward, from 64-bit to 32-bit? And, says the media, won't this hurt Apple's sales?

The answer, of course, is that it's not about IBM, and not about the G5. Not now. Not next year. Not until 2007.

Remember, Apple is starting at the low end, and won't go Intel on the high end until 2007. The G5 is only used in the Power Mac, Xserve, and iMac. And IBM doesn't make the G4. Freescale (formerly Motorola) does, and Apple's been discontented with Motorola for years - after all, that's why they went with IBM for the G5. G4 speeds haven't kept up with Intel, and memory bus speeds of G4 systems are woefully slow.

So this year, and next year, this isn't about IBM. It's about Freescale getting one last firm kick in the butt from Apple on its way out the door. And that is a huge win for buyers of Apple's low-end systems, even if the media ignores it.

Since IBM only makes the high-end G5, its arrangement with Apple probably won't end until 2007, when those systems go Intel. And that also means the G5 won't be replaced with current Pentiums - it will be replaced with whatever Intel is fabbing two years from now. Something we may very well not even know the details of until 2006.

So... this isn't just about Apple and IBM. It's about Apple and Freescale right now and for at least the next year. Sure, IBM's been told its days as a chip supplier for Apple are numbered, but Freescale is first against the wall.

Monday, June 6, 2005

My Future in Academia: Or, Why I Won't Become a Professor

Some time ago, I mentioned to someone that at least one of the professors at the local university campus had, prior to becoming a professor and teaching people about a field in which I work, held jobs in that field quite similar to my own.

She responded by enthusiastically encouraging me to become a professor too.

Honestly, though, I don't see it happening, for a few reasons.

Firstly, I can count all my college credits without running out of fingers.
Secondly, I am not known for my rigorous pursuit of academic excellence.
And finally, professors in this university system tend to be at the top of the field.

A quick read of the graduate faculty roster shows that we are blessed with quite a few graduates of places like Berkeley, CalTech, Harvard, MIT, Princeton, Cambridge, and so on. There are only two people on the graduate faculty lacking PhDs - a librarian and a computer person.

The odds of me getting a PhD at all are slim, and the odds of me getting one from a school as reputable as those listed above are, well, basically none.

So... I could take several years of my life and work on getting degrees, probably from some lesser school, and wind up as a professor, almost certainly at some little backwater college.

Or... I could stay where I am, maybe try to advance a little here and there, but remain a low-profile if somewhat vital cog in the works of a program that's widely regarded as one of the top 5 in this field, worldwide.

Fancy papers with my name on them versus being surrounded by the best and the brightest, with all the shiny toys?

The choice seems obvious to me.

Why I'm leaving Twitter.

I've stuck it out and continued participating on Twitter while Elon Musk has run it into the ground, made it progressively more inhospit...