Friday, January 21, 2005

 

Processor Developers Hit A Wall

There's been a lot of well-founded speculation recently in the IT industry that the phenomenal growth in the power of the CPU chip has started to tail off. A good article is here and another is here. Apparently, extrapolating from previous growth, by now we should all be using 4GHz chips. But Intel's fastest chip today runs at 3.2GHz.

I have a copy of the world's first spreadsheet, VisiCalc. Thanks to Microsoft's philosophy of backwards compatibility, I can run this 27K (yes, K, not MB) executable on my Windows XP machine.

Think about that. The first 'killer app' - the one that gave sceptical managers and executives the impetus to bypass their IT (or Data Processing, as it was then) departments, go out and buy microcomputers like the Apple II, and crunch their own numbers without waiting on the DP department to do it for them - fit in 27K. It would run - and let the user accomplish real work - on a PC with 64K of memory or an Apple II with 48K and a 1MHz 6502 processor.

It took true skill and insight to develop applications for platforms like that. The first version of Microsoft BASIC that I used ran on a 6502-based computer. It fit into 8K of ROM and would work in 4K of RAM. Disassembling that ROM revealed tricks like squeezing in a few extra instructions by using the (fairly redundant) BIT instruction and hiding an opcode in its operand. If you wrote

LABEL BIT $00A9

the three bytes assembled would be $2C $A9 $00. In the normal flow of control, all that would happen would be that the processor flags would change. But if you jumped to LABEL+1 byte, the instruction hidden there (LDA #$00, bytes $A9 $00) would be executed.
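If it helps to see it run, here's a toy model of the trick - in Python, purely for illustration (the real thing was hand-assembled 6502) - interpreting just enough opcodes to show two entry points sharing the same bytes:

```python
# A toy interpreter for just enough of the 6502 to show the trick: the bytes
# $2C $A9 $00 are one instruction (BIT $00A9) if you enter at the first byte,
# but LDA #$00 if you enter one byte later.
def run(memory, pc):
    a = None  # accumulator
    while pc < len(memory):
        op = memory[pc]
        if op == 0xA9:            # LDA #imm - load accumulator
            a = memory[pc + 1]
            pc += 2
        elif op == 0x2C:          # BIT abs - only sets flags; skip 2-byte operand
            pc += 3
        elif op == 0x60:          # RTS - stop
            break
        else:
            raise ValueError(f"unknown opcode {op:#x}")
    return a

code = [
    0xA9, 0x80,        # offset 0: LDA #$80
    0x2C, 0xA9, 0x00,  # offset 2: BIT $00A9 - its operand hides LDA #$00
    0x60,              # offset 5: RTS
]
print(hex(run(code, 0)))  # enter at offset 0: BIT swallows the hidden LDA, A = 0x80
print(hex(run(code, 3)))  # enter at offset 3: the hidden LDA #$00 runs, A = 0x00
```

Two entry points, one of which costs no bytes at all beyond the BIT opcode - that's the kind of economy 8K of ROM demanded.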

That trick saved memory, but the real challenge was getting sufficient performance - in other words, fast response times - out of the slow computers of the day. That took a great deal of knowledge and skill on the part of the application designer. "Real Programmers", in other words.

Part of the boom in technology over the past fifteen years can be attributed to the lowering of that bar: prior to around 1991, if you wanted to develop a serious business application, you needed a great deal of skill as an application developer (a programmer).

Now, everybody uses the "Dummies" books and code generation facilities in packages like Access, Excel and Visual Studio, in conjunction with pre-built components, and calls themselves a programmer. The programmer's role has been dumbed down of late.

Today people want to use their computers for processor-intensive tasks, such as full-motion video editing, that were unthinkable ten years ago. It's ironic that precisely at the point where we've come up with applications that can use all the power the processor manufacturers have given us over the past five years, tempting us with all the possibilities, the manufacturers hit a wall.

Getting adequate performance going forward will mean knowing how to exploit the parallelism in the new generations of multicore chips. Multithreaded and parallel programming takes a good deal of knowledge and experience to get right.
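The shape of the problem can be sketched in a few lines of Python (a modern illustration, not a 2005 artifact): split the work into independent chunks, give each thread its own result slot so the threads never contend, and combine at the end. Getting even this simple pattern right - no shared mutable state, join before reading - is exactly the kind of thing that trips up casual programmers.

```python
import threading

def parallel_sum(data, workers=4):
    """Sum `data` by splitting it into chunks, one per worker thread."""
    chunk = (len(data) + workers - 1) // workers
    results = [0] * workers

    def work(i):
        # Each thread writes only to its own slot, so no lock is needed.
        results[i] = sum(data[i * chunk:(i + 1) * chunk])

    threads = [threading.Thread(target=work, args=(i,)) for i in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # wait for every worker before combining results
    return sum(results)

print(parallel_sum(list(range(1_000_000))))  # same answer as sum(range(1_000_000))
```

Change `results[i] = ...` to an unguarded shared accumulator and the program still runs - it just sometimes gives the wrong answer, which is why this takes experience.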

Maybe the development tools and operating systems we use will hide all the complexity. But maybe, just maybe, we are on the verge of the programmer needing to be a wizard once again. To build new development tools for the masses, if nothing else.

