Thursday, February 7, 2008

What stopped the space race?

Why do some areas of technology succeed while other, seemingly superior technologies wither and die?
Tesla, Einstein, and Edison are gone, and no one has replaced them. Why aren't there any new great inventors?

The great sci-fi writers of the past had us colonizing the moon and maybe Mars by now. They predicted that our resources would come from asteroid mines as early as 2000. They were wrong. And yet we could have been where they predicted, if not for the cancellation of research programs.

We are at a technological impasse. We continue incremental development. The easiest example of this is the serial nature of certain video games: studios churn out a new installment every year that is technologically superior to its predecessors and introduces new features, but is not really new.

Another example comes from the computer market: Why do we use inefficient processors when far more efficient ones are under development? The x86 architecture is flawed, and there are many far superior RISC architectures available. Even the x86 vendors know this; to cope with it, both major x86 vendors have been using what are known as micro-ops for many years now. Micro-ops essentially translate x86 instructions into simpler, RISC-like operations inside the processor.

What was the last really revolutionary development that we, as the human race, accomplished? Nuclear power? That was over half a century ago. Space flight? The first Apollo landing was 1969, nearly forty years ago. Since then, what have we accomplished? The transistor was a start, then the microchip, and finally the microprocessor in the early 1970s. It could be argued that the internet is a major achievement, but the number of pitfalls it brought with it is almost enough to call it a blunder.

Within the scheme of medical technology, there have been many advancements, particularly in imaging systems, but these have been largely incremental as well. X-rays, ultrasound and MRI are the three truly new technologies that we have managed in that time. There are many derivatives of these three technologies, but the principles that govern them are unchanged from the originals. In a CT scan, x-ray radiation still passes through the body and is received for interpretation. In fMRI, energized atoms still snap back into line with a magnetic field and release a characteristic signal in the process of doing so (the base principle of MRI).

We have accomplished very little since the 1950s. Let's look at what we managed before 1950. The early 1900s saw the rise of electricity as a practical phenomenon, and the beginnings of radio communication. 1903 saw the Wright brothers manage the (arguably) first powered human flight. Vacuum tubes started the electronic era in the early 1900s. The wars of the early part of the 20th century brought about advances in surgery, transportation, submarines, and vast numbers of new ways of killing each other, culminating in the creation of the atomic bomb, the first step towards atomic power.

Many of these advances were truly revolutionary and forever changed the outlook of the human race as a whole.

So what happened in the 50s that killed off our scientific approach? What stopped the amazing innovation that we had through the last half of the 19th century and the first half of the 20th? It's a complex topic, and this is by no means a complete account, but I think it really boils down to two points: money and patents. So really just money.

You can tell when an inventor is working at something new. They'll come up with an idea, and THEN think it through. They'll often test the idea first and then work out the scientific explanation for it. In other words, offer experiments first, math later. This is practical intuition. The human mind has a massive potential for intuition, but we have been repressing this ability with the bastard child of the scientific method and capitalism. This works out to "1) Provide a business case for the idea to make money. 2) Prove mathematically that the idea will work. 3) Actually test the idea."

I doubt Tesla could have worked under those guidelines. He lived through the very beginnings of this regime, and that is probably also why a large number of his inventions never made it out of his lab.

The world operated on intuitive invention for many, many years. The simple reason is that the practical results were more important than the reason for them. The world has changed. With the stock market in control of most major corporations, trying out crazy ideas with no justification is a good way to get fired.

Back to the present. You can obtain a patent for an idea with no working model. You can then sit on the patent for 17 years from its issuance with no obligation to develop the subject of the patent. You can patent virtually any idea, and there is very little practical scrutiny of the idea. This can be seen in the vast number of patent-busting suits that the EFF has launched and won. How can we claim that this promotes innovation? If the patent holder at least had some kind of obligation to bring the invention to market, there would be some hope of patents improving the quality of life for humanity. Patents also harm the possibility of producing derivative inventions: an inventor who builds on a patented idea will likely end up paying royalties to the existing patent holder.

Patents are in place to provide financial protection to an inventor--there's that money again. An inventor who is not backed by a company likely cannot afford a patent. This means that, practically speaking, only companies can hold patents. So how would a modern-day Edison or Tesla get their ideas off the ground? They wouldn't. They'd be laughed out of a meeting because their idea was too far-fetched. Very rarely, an inventor will be hired or paid for their idea, at which point the company that buys it will own the patent, likely for a fraction of what it is worth.

The early days of the Web saw a brief change in this mentality. A good idea could sell for millions of dollars. In many cases, bad ideas sold for far more than they were worth, which is what ultimately caused the dot com bust.

Now we are in a post-dot com era and shareholders once again want proof that a concept will work before they are willing to finance its development. This is definitely the safer path, but we have lost something along the way. Call it our technological innocence. There's plenty of evidence of that in the computer security world.

So what does all this have to do with men on the moon? Well, I imagine that no one managed to produce a business case to support the early stages of development of a moon base, so they scrapped the program. And that led to the move away from development for the sake of development. There are arguments for and against development for development's sake, but I'll give an example of where it will make a difference in many people's lives.

Since the American Civil War, the technology behind prosthetic arms has stayed largely the same because there was no money in developing new ones. Some people came up with new prosthetics, but after several weeks of use, most amputees would stop using them due to the discomfort they caused. Well, now DARPA has funded development of a new generation of prosthetic arm. It's not quite a neural linkage (I think that would have been better, but DARPA specifically said non-invasive), but it's close. The new bionic arms take their commands from the user's own arm nerves, spliced into the pectoral muscle: sensors on the muscle tell the arm how to act, and vibrating motors on the muscle provide feedback to the user. If it weren't for the DARPA funding, this project would never have made it off the ground, and yet we could have had it years ago if only someone had been willing to commit the money.

Thursday, August 2, 2007

Kick the drunken hooligans out

There's been a lot of discussion about the local entertainment district's problems with people committing liquor-related offences, such as fighting or urinating in public. The common line that people are taking is that there need to be harsher fines, and ones which aren't so easy to evade.

Apparently, the fines are only $58, which many feel makes them little more than a slap on the wrist, especially considering that they appear to be easily evaded.

I have a suggestion for a more direct solution. Everyone is required to produce government-issued photo ID to enter clubs, bars, etc., or to purchase alcohol. Currently, all this does is provide proof of age. I say we step it up a notch.

Much like some driving fines come with a suspended license, I think we should make alcohol purchase a privilege, not a right. It would be simple to implement. Simply install card readers at the clubs and bars: you have to swipe your license to get in, and the card reader checks your age and whether you have any restrictions.

Next, make liquor-related offences come with alcohol restrictions. If someone fights in public and receives a ticket, they get barred, for two months, from entering clubs or bars, and from purchasing alcohol at licensed restaurants or BC liquor stores. Now, maybe that ban should only extend over the affected area, but it could also be extended to cover other liquor-related offences, such as drunk driving. Imagine that if someone were convicted of drunk driving, not only would they lose their license, they would also lose the ability to purchase their own alcohol in this province for, say, two years after they're out of jail! The consequences of a penalty like that might be enough to coerce people into slightly better behaviour.

The infrastructure is not unlike the one in place already to ensure that you don't get conflicting prescriptions from two different doctors. And the consequence scheme is not unlike the suspended license. The big difference is that liquor consumption appears to be considered a right.

Wednesday, August 1, 2007

You should all buy Macintosh computers.

Now, for those who know me, this will sound like heresy. I personally don't like Macs. The reason for this is simple: with a Mac, I don't get any variety. For hardware, I don't have the massive selection of the PC industry. For software, I don't have--scratch that, all the software I want to use is GPL'd and most of it is available for the Mac.

Now, if you're a gamer, it's a bit of a harder sell. But still, the more of an installed Mac base there is, the more game developers will be willing to think of the Mac user. So it's a chicken/egg problem. And by buying a Mac, you help to produce more chickens.

It used to be that I wasn't a fan of Macs in large part because of the OS. I found it too limiting for what I could do. Now that OSX is built on a BSD kernel, and there's a significant amount of *nix functionality exposed, that argument no longer stands.

Do I want to use a mac? Not particularly. I don't feel that you get a lot of value for the money. But I'm a techie. I can deal with weird things in windows or strange configuration problems in Linux. Most people would rather not have to deal with either of these.

On top of that, Windows is inherently insecure as most people use it. Most people run an administrator account by default because other accounts are too restrictive. On other platforms, that idea is ludicrous: you use super-user privileges when necessary and are otherwise a standard user. This is, in part, why Windows is so vulnerable to viruses.

So why am I suggesting Macs if I don't want one myself? Well, it's simple. First, I think most people--excluding gamers, due to the lack of games--would actually enjoy using one more than a standard PC. Second, it might just save you from the horror that is Windows Vista. Third, it will force software companies to start looking beyond the Windows market.

The jump from supporting a single OS platform to supporting two platforms is huge unless the software has been written to be portable from the start. The jump from two platforms to three is minimal. Added to this is the benefit that Mac OSX is not only POSIX-compliant, but much of its kernel and API is quite similar to what Linux offers.

The end result is that the more software there is written for Macs, the more software will be available for Linux. Which is what I'll be running.

Tips

Over the past two days, there have been a lot of letters to the editor surrounding the practice of tipping.

The fact is that a lot of people who are waiters/waitresses rely on tips to make up some portion of their income. Like it or not, that's how the industry is structured.

Tipping has some good points. It's one of the few ways in which customers can give monetary feedback on how well someone in the service industry has performed their duties. I actually wish that more service positions worked that way.

Commission is a similar concept, but in reverse. It gives equivalent feedback to the personnel, but the source of the feedback is the organization, rather than the customer. In this case, the feedback is based more on units sold rather than on customer experience. The two are linked, but not as intimately as tipping.

Tipping is ingrained in our culture. It needs to be looked on less as an obligation and more as an opportunity to let people know how well they're doing.

Tuesday, July 31, 2007

Control Loops

Part of my work is programming; as such, I occasionally run into a limitation of a language. Today, the language is C/C++ and the limitation is control loops.

Here's my beef:
If I want to make a loop where I do a complicated test every time I go around the loop, there are two options open to me:

First, I can use a while loop, and duplicate code.

void foo(void)
{
    int a = xyzzy();
    int b = zxc();
    int c = bop();
    while (a - b > c) {
        gazonk(a, b, c);
        /* the same three calls, duplicated so the test sees fresh values */
        a = xyzzy();
        b = zxc();
        c = bop();
    }
}
This is ugly, but it demonstrates the problem. If xyzzy(), zxc(), and bop() are functions that are not fast to execute, saving their results to variables is imperative for performance. They can't simply be combined into a single true/false variable because gazonk() needs them separately.
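(As an aside, there is a third workaround already available in today's C/C++, which I'm not counting as a real option: the comma operator can run all of the updates inside the loop condition itself. This is only a sketch, reusing the same hypothetical xyzzy(), zxc(), bop(), and gazonk() helpers from above, and it trades the duplication for a loop header that most style guides would reject.)

/* The same hypothetical helpers used in the examples above. */
int xyzzy(void);
int zxc(void);
int bop(void);
void gazonk(int a, int b, int c);

void foo_comma(void)
{
    int a, b, c;
    /* The comma operator evaluates all three updates, left to right,
       before every test, so nothing has to be duplicated in the body. */
    while (a = xyzzy(), b = zxc(), c = bop(), a - b > c) {
        gazonk(a, b, c);
    }
}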

The second option is that I can wrap the calls in a function that takes references, and use a for loop.
void bar(int &a, int &b, int &c);   /* takes references so it can fill in a, b, c */

void foo(void)
{
    int a, b, c;
    for (bar(a, b, c); a - b > c; bar(a, b, c)) {
        gazonk(a, b, c);
    }
}

void bar(int &a, int &b, int &c)
{
    a = xyzzy();
    b = zxc();
    c = bop();
}


This one's not much better. It adds an extra function call per iteration, and the call to bar() still appears twice. I'd rather have a structure where I can always perform the same "init code" before the test. An extended do-while loop, if you will. Here's an example:
foo()
{
    do {
        a = xyzzy();
        b = zxc();
        c = bop();
    } while (a - b > c) {
        gazonk(a, b, c);
    }
}

It could be used with a for loop as well; for loops could be extended into a do-for loop.
foo()
{
    do {
        a = xyzzy();
        b = zxc();
        c = bop();
    } for (int i = 0; a - b > c; i++) {
        gazonk(a, b, c, i);
    }
}

The most interesting thing about this is that there is no good reason that this structure shouldn't exist. It would implement perfectly well in assembly. The loop test would simply not be the loop entry point.
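In fact, you can already get exactly that control flow out of today's C/C++ with the classic loop-and-a-half idiom: an infinite loop with the test in the middle, which the compiler lowers to the very layout described above. Here's a sketch, again using the hypothetical helpers from the earlier examples; it works, it just isn't a dedicated construct:

/* The same hypothetical helpers used in the examples above. */
int xyzzy(void);
int zxc(void);
int bop(void);
void gazonk(int a, int b, int c);

void foo_loop_and_a_half(void)
{
    int a, b, c;
    for (;;) {
        /* the "init code" runs on every pass, before the test */
        a = xyzzy();
        b = zxc();
        c = bop();
        if (!(a - b > c))
            break;          /* the test is not the loop entry point */
        gazonk(a, b, c);
    }
}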
So where's my do-for loop? And do any other languages implement this very useful and frequently needed structure?