
Showing posts from August, 2006

Linux vs. Windows and OS X: it is the economy that will be the driving factor

I had to laugh a bit this morning while enjoying my coffee: I was reading responses on Slashdot to Tom Yager's optimistic article on Apple's market share growth. While I am a Mac fan (I am writing this blog on a Mac and I have been writing Common Lisp code on a Mac since 5:30 this morning), I think that so many people miss the point of operating system dominance in the future. Step outside of pure technology for a minute and think about the global economy and where the buzz is right now: the US is educating a very small fraction of the world's new engineers and scientists compared to places like China, India, and Europe. The largest growth will be in what are now poorer countries, but expect a more level playing field in the future. There is a good reason why high technology companies are increasingly setting up research labs outside of the US: less expense and a good supply of highly motivated and educated talent. Using Linux in developing countries makes the most sense: computer science students get

Open source, free, and commercial software

Slashdot linked an article about Eric Raymond's support for proprietary binary Linux drivers. I think that he is partially right, but I would like the freedom to choose whether to use binary-only drivers on my Linux boxes. As a software developer, I look at commercial development tools as a way of life. While I find the free Ruby + Eclipse + RDT + RadRails combination to be just fine for Ruby development, for other work the commercial tools are just a cut above. I very much enjoy using Franz's Allegro Common Lisp, but it is expensive: Franz has been in business for 20 years continually improving its one product, and it shows. I am using Allegro right now for a large project - the licensing costs are a good investment for my customer. I hope to restart my knowledgebooks.com business next year (I have set it aside the last few years because the consulting market has been so hot) and I am going to try to justify to myself the cost of Franz's product because that is what I would like to use. Another

I started a new blog just on AI theory

Artificial Intelligence Theory will probably be a very low volume blog. I am planning on using it more in an essay or white-paper writing mode. One thing that I will probably write about, in addition to more practical topics like probabilistic networks and reasoning systems, is a long-time interest that started in 1976 when I bought Bertram Raphael's great book The Thinking Computer: Mind Inside Matter: computer Go programs. I spent a lot of free time in the late 1970s writing what I am quite sure was the world's first commercial Go-playing program, Honnibo Warrior. I am still very interested in trying to develop some cross between NGRAM-style hashes for local board positions and efficient storage mechanisms like AllegroCache to solve some tasks that, if not strictly required by a Go program, would at least be more like the way human experts play Go: in other words, figure out how to implement the temporal and spatial memory in the human neocortex, but in software, and efficient
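
To make the idea of hashing local board positions a little more concrete, here is a minimal Ruby sketch (my own illustration for this post, not code from Honnibo Warrior, and the 3x3 window size is an arbitrary choice): it extracts small local windows from a board, turns each into a string key, and counts them in an in-memory Hash. A persistent store like AllegroCache would play the role of the Hash in a real program.

# Minimal sketch: hash local 3x3 board patterns for fast lookup.
# Board cells: '.' empty, 'b' black, 'w' white.

# Extract every 3x3 window from a square board given as an array of strings.
def local_patterns(board)
  size = board.length
  patterns = []
  (0..size - 3).each do |row|
    (0..size - 3).each do |col|
      patterns << (0..2).map { |r| board[row + r][col, 3] }.join
    end
  end
  patterns
end

# Build an index from pattern string to the number of times it was seen.
def index_patterns(boards)
  index = Hash.new(0)
  boards.each do |board|
    local_patterns(board).each { |p| index[p] += 1 }
  end
  index
end

# Toy example on a 5x5 board:
board = ['.....',
         '..b..',
         '.bwb.',
         '..b..',
         '.....']
index = index_patterns([board])
puts index['.b.bwb.b.']   # => 1 (the 3x3 shape around the white stone)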

Writely has started accepting new accounts

I was especially interested in Writely because I wrote something similar, but simpler, myself (KBdocs.com). I tend to use a lot of computers, and I wanted to have a simple online word processor for my own use. KBdocs.com took me about three evenings to write (a really easy project, because I used Kevin Roth's JavaScript Cross-Browser Rich Text Editor). In any case, Writely seems to be well done, although it did throw an error during one operation. I expect that it will soon be excellent. I have become rather addicted to LaTeX recently, even more so than in the past, and with a handy Subversion server, web-based word processors are less of a draw to me because it only takes a few seconds to grab my working document files. That said, web applications like Gmail, Flickr, the Picasa Web photo site, Netflix (excellent!), and others take up almost all of my non-working time with computers.

Indie Game Development, AI in games

Slashdot has a discussion on Microsoft's "free" PC and Xbox 360 game development kit. There are also other good low-cost alternatives for indie development like Torque. I spent a few years doing AI game development at Angel Studios (two Nintendo games, a prototype networked PC hovercraft game, and a VR system for Disney), and although I have been working more on 'practical' AI applications since moving to Sedona 7 years ago, I still have a keen interest in gaming and AI for games. A few years ago I thought of setting up a cooperative game development community for fun and maybe some profit, but my consulting business keeps me too busy, at least for now. Another thing that keeps me from making a large investment in an independent game-making co-op is thinking about how much money was spent writing commercial games at Angel Studios: teams with dozens of professional artists, programmers, a few musicians, etc. are expensive. That said, game AI programming is great fun and surp

OpenCyc 1.0, AI in general

I noticed on Slashdot that OpenCyc 1.0 has been released. I spent a short while reading comments and realized how different my own views on AI are from those of many Slashdot commentators. To me, AI is all about writing software that makes decisions given uncertain and sometimes contradictory information. AI is about modeling problem domains and working both within that model and changing the model as new information becomes available. AI is about using problem domain models to provide human users with useful, interesting, and unexpected results by matching a model of a user's inquiry. AI is about solving the game of Go: the branching complexity of the game is so great that having perfect information is not enough. So, a tool like OpenCyc is not really a match for my personal view of what AI development is: Cyc and OpenCyc try to define an ontology of real-world common sense knowledge. I appreciate the decades of hard work, and I have myself spent many hours experimenting with earli

How much more productive is using LaTeX rather than Word or OpenOffice?

I had to start writing some software documentation this afternoon and also start a separate set of research notes. I am not sure why, but it seems like I get more work done (quicker) if I grab a LaTeX template file and just start typing, saving PDF generation for viewing until the end of each work cycle. There does not seem to be any obvious overhead in using Word or OpenOffice, but I still have the feeling that work goes faster just blasting in plain text with a little markup. I am too busy with consulting work to spend much time on it, but I started a new for-fun writing project using LaTeX and some custom code for inserting both program listings and the output from running the program examples - it is clear to me why LaTeX plus my custom code is more efficient for writing programming texts than using a word processor. BTW, my new 'for fun' writing project is "AI Programming in Ruby".
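
Here is a rough sketch of the kind of glue code I mean (the file names and the plain verbatim environment are placeholders, not my actual tool): it reads a Ruby example file, runs it, and writes a small LaTeX fragment containing the listing followed by its captured output, ready to pull into the document.

# Sketch: turn a Ruby example file into a LaTeX fragment containing the
# source listing followed by the output of actually running the example.
def listing_with_output(source_path)
  source = File.read(source_path)
  output = `ruby #{source_path} 2>&1`    # run the example, capture stdout and stderr
  ["\\begin{verbatim}", source.rstrip, "\\end{verbatim}",
   "",
   "Output:",
   "",
   "\\begin{verbatim}", output.rstrip, "\\end{verbatim}"].join("\n")
end

# Write a fragment that the main LaTeX document pulls in with \input{example_listing}
File.write('example_listing.tex', listing_with_output('example.rb'))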

Globally unique identifiers

I really enjoyed listening to Tim Bray's talk on developing the Atom specification on IT Conversations. He made a lot of interesting points, but the one that resonated most was Atom's requirement for a globally unique identifier for every feed and entry. With more syndication, we all see lots of duplicate material. Examples of duplication can readily be seen on rojo.com (they used to be my customer, and I still enjoy their site a lot) and technorati.com: we end up with many URIs that refer to the same material. It is possible to write software that detects duplicate feeds, but comparing two articles is not an inexpensive operation, and when comparing a very large number of feeds, the O(N^2) runtime is painful. I have experimented with a much less accurate algorithm: hash NGRAMs of articles and check for duplication using a hash lookup. I have found that this gives poor results - at least in my experiments. If you do partial matching of NGRAMs, you are back to O(N^2). (If anyone know
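
For what it is worth, here is a rough Ruby sketch of the NGRAM-hash experiment (the class, trigram size, and threshold are made up for illustration): each article is indexed by the hashes of its word trigrams, and a new article is flagged as a likely duplicate when it shares a large enough fraction of those hashes with an article already seen. Exact hash lookups keep this roughly linear in the number of articles, but as I said, the results were poor in my experiments.

require 'set'

NGRAM_SIZE = 3

# Hashes of all word trigrams in an article.
def ngram_hashes(text)
  words = text.downcase.scan(/[a-z0-9']+/)
  Set.new(words.each_cons(NGRAM_SIZE).map { |gram| gram.join(' ').hash })
end

class DuplicateDetector
  def initialize(threshold = 0.8)
    @threshold = threshold                    # fraction of trigram hashes that must match
    @index = Hash.new { |h, k| h[k] = [] }    # trigram hash -> article ids containing it
  end

  # Returns the id of a likely earlier duplicate, or nil if the article looks new.
  def check_and_add(id, text)
    hashes = ngram_hashes(text)
    hits = Hash.new(0)
    hashes.each { |h| @index[h].each { |other| hits[other] += 1 } }
    dup_id, count = hits.max_by { |_other, n| n }
    hashes.each { |h| @index[h] << id }
    return dup_id if count && count >= @threshold * hashes.size
    nil
  end
end

detector = DuplicateDetector.new
detector.check_and_add('feed-a/1', 'Atom requires a globally unique identifier for every feed and entry')
puts detector.check_and_add('feed-b/9', 'Atom requires a globally unique identifier for every feed and entry')
# prints: feed-a/1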

Yes, languages affect our thoughts, even in programming

The Sapir-Whorf hypothesis posits that our native language affects how we think. Computer programming languages also strongly affect how we think about, design, implement, and maintain code. I was working on some tricky code this morning that builds on some Common Lisp CLOS class libraries. The new code is really orthogonal to the existing functionality, and it seemed like a poor idea to merge the new code in with the old, especially since the old codebase will probably be used as-is for a while. I decided to start a new module (as defined by physical file organization) that adds the new functionality to the existing classes as methods on new generic functions. The new module stays small, and anyone needing to use the original codebase is not confused by extra code for functionality that they do not need. In Ruby, I like to do the same sort of thing: have different modules (as defined by physical file organization) where new orthogonal functionality is added by defining new methods on existing classes in
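
A tiny Ruby illustration of that file-level separation (class and method names are invented for the example): the original class lives in one file and is left untouched, and a second file reopens it to add the orthogonal methods.

# document.rb - the original class, left alone
class Document
  attr_reader :title, :body

  def initialize(title, body)
    @title = title
    @body  = body
  end
end

# document_reporting.rb - a separate file that reopens Document and adds
# orthogonal reporting functionality; code that does not need it never loads it
class Document
  def word_count
    body.split.size
  end

  def summary(max_words = 10)
    body.split.first(max_words).join(' ')
  end
end

doc = Document.new('Sapir-Whorf', 'Programming languages strongly affect how we design and maintain code')
puts doc.word_count    # => 10
puts doc.summary(4)    # => "Programming languages strongly affect"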