When I was 15, I read one of the three books that shaped my early attitudes, perceptions, and thought process. That book was Frank Herbert's Dune. The book is about an extremely gifted 15-year-old who rises to political power because of his talent and training. But he has trouble finding people who relate to him as a person because of how special he is. As another gifted 15-year-old, the entire story resonated with me, and many sections of the book left their mark on my memory. My favorite quote isn't the popular "Fear is the mind killer..." quote, however. It's about finishing things:
Arrakis teaches the attitude of the knife--chopping off what's incomplete and saying: "Now it's complete because it's ended here."
The attitude of the knife is an attitude of firm boundaries. It is the decision of royal fiat, the parent's "because I said so". The attitude of the knife is the act of will that sacrifices short term opportunities for long term gain by ensuring that things never grow beyond the size of usefulness.
Once you've embraced the attitude of the knife, you have significantly more peace in life. If you stain your favorite shirt, you might be able to wash the stain out, but you'll be stressed until you know one way or the other. If you accidentally rip that shirt in half, there's nothing you can do-- that's the end of the shirt. Certainly you'd rather have the shirt intact (and unstained) than not, but that's no longer an option. The shirt has met its end and your only option is to throw it out. The attitude of the knife reduces many choices to just one, even if it's not the "best" one. It is the recognition of when a problem is not solvable, at least by you, despite being a real problem.
You might be wondering what this has to do with writing Artificial Intelligence. One of the bigger problems in software development is feature creep, the desire to add "just one more feature" or extend the usefulness of an existing feature. But things can easily grow out of control, beyond the ability of the project's developers. Often the features become so numerous that it's difficult for users to do what they want to do. All the features you don't care about get in the way. (Microsoft Office and Windows Vista, I'm looking at you.)
The attitude of the knife puts an end to feature creep. It says, "Sorry, that problem is out of my jurisdiction. I understand that's a real problem, and you'll have to accept that it will continue to be a problem." Because Artificial Intelligence is so complicated, there are an awful lot of problems, and it's very easy for a project to grow out of maintainability. You need the attitude of the knife to write AI, or you'll never finish.
When I first started BrainWorks, I decided on two strict boundaries. First, all AI source code runs on the game server alone, executing only when the server requests AI processing. No changes to non-AI parts of the server were allowed. Second, no modifications to the core game engine were allowed either. The core engine that ships with Quake 3 runs the most complicated AI algorithms, such as navigation, motion prediction, and item pickup. If I found issues in the engine, I was allowed to write my own version, but I could not modify the engine, even to fix bugs I found.
As fate would have it, all of the core engine algorithms for AI were dysfunctional. I did end up rewriting most of them-- weapon selection, item pickup, and motion prediction were all rewritten from scratch. But the one unsolvable problem was navigation. The core engine doesn't do a good job of navigating anything but a flat, two-dimensional map. What's worse, the core engine doesn't even give the AI code enough map information to make better navigation decisions on its own. The problem simply cannot be solved given the constraints of the system.
That's a real problem, but thankfully it wasn't my problem. There were enough other issues to tackle in writing BrainWorks. Many times it was a relief to find a bug that could immediately be ignored. Not because it wasn't a problem, but because nothing could be done about it anyway. And to be honest, most of the time the navigation decisions are actually correct. It only messes up in certain areas of certain levels. On the other hand, the core engine's item pickup and weapon selection decisions were universally wrong.
For those of you curious what two other books formed my life attitude, they are Gödel, Escher, Bach: An Eternal Golden Braid and the Christian Bible. While I'm no longer a Christian, I've still taken to heart the positive humanitarian parts of the Bible. And I've not forgotten my promise to explain why I'm no longer a Christian and what part Artificial Intelligence played in that story.