Computing with Knowledge

Back in the day, there were two distinct paths for making computers "think." The first was a family of inventions that included multi-layer perceptrons, neural nets, genetic programming and machine learning. These are forms of curve-fitting. If you have ever performed a linear regression, then you are familiar with curve-fitting: one graphs a set of points and derives a linear equation that models their behavior. Just a few years ago, higher-order curve-fitting appeared as the science behind deciding what kind of music you'd like to hear next, or what else you might want to buy. It's gotten a lot more sophisticated, and is now called "AI."
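
To make curve-fitting concrete, here is a minimal sketch of an ordinary least-squares fit, the same arithmetic behind graphing points and deriving a line (the data values are invented for illustration):

    # Curve-fitting in its simplest form: least-squares linear regression.
    # Given points (x, y), find the slope m and intercept b that minimize
    # the squared error of the line y = m*x + b.

    xs = [1.0, 2.0, 3.0, 4.0, 5.0]
    ys = [2.1, 3.9, 6.2, 8.1, 9.8]          # roughly y = 2x, with noise

    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n

    # Closed-form least-squares solution.
    m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - m * mean_x

    print(f"fitted curve: y = {m:.2f}x + {b:.2f}")   # y = 1.96x + 0.14

The "higher-order" versions differ in scale, not in kind: more parameters and stranger curves, but the same idea.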

AIs are trained. The products of training are matrices of numbers. The equations for the curves are unknown. AIs reproduce the data they were trained on when similar conditions present themselves. Generative AIs are large, intensely capable, but static. They know a great deal, but they don't reason.
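
A toy sketch of that point, with weights invented purely for illustration: once training is done, the "model" is nothing but fixed arrays of numbers, applied to the input by multiplication. There is no legible equation to inspect.

    # After training, an AI is just matrices of numbers. The weights below
    # are made up for illustration; a real model has billions of them.

    W1 = [[0.8, -0.2],
          [0.1,  0.9]]                      # first-layer weights
    W2 = [0.5, -0.4]                        # second-layer weights

    def predict(v):
        hidden = [max(0.0, sum(w * x for w, x in zip(row, v)))  # ReLU
                  for row in W1]
        return sum(w * h for w, h in zip(W2, hidden))

    print(predict([1.0, 2.0]))              # an answer, but no explanation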

The other old-school approach to making computers "think" was the product of linguistics and epistemology. Language loosely captures knowledge. Using a computer to model and interact with language was called Natural Language Processing (NLP). In the 1970s, classical NLP was a popular pursuit, and it seemed like an academic avenue that would yield intelligent computing more quickly than curve-fitting. But NLP was difficult, brittle and not scalable. Excitement waned, and other forms of artificial intelligence, such as Expert Systems, took the limelight until their prospects dimmed as well. NLP has since come to mean other things, like text summarization, translation, data mining and command-and-control, as in Siri.

Brainhat is not AI

Brainhat is a lonely adult orphan of this second approach to teaching computers to "think," thirty years in the making. It treats human language like a programming language. The problems of brittleness and scale are addressed with massive coarse-grained parallelism. A Brainhat instance can span any number of computers, each with its own knowledge domains. And that same swarm of machines can serve many users, all at once, across the globe. Where toy implementations of old-school NLP systems failed, Brainhat approaches the problem with Gestalt. More is better!
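
Brainhat's actual wire protocol isn't covered here, but the coarse-grained idea can be sketched: fan a question out to many machines, each holding its own knowledge domain, and gather whatever comes back. In this hypothetical sketch, query_domain() and the host names are stand-ins, not real Brainhat interfaces:

    # Hypothetical sketch of coarse-grained parallelism. A question is
    # sent to several machines at once, each with its own knowledge
    # domain, and the answers are collected as they arrive.

    from concurrent.futures import ThreadPoolExecutor

    DOMAIN_HOSTS = ["medicine.example", "weather.example", "history.example"]

    def query_domain(host, question):
        # Stand-in: a real deployment would make a network call to the
        # Brainhat instance running on `host`.
        return f"{host}: no answer"

    def ask(question):
        with ThreadPoolExecutor() as pool:
            futures = [pool.submit(query_domain, h, question)
                       for h in DOMAIN_HOSTS]
            return [f.result() for f in futures]

    for answer in ask("what causes thunder?"):
        print(answer)

Adding capacity or a new subject area means adding another machine to the list; no single node needs to know everything.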

Brainhat can complement AI

The two approaches aren't mutually exclusive. Brainhat and curve-fitting can work together. Brainhat can source information from ChatGPT and act as a front-end to generative AI, adding short-term memory, the ability to learn and share new information, a means to execute knowledge-based processes, a way to assume perspectives and identities, and the power to reason openly and explain its conclusions. NLP and neural networks--two old-school paths to making computers "think"--are meeting again in the mid-2020s and will be stronger together.
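
As a rough, hypothetical sketch of the front-end idea (llm() stands in for a call to a generative model such as ChatGPT; none of this is Brainhat's real interface): the NLP layer keeps a short-term memory of what it has been told, answers from that memory when it can, explains where the answer came from, and defers to the static model otherwise.

    # Hypothetical sketch: an NLP front-end adds short-term memory to a
    # static generative model. llm() is a placeholder, not a real client.

    memory = {}                      # short-term memory, learned in dialogue

    def llm(prompt):
        return "(an answer from the generative model would appear here)"

    def tell(fact, value):
        memory[fact] = value         # learning: store new knowledge at once

    def ask(question):
        if question in memory:
            # The front-end can explain its conclusion: it remembers
            # being told, rather than reproducing training data.
            return f"{memory[question]} (remembered from this conversation)"
        return llm(question)         # otherwise fall back to the static AI

    tell("the color of the ball", "red")
    print(ask("the color of the ball"))   # answered from short-term memory
    print(ask("why is the sky blue?"))    # deferred to the generative model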

An invitation to you

Over the course of 2024, I am working to release the current version of Brainhat and explain it. There is much to cover--from the basics of coding and parsing, to debugging, to networking, to the creation and sharing of knowledge. My approach will be to verify a portion of the code, address issues that may have been introduced over the years, update the web site and make a release. And then do it again, repeatedly, until the process is complete. A programmer/user manual is in the works. I will publish it when the vetting process is finished.

Thank you for your patience and please join me! I hope that you will find that creating knowledge for use with Brainhat is interesting, fun and democratic.

Copyright © 2024, Kevin Dowd.