Help Us Create AGI - Developer
Thank you for your interest in working with LOGICMOO! We are implementing AGI with a whole new methodology devised after 30+ years of research.
If you came here from outside the project, you may want to read our Introduction first to find out where our discussion forums are.
Optional meetings are held every Thursday at 2pm PST / 5pm EST in the Discord voice channel "Scrum & Paperwork".
Every week or so we update a list of needs for the project: ToDo Here.
WHAT IS LOGICMOO?
Logicmoo is a blackboard system that allows a robot to maintain an internal representation of a metaverse.
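To give a rough feel for the blackboard idea, here is a minimal sketch in plain Prolog (all predicate names are hypothetical, not LOGICMOO's actual API): knowledge sources post facts to a shared store and react to what others have posted.

    % Minimal blackboard sketch in plain Prolog (hypothetical predicates,
    % not LOGICMOO's actual API).
    :- dynamic on_blackboard/1.

    % A knowledge source posts a fact onto the blackboard (only once).
    post(Fact) :-
        (   on_blackboard(Fact)
        ->  true
        ;   assertz(on_blackboard(Fact))
        ).

    % Another knowledge source reacts to whatever is currently posted.
    react :-
        forall(on_blackboard(sees(robot, Object)),
               post(located(Object, current_metaverse))).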
A metaverse always has at least one dimension, such as time. It can be a full 3D world, a 1D world, or have as many dimensions as it likes. Whenever the robot imagines or senses something, that something appears in a new or forked metaverse.
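As a toy illustration of forking (again with hypothetical predicates, not LOGICMOO's internal representation), sensing or imagining something could copy the current metaverse and add the new content to the copy:

    % Sketch of forking a metaverse when something is sensed or imagined.
    :- dynamic metaverse/2.   % metaverse(Id, Dimensions), e.g. [time] or [time,x,y,z]
    :- dynamic holds_in/2.    % holds_in(MetaverseId, Fact)

    fork_on_percept(Parent, Something, Child) :-
        metaverse(Parent, Dims),
        gensym(mv_, Child),                     % fresh id for the forked world
        assertz(metaverse(Child, Dims)),
        forall(holds_in(Parent, F),             % child inherits the parent's facts
               assertz(holds_in(Child, F))),
        assertz(holds_in(Child, Something)).    % ...plus the new percept/imagining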
Proto-narratives are programmatic manipulations of a metaverse. In other words, narratives are made of a series of directives that cause changes and side effects, and each side effect creates a qualia event. The “give” proto-narrative transfers possession of objects from one entity to another. “Turn-taking” is a proto-narrative in which more than one entity uses a resource, but along separate time coordinates. Some proto-narratives manipulate only ideas; those ideas are part of the metaverse.
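Here is a hedged sketch of how a proto-narrative like “give” might be encoded as directives whose side effects emit qualia events (all names are illustrative, not the real encoding):

    % Sketch of the "give" proto-narrative as a list of directives.
    :- dynamic holds_in/2.        % holds_in(MetaverseId, Fact)
    :- dynamic qualia_event/2.    % qualia_event(MetaverseId, Event)

    give(MV, Giver, Receiver, Object) :-
        run_directives(MV,
            [ del(possesses(Giver, Object)),
              add(possesses(Receiver, Object)) ]).

    run_directives(_, []).
    run_directives(MV, [D|Ds]) :-
        run_directive(MV, D),
        run_directives(MV, Ds).

    % Each directive changes the metaverse and emits a qualia event as a side effect.
    run_directive(MV, del(F)) :-
        retract(holds_in(MV, F)),
        assertz(qualia_event(MV, lost(F))).
    run_directive(MV, add(F)) :-
        assertz(holds_in(MV, F)),
        assertz(qualia_event(MV, gained(F))).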
The Primary Narrative is what the robot imagines itself to be doing: an autobiographical 1D scene of itself experiencing things. For example, when thinking about who the current president of its country is, there is a proto-narrative of an instance of “it” “remembering something”; sometimes this only involves “it remembers a speakable name,” and other times it might involve a “picture of it imagining their face.” The content of what it remembers isn’t part of the Primary Narrative; it may, however, transfer that content into a sub-narrative (sub-metaverse) that sits beside the Primary Narrative.
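As a rough, purely illustrative sketch (hypothetical predicates), the act of remembering could be recorded in the Primary Narrative while the remembered content is placed in a sub-narrative:

    % Sketch of the Primary Narrative spawning a sub-narrative while remembering.
    :- dynamic primary_event/2.   % primary_event(Time, Event)
    :- dynamic sub_narrative/2.   % sub_narrative(Id, Content)

    remember(Time, Content) :-
        gensym(sub_, Sub),
        % The act of remembering is part of the Primary Narrative...
        assertz(primary_event(Time, remembering(Sub))),
        % ...but the remembered content lives in a sub-narrative beside it.
        assertz(sub_narrative(Sub, Content)).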
Metaverses are the result of interpreting source code: directives that are cached into an “animating state”. These states are “experienced” by the agent as imaginings. Memories are stored as the source code that, when interpreted, reproduces them.
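A minimal sketch of that idea, assuming a toy directive language (the real LOGICMOO interpreters are far richer): a memory is stored as directives, and interpreting them rebuilds the animating state.

    % Sketch of interpreting stored directives into an "animating state".
    :- dynamic animating_state/2.   % animating_state(MetaverseId, Fact)

    % A memory is stored as the source code that reproduces it (toy example).
    memory(first_sunrise, [ add(sky(dark)), swap(sky(dark), sky(orange)) ]).

    replay(MemoryId, MV) :-
        memory(MemoryId, Directives),
        interpret(Directives, MV).

    interpret([], _).
    interpret([add(F)|Ds], MV) :-
        assertz(animating_state(MV, F)),
        interpret(Ds, MV).
    interpret([swap(Old, New)|Ds], MV) :-
        retract(animating_state(MV, Old)),
        assertz(animating_state(MV, New)),
        interpret(Ds, MV).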
LOGICMOO uses several interpreters that produce these Narrative metaverses at runtime. So far, only a handful of the Textual metaverses can be logged into by humans. The code that presents a Textual metaverse is generated by extensive forward chaining over the set of things that can exist in the metaverse.
During runtime, the system can assert and retract narratives and have them make thousands of changes to the metaverse. This source code is what generates metaverses.
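Here is a toy sketch of how asserting a single narrative can forward-chain into many metaverse facts (hypothetical rules, not the actual generator of the Textual metaverse):

    % Sketch: asserting a narrative forward-chains into derived metaverse facts.
    :- dynamic narrative_active/1.
    :- dynamic holds_in/2.

    assert_narrative(MV, Narrative) :-
        assertz(narrative_active(Narrative)),
        forward_chain(MV).

    % While any rule still produces a new fact, keep deriving.
    forward_chain(MV) :-
        rule(MV, Conclusion),
        \+ holds_in(MV, Conclusion),
        assertz(holds_in(MV, Conclusion)),
        !,
        forward_chain(MV).
    forward_chain(_).

    % Two toy rules; a real narrative triggers thousands of such derivations.
    rule(MV, in_room(Agent, kitchen)) :-
        narrative_active(walks_to(Agent, kitchen)),
        holds_in(MV, exists(Agent)).
    rule(MV, near(Agent, stove)) :-
        holds_in(MV, in_room(Agent, kitchen)).

For instance, after ?- assertz(holds_in(mv1, exists(lm489))), assert_narrative(mv1, walks_to(lm489, kitchen)). the derived facts in_room(lm489, kitchen) and near(lm489, stove) both appear in mv1.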
The metaverses have programming languages. For example, the directive createVt(“color”), when executed, produces a new value type that is added to the metaverse's programming language as vtColor, and then adds an extension method that any class can use to place a color on objects.
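Below is only an illustrative Prolog sketch of what such a directive could expand to; the real createVt implementation may work quite differently:

    % Illustrative sketch of what a createVt("color") directive could expand to
    % (hypothetical encoding, not the real directive).
    :- dynamic value_type/1.
    :- dynamic prop_of/3.       % prop_of(Object, ValueType, Value)
    :- dynamic set_value/3.

    createVt(Name) :-
        % Build the value-type symbol, e.g. "color" -> vtColor.
        string_chars(Name, [First|Rest]),
        upcase_atom(First, Upper),
        atomic_list_concat([vt, Upper|Rest], VT),
        assertz(value_type(VT)),
        % "Extension method": any object can now carry a value of this type.
        assertz((set_value(VT, Object, Value) :-
                     assertz(prop_of(Object, VT, Value)))).

    % Usage:  ?- createVt("color"), set_value(vtColor, ball1, red).
    %         ?- prop_of(ball1, vtColor, Color).   % Color = red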
Source Code Access for Devs
Using your GitHub username, add a new Issue at https://github.com/logicmoo/logicmoo_workspace/issues/new with the title:
Membership: @yourname
For large file pushes, also Join Us on GitLab!
Semi-Technical Overview
In the very simplest description, there is a Nomic-MU (Game World) in which the LM489 (AGI Bot) and the player interact. It was developed from the Franken-SWISH notebook.
You'll want to take a look at the Intro to LOGICMOO Modules to create AGI
Then you should see this Platform Overview.
It's important to understand that LOGICMOO is a "representational-ist" cognitive architecture that uses several multi-coding agents. For information about existing cognitive architectures see Psychologically Inspired Symbolic Cognitive Architectures. The LOGICMOO system is similarly complex.
Developer Skills and Knowledge
We'd love to have you, and if you're looking for a project that will push you to hone and learn new skills, this is probably it. (See also our guide for non-programmers.) If you want to really understand the larger how and why of every detail, you will need both a logic and a programming skillset - LOGICMOO implements a paraconsistent, open-world, defeasible, modal, temporal, epistemic, deontic logic.
You'll need to be at least comfortable with some Knowledge/Ontology Representation Language(s) and Discursive Logic(s). For the deep software development you'll need to learn Prolog, Picat, or Answer Set Programming (such as DLV).
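To give a small taste of the flavor (a toy sketch in plain Prolog; LOGICMOO's actual defeasible/modal/deontic machinery goes far beyond negation-as-failure), a defeasible rule with an exception might look like this:

    % Toy flavor of a defeasible rule in plain Prolog.
    :- dynamic bird/1, penguin/1.

    flies(X) :-
        bird(X),
        \+ exception_to_flight(X).   % the rule holds unless an exception is known

    exception_to_flight(X) :-
        penguin(X).

    % ?- assertz(bird(tweety)), flies(tweety).                      % true
    % ?- assertz(bird(opus)), assertz(penguin(opus)), flies(opus).  % false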
We have started to compile a Developer Study Guide to get you started.
Fill out our "Future Meetings" Survey
Source Repositories
Source repo https://github.com/logicmoo/logicmoo_workspace/
INSTALLER: https://logicmoo.org/gitlab/logicmoo/logicmoo_workspace#install-and-run
(Whenever a link leads to a blank document, navigate to its parent and then follow the link again from there; that will take you to the Git module that was made external.)