Artie

Research in Artificial General Intelligence via Developmental Robotics

License: MIT

This repository contains the code for a robot that I am slowly working on.

This code is pre-release: you cannot yet build an Artie bot - not all of the software and hardware is ready!

The purpose of Artie is twofold: data collection and testing developmental robotics theories.

The vision is that Artie will be fully open source, 3D-printable, and as cheap as is feasible, while still being easy to use and extend.

Get Started

Before you can use Artie, you need to build him.

Please note: you cannot yet build an Artie! These instructions will certainly change, the bill of materials will change, the schematics will change, etc.

Building Artie

Building Artie is composed of the following steps:

  1. Get your parts
  2. Flash the parts
  3. Build your bot

See here for the full instructions.

Using Artie

The Artie ecosystem consists of the following:

  • Artie: The actual robot itself - the PCBs, 3D printed parts, the single board computers (SBCs), microcontroller units (MCUs), sensors, actuators, LCDs, etc.
    • Firmware: The MCU firmware.
    • Yocto Images: The custom embedded Linux images.
    • Libraries: Libraries shared between all the software components.
    • Drivers: Applications that run on the SBCs and interface with hardware.
  • Artie CLI: The simplest way to control a physical Artie. Used mostly for testing.
  • Artie Tool: A single tool to flash, test, build, release, etc.
  • (Planned) Artie Workbench: A web app that allows an authenticated user to control and configure Artie.
  • (Planned) Demo Stack: A demo that uses traditional robotics algorithms to control Artie.
  • (Planned) Reference Stack: A reference implementation of developmental robotics theories used to control Artie.
  • (Planned) Simulator: A simulated environment and simulated Artie for training and testing without a physical bot.

Artie is meant to be used to collect data in real-life social situations as well as to test theories of developmental robotics.

Artie Out of the Box

There are three planned ways to deploy an Artie:

  • Cloud
    • You own, operate, and maintain Artie's hardware
    • Artie compute required for administration is provided by the cloud
    • Any additional compute required for experiments or workload is provided by the cloud
    • See here for how to get started with this route
  • Fog
    • You own, operate, and maintain Artie's hardware
    • Artie compute required for administration is owned, operated, and maintained by you in the same network as Artie
    • Any additional compute required for experiments or workload is provided by the cloud
    • See here for how to get started with this route
  • Edge
    • You own, operate, and maintain Artie's hardware
    • Artie compute required for administration is owned, operated, and maintained by you in the same network as Artie
    • Any additional compute required for experiments or workload is provided by you locally in the same network as Artie
    • See here for how to get started with this route

Note that since Artie is so early in development, none of these are viable options yet! Additionally, Edge is the highest priority for development; I hope to get to the other options, but may not.

Architecture

Here are a few links to architectural discussions:

Overviews

Low-Level

  • CAN Protocols - We overlay several protocols on top of CAN. This document describes them in detail.
  • MsgPack Schema - We use MsgPack for some of the serialization/deserialization.
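To make the pairing concrete, here is a minimal sketch (not Artie's actual schema; the field names and arbitration ID are made up) of packing a reading with MsgPack and splitting it across classic 8-byte CAN frames:

```python
import msgpack
import can  # python-can

def to_can_frames(reading: dict, arbitration_id: int) -> list[can.Message]:
    payload = msgpack.packb(reading)  # compact binary encoding
    # Classic CAN frames carry at most 8 data bytes, so chunk the payload.
    return [
        can.Message(arbitration_id=arbitration_id,
                    data=payload[i:i + 8],
                    is_extended_id=False)
        for i in range(0, len(payload), 8)
    ]

# Hypothetical reading and ID, purely for illustration.
frames = to_can_frames({"sensor": "eyebrow", "pos": 0.25}, arbitration_id=0x123)
```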

TODO: We should probably have a 1:1 mapping of Artie bots to Elasticsearch nodes.

Motivation

Why Developmental Robotics?

Developmental robotics is the study of human development by means of simulating it. It is an important field of study for at least the following two reasons:

  1. Developmental robotics informs the study of human development. What better way to test a theory of how a human develops than to try to build a human?
  2. Developmental robotics informs the study of artificial intelligence. Although not typically the main focus of developmental robotics, AI can benefit from any advances in our understanding of human intelligence.

Here's a great excerpt from Wikipedia:

As in human children, learning is expected to be cumulative and of progressively increasing complexity, and to result from self-exploration of the world in combination with social interaction. The typical methodological approach consists in starting from theories of human and animal development elaborated in fields such as developmental psychology, neuroscience, developmental and evolutionary biology, and linguistics, then to formalize and implement them in robots.

Developmental robotics is such a cool field. I wish way more people were interested in it.

Embodiment

A central tenet of developmental robotics is that embodiment (agency) is necessary for intelligence. Human intelligence stems from a need to manipulate the world and ourselves in response to the world. It does not exist in a vacuum. Hence, it makes sense to embody an intelligent agent in a physical (or at least virtual) world.

Why a Robot?

If developmental robotics is the study of human development by means of simulating it, why build an actual robot? Why not just simulate one in a physics simulator?

The answer is simple: humans aren't simulated - they exist in a messy, physical world. Any theory of how a human develops must ultimately be put to the test in the same world with the same parameters.

Nonetheless, there is a place for virtual simulation in developmental robotics, and Artie will hopefully incorporate a simulator, since working with an actual robot is way less convenient than working in a video game.

One last thing about robots: embodiment is a two-way street. Though you can get by to an extent with datasets that are impersonal and aggregated (as in the typical supervised learning paradigm of machine learning), humans do not learn this way. Humans learn from interacting with their environment and by their environment interacting with them. Parents speak directly to their children using infant-directed speech. Teenagers navigate social environments that are unique to their particular group of friends. Young adults take elective classes at university that are interesting to them. An intelligent agent has an active, often causative relationship with the data from which it learns. Hence, you need to place the subject into an environment to truly study the development of intelligence.

Also, robots are awesome!

Why not Buy a Robot that Already Exists?

A few reasons:

  • They're so expensive. Holy crap are they expensive.
  • Affordable ones cannot feasibly be used to study human development.

Artie is built from the ground up explicitly for the purpose of developmental robotics. He's also open source, and as cheap as is feasible (though it turns out, that's still pretty expensive).

Artie's Issues

Neck Vertebrae

Needs to be 3 DOF while also housing:

  • Water cooling
  • CAN bus
  • Common GND
  • MCU Reset lines

OOBE Docs

Create documentation for bringing up an Artie, including:

  • Artie Admin Node
  • Artie Compute Nodes
  • Artie
  • Dev machine

Eyebrow FW

Need to test with v0.2 of PCB.

Need to test servo limits.

Build System

Artie command line for building everything:

  1. Pulls in a Python file from whichever item you are trying to build, then runs its 'build' command to build that artifact (see the sketch after this list).
  2. Can be used to build all.
  3. Can be used to build a single item.
  4. Can be used to build more than one item.
  5. Can be used to release.
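A rough sketch of what that flow could look like, assuming each buildable item ships a build.py exposing a build() function (both names are hypothetical):

```python
import importlib.util
from pathlib import Path

def build_item(item_dir: Path) -> None:
    # Load the item's own build.py and run its build command.
    spec = importlib.util.spec_from_file_location("item_build", item_dir / "build.py")
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    module.build()  # hypothetical per-item entry point

def build_all(root: Path) -> None:
    # Building everything is just building each item in turn.
    for build_file in root.rglob("build.py"):
        build_item(build_file.parent)
```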

Head Power PCB

PCB for the power rail. This PCB also happens to have the pump MCU and circuit.

Controller Node Yocto Image

  • CAN integration
  • Out Of Box Experience Integration
  • OTA upgrade mechanism for system layer
  • Release version (bunch of changes from dev version)

CAN Protocol

Thinking at least four protocols on top of CAN (sketched after the list):

  1. Real time messaging for safety critical RPC, such as "stop motor"
  2. Low priority, large block writes for FW updates
  3. Medium priority pub/sub messaging
  4. Medium priority RPC
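Since CAN arbitration gives lower IDs higher priority, one way to layer these is to carve the 11-bit ID space into priority bands. The ranges below are illustrative assumptions, not a settled allocation:

```python
# Lower arbitration ID wins the bus, so safety-critical RPC gets the
# lowest band and bulk firmware updates get the highest.
PRIORITY_BANDS = {
    "safety_rpc": range(0x000, 0x100),  # real-time, safety-critical RPC
    "rpc":        range(0x100, 0x300),  # medium priority RPC
    "pubsub":     range(0x300, 0x500),  # medium priority pub/sub
    "fw_update":  range(0x500, 0x7FF),  # low priority, large block writes
}

def arbitration_id(protocol: str, node_address: int) -> int:
    band = PRIORITY_BANDS[protocol]
    assert band.start + node_address < band.stop, "address out of band"
    return band.start + node_address
```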

CAN library

Make sure that it can be linked into MCU FW and shared between application and bootloader AND make sure it can be dynamically linked against ARM64-Linux applications.
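For the Linux side, a ctypes sketch shows what dynamic linking could look like; the library name (libartiecan.so) and the can_send symbol are hypothetical:

```python
import ctypes

# Load the shared CAN library at runtime on an ARM64-Linux host.
libcan = ctypes.CDLL("libartiecan.so")  # hypothetical library name
libcan.can_send.argtypes = (ctypes.c_uint32, ctypes.c_char_p, ctypes.c_size_t)
libcan.can_send.restype = ctypes.c_int

err = libcan.can_send(0x123, b"\x01\x02", 2)  # hypothetical C entry point
```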

Eyebrow Driver

Need to test with Controller Node v0.2 PCB.

Should integrate with Kafka or whatever pub/sub I end up using.

Reset Addresser MCU FW

Super simple: just take in an address over I2C and convert it to a GPIO equivalent (with strobe/clear) for resetting the appropriate MCU.
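The firmware itself will target an MCU, but the logic is simple enough to sketch; the address-to-pin table and the gpio helpers are hypothetical stand-ins:

```python
import time

# Hypothetical mapping: I2C address -> GPIO reset line.
RESET_PINS = {0x10: 2, 0x11: 3, 0x12: 4}

def on_i2c_address_received(address: int, gpio) -> None:
    pin = RESET_PINS[address]
    gpio.set_high(pin)   # strobe the appropriate MCU's reset line...
    time.sleep(0.001)
    gpio.set_low(pin)    # ...then clear it
```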

Telemetry Instrumentation

Use OpenTelemetry. Figure out the telemetry observability back-end later (Graphite? Jaeger probably?)
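A minimal OpenTelemetry sketch of what instrumenting a driver could look like (the tracer, span, and attribute names here are made up):

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Console exporter for now; swap in the Jaeger/Graphite back-end later.
trace.set_tracer_provider(TracerProvider())
trace.get_tracer_provider().add_span_processor(
    SimpleSpanProcessor(ConsoleSpanExporter())
)

tracer = trace.get_tracer("artie.pump")  # hypothetical instrumentation name
with tracer.start_as_current_span("read-flow-rate") as span:
    span.set_attribute("pump.flow_rate_lpm", 1.2)
```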

Pump Circuit

Need to:

  1. Provide 12V to the pump when the switch is in either position
  2. Provide voltage to the pump control IC when the switch is in either position
  3. Provide a button that can run the pump at full speed
  4. Investigate the voltage on the I2C bus for implications when the switch is in pump mode

When in normal operation, the pump controller MCU should have the following sensor information:

  1. Flow rate
  2. Coolant temperature
  3. Pump speed
  4. Pump overtemperature detection
  5. Pump failure detection

It should log this information onto the CAN bus.
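For illustration, one way to frame that telemetry for the CAN bus; the arbitration ID and field layout are assumptions, not a defined Artie protocol:

```python
import struct
import can  # python-can

def pump_status_frame(flow_lpm: float, coolant_c: float,
                      speed_rpm: int, fault_bits: int) -> can.Message:
    # Pack into 8 bytes: flow and temperature as fixed-point uint16,
    # speed as uint16, overtemp/failure flags as a uint8, one pad byte.
    data = struct.pack("<HHHBx",
                       int(flow_lpm * 100), int(coolant_c * 100),
                       speed_rpm, fault_bits)
    return can.Message(arbitration_id=0x310, data=data, is_extended_id=False)
```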

Water Cooling System

Need to:

  • Source parts: pump/reservoir, flow meter, tubing, cooling blocks, clamps, compression fittings, sensors, radiators.
  • Test parts with no electronics for several days.

Head Sensor FW

This should share a common FW image with all sensorimotor nodes. Probably an RTOS?

The build system should pass in a unique address (which should match each node's reset line) via a pound-define.
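A sketch of how the build tool might inject each node's address; the toolchain invocation and the NODE_ADDRESS flag name are assumptions:

```python
import subprocess

def build_sensor_fw(source: str, node_address: int) -> None:
    # Bake the node's unique address into the image via a pound-define.
    subprocess.run(
        ["arm-none-eabi-gcc", f"-DNODE_ADDRESS=0x{node_address:02X}",
         "-o", f"sensor_fw_{node_address:02X}.elf", source],
        check=True,
    )
```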

Sensorimotor PCB

The sensorimotor PCB is shared across as many of the servo control/sensor nodes as possible to cut down on complexity.

See here for the motors that are currently spec'd. Torque calculations are pending though, so the exact model is likely to change.

Mouth Driver

Need to test with v0.1 mouth PCB and v0.2 controller module PCB.

Pub/Sub Design

Determine which pub/sub library to use. Probably Kafka?

Also, the driver model: displays and actuators are managed via RPC, while sensors publish data via pub/sub.
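If Kafka wins, the sensor side could look something like this minimal kafka-python sketch (the broker address and topic name are hypothetical; MsgPack keeps the encoding consistent with the rest of the stack):

```python
import msgpack
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",   # hypothetical broker
    value_serializer=msgpack.packb,       # sensors publish binary readings
)
producer.send("artie.sensors.eyebrow", {"pos": 0.25, "ts": 1700000000})
producer.flush()
```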
