
Creating the Production Prototype, and Adapt 0.2 Released!

January 20, 2016
Top of main PCB

The hardware team has the first production prototype boards off the printer and they are on their way. Once they show up, we'll start integrating them into our first production prototype, which will be very similar, if not identical, to the unit that our backers will be getting. No more benchtops covered in jumper wires!


Arron testing Mycroft hardware with jumper wires on the bench.

Adapt Update

Adapt will now run on Python 3. Igor Vuk, a member of the Mycroft community, took the time to make the Adapt engine work with Python 3. Thanks so much, Igor, for your contribution!

Other community members are making contributions and doing interesting things with Adapt as well! For instance, community member B00703D is building an IRC bot using Sopel, an open source Python-based IRC bot framework, together with the Adapt Intent Parser.
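Adapt's actual API is richer than this, but the core idea — register keyword entities, match them against an utterance, and emit an intent only when every required entity is present — can be sketched in plain Python. The names below (`IntentMatcher`, `WeatherIntent`) are illustrative, not Adapt's real classes:

```python
# Toy keyword-based intent parsing, in the spirit of Adapt.
# This is a sketch, not the Adapt API itself.

class IntentMatcher:
    def __init__(self, name, required):
        self.name = name
        self.required = required  # entity type -> set of keywords

    def match(self, utterance):
        """Return an intent dict if every required entity is found."""
        words = set(utterance.lower().split())
        entities = {}
        for entity_type, keywords in self.required.items():
            hit = words & keywords
            if not hit:
                return None  # a required entity is missing
            entities[entity_type] = next(iter(hit))
        return {"intent_type": self.name, **entities}

weather = IntentMatcher(
    "WeatherIntent",
    {"WeatherKeyword": {"weather", "forecast"}},
)

print(weather.match("what is the weather like in kansas city"))
# -> {'intent_type': 'WeatherIntent', 'WeatherKeyword': 'weather'}
print(weather.match("play some music"))  # -> None
```

In Adapt proper, the equivalent pieces are `IntentDeterminationEngine`, `register_entity`, and `IntentBuilder`, and matches carry a confidence score rather than a plain hit/miss.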

New Skills

In addition to several performance improvements, the dev team has built out some of the promised skills this week. We’ve implemented a reminder skill that lets you tell Mycroft to remind you of a task in the future. For instance, you could say “Hey Mycroft, remind me to take out the trash at 7 AM” and you can count on a reminder to take out the trash promptly at 7:00 AM.
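Under the hood, a reminder like that boils down to computing the delay until the requested time and scheduling a callback. Here is a minimal standard-library sketch — not Mycroft's actual skill code — where "speaking" is stood in for by a `print`:

```python
import datetime
import threading

def seconds_until(hour, minute=0, now=None):
    """Seconds from `now` until the next occurrence of hour:minute."""
    now = now or datetime.datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += datetime.timedelta(days=1)  # roll over to tomorrow
    return (target - now).total_seconds()

def schedule_reminder(text, hour, minute=0):
    """Fire the reminder (here: print it) at the next hour:minute."""
    timer = threading.Timer(
        seconds_until(hour, minute),
        lambda: print(f"Reminder: {text}"),
    )
    timer.daemon = True
    timer.start()
    return timer

# "Hey Mycroft, remind me to take out the trash at 7 AM"
schedule_reminder("take out the trash", 7)
```

A real skill would also persist reminders to disk so they survive a restart; this sketch only covers the scheduling step.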

We’ve also implemented a sleep skill that allows you to put Mycroft to “sleep” so that he ignores his keyword. This is useful if you are having a lot of conversations about Mycroft around the office, or saying the word Mycroft a lot during a presentation (so this skill is very relevant to the Mycroft team). You can wake Mycroft easily with “Wake up, Mycroft.”
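The sleep behavior amounts to a gate in front of the utterance handler: while asleep, everything is dropped unless it contains the wake-up phrase. A simplified sketch under that assumption (hypothetical names, not the real skill code):

```python
class KeywordGate:
    """Drops utterances while 'asleep', except the wake-up phrase."""

    WAKE_PHRASE = "wake up mycroft"

    def __init__(self):
        self.asleep = False

    def handle(self, utterance):
        text = utterance.lower().strip()
        if self.asleep:
            if self.WAKE_PHRASE in text:
                self.asleep = False
                return "I'm awake."
            return None  # ignore everything else while sleeping
        if "go to sleep" in text:
            self.asleep = True
            return "Going to sleep."
        return f"Handling: {utterance}"

gate = KeywordGate()
print(gate.handle("Hey Mycroft, go to sleep"))      # -> Going to sleep.
print(gate.handle("Hey Mycroft, what time is it"))  # -> None
print(gate.handle("Wake up Mycroft"))               # -> I'm awake.
```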

System Core Update

Back of the faceplate PCB, which connects the LED matrices, microphone, and IR sensor.


We’re now able to flash the Arduino sketch from the Raspberry Pi using the ICSP header on the Arduino board. This allows us to update both the Mycroft software and the faceplate software at the same time. We’ve also implemented a service that listens to the onboard message bus and triggers the various faceplate behaviors. This allows us to accurately time the animations and expressions on the screen to coincide with the AI’s behavior.
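That service is essentially a subscriber on the message bus that maps bus events to faceplate commands. A bare-bones in-process sketch — Mycroft's real bus runs over a websocket, and the event names and commands here are illustrative only:

```python
# Minimal in-process message bus; the real one is a websocket service,
# and these event names and faceplate commands are made up for the sketch.
class MessageBus:
    def __init__(self):
        self.handlers = {}

    def on(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)

    def emit(self, event, data=None):
        for handler in self.handlers.get(event, []):
            handler(data)

class FaceplateService:
    """Listens on the bus and triggers faceplate animations."""

    def __init__(self, bus):
        self.commands = []  # stand-in for writes to the faceplate's serial port
        bus.on("recognizer_loop:wakeword", self.on_listen)
        bus.on("speak", self.on_speak)

    def on_listen(self, _data):
        self.commands.append("eyes.blink")  # show that we heard the keyword

    def on_speak(self, _data):
        self.commands.append("mouth.talk")  # animate while the AI is speaking

bus = MessageBus()
faceplate = FaceplateService(bus)
bus.emit("recognizer_loop:wakeword")
bus.emit("speak", {"utterance": "Hello"})
print(faceplate.commands)  # -> ['eyes.blink', 'mouth.talk']
```

Because the faceplate service reacts to the same events the rest of the system emits, the animations stay in sync with listening and speaking without any extra coordination.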