
we've come full circle:
the JS implementation that's meant to run outside a browser, being run in the browser. How? Why, by implementing an operating system in the browser for the non-browser framework to use.

Will be phasing out all webapps in the next couple of weeks. I'm so happy to have finally found a way to make this all work.

Look, if we venerate ThinkPads as "repairable" because you can open them with a set of screwdrivers, unplug components, and replace them with new ones, well, you've just described almost all tech.

There is nothing inside a MacBook that I couldn't replace as fully as I could on a ThinkPad, with the recent exception of RAM.

Same goes for a PlayStation or a smartphone or a toaster oven.

For less than $20 you can buy a kit of screwdrivers that can open any security screw of any size.

The Mu shell's error handling is now much improved. Errors in programs you typed in were already showing up consistently in a trace without crashing the computer. However, _writes to the trace_ could cause it to crash in cryptic ways. No more.

Now I'm back to my long-term plan: a prototyping environment that nudges people to write tests, so that it's easier to throw away the prototype and rewrite it from scratch. Making codebases rewrite-friendly.

Main project page:


I was just reminded of the existence of an idea called double-buffering.

It's easy to add:

The current implementation is quite naive: it copies one byte at a time and makes several redundant copies per byte. In spite of all that, it makes a _huge_ difference in the video quality.
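
For context, the idea behind double-buffering is: draw each frame into an off-screen back buffer, then copy the finished frame to the visible buffer in one go, so the viewer never sees a half-drawn frame. A minimal sketch (the `Screen` class and `flip` name are illustrative, not Mu's actual API):

```python
WIDTH, HEIGHT = 4, 3

def new_buffer(fill=0):
    return [[fill] * WIDTH for _ in range(HEIGHT)]

class Screen:
    def __init__(self):
        self.front = new_buffer()   # what the display shows
        self.back = new_buffer()    # where all drawing happens

    def draw_pixel(self, x, y, color):
        self.back[y][x] = color     # never touch the front buffer directly

    def flip(self):
        # Copy the whole finished frame at once, rather than showing
        # each pixel write as it happens.
        for y in range(HEIGHT):
            self.front[y][:] = self.back[y]

s = Screen()
s.draw_pixel(1, 1, 7)
assert s.front[1][1] == 0   # nothing visible until the flip
s.flip()
assert s.front[1][1] == 7   # the whole frame appears together
```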


Intuition is simply a way to communicate a thought or idea with yourself, usually by way of mental modeling.

If something is intuitive to you, you have been primed genetically or environmentally to understand it, or someone has effectively communicated the intuition to you.

Developing an intuition is often seen as a sign of fuller or deeper understanding, which is not always the case.

Intuition is a model, a map: you understand the model, but the model is not the real world.

Using Mu to play with some ideas from Hest by Ivan Reese (video; 3.5 minutes)

Putting more animation and control of time into the debugging experience.

More info on Hest:

Main project page for Mu:


This is really neat: an online collection of programs that can pass the type checker but fail at runtime, in a bunch of languages (Java, Scala, OCaml, Haskell, Rust):

It also discusses the design tradeoffs that led to these behaviours.
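
An illustrative instance of the pattern, sketched in Python with type hints (my own example, not one taken from the collection): mypy accepts this function, yet calling it can fail at runtime.

```python
# A program that satisfies the type checker but fails at runtime.
def head(xs: list[int]) -> int:
    return xs[0]        # well-typed: list[int] -> int

# The type list[int] says nothing about emptiness, so this
# type-checks yet raises IndexError when run:
try:
    head([])
    crashed = False
except IndexError:
    crashed = True

assert crashed
```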

@neauoire Have you done anything with curves yet with ?

git clone
cd mu
qemu-system-i386 disk.img

Use the Tab key to cycle the cursor (a hard-to-see green square) between the 3 control points (circles), arrow keys to move the control point at the cursor. Watch the curve (red) adjust in response.
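
For reference, the math behind a 3-control-point curve is a quadratic Bezier. A minimal sketch using De Casteljau's algorithm (illustrative only, not Mu's actual code):

```python
def lerp(a, b, t):
    """Linear interpolation between 2D points a and b."""
    return (a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t)

def quadratic_bezier(p0, p1, p2, t):
    # Interpolate twice: moving any one of the three control points
    # reshapes the whole curve, which is what the arrow keys show.
    q0 = lerp(p0, p1, t)
    q1 = lerp(p1, p2, t)
    return lerp(q0, q1, t)

p0, p1, p2 = (0, 0), (50, 100), (100, 0)
assert quadratic_bezier(p0, p1, p2, 0.0) == (0.0, 0.0)     # starts at p0
assert quadratic_bezier(p0, p1, p2, 1.0) == (100.0, 0.0)   # ends at p2
assert quadratic_bezier(p0, p1, p2, 0.5) == (50.0, 50.0)   # pulled toward p1
```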

ASMA - An assembler written in Uxn itself, by @alderwick! It successfully assembles all the projects in the repo with 1:1 accuracy against the C assembler.

The Mu computer now dumps a snapshot of the call stack when it fails catastrophically.

This was not fun. And the debug information in the second half of the code disk is now larger than the code itself.

On the other hand, I hope debugging will now be more fun!


You could probably just rename university programs in "Graphic Design" to "Applied Adobe Patronage".


So, yeah, there's this whole book; not all of it is going to be applicable, of course, but it gives a starting point:

then there's

Wikipedia has a small bit on itself

season to taste

Zooming into the Mandelbrot set on the Mu computer

Apologies for the garish colors. Still a work in progress.

I recorded this on a Mac then sped it up 8x. I estimate that makes it 3x slower than my Linux computer accelerated with KVM.
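
For context, the core of any Mandelbrot renderer is the escape-time iteration: repeatedly square and add, and color each pixel by how fast the value escapes. A minimal sketch (not Mu's actual implementation):

```python
def escape_count(c: complex, max_iter: int = 50) -> int:
    """Iterate z -> z*z + c; return how many steps before |z| > 2."""
    z = 0j
    for i in range(max_iter):
        if abs(z) > 2.0:
            return i          # escaped: the point is outside the set
        z = z * z + c
    return max_iter           # never escaped: treated as inside the set

assert escape_count(0j) == 50        # the origin never escapes
assert escape_count(2 + 2j) < 5      # far-away points escape quickly
```

Zooming just means mapping the pixel grid to an ever-smaller rectangle of the complex plane before calling this per pixel.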


Main project page:

Don't call them signed ints and unsigned ints. Call them ints and addresses.

Most of us don't care about the increased range of integers yielded by that final extra bit. But addresses sometimes need to set it. And we never want to think of addresses as negative.
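
A quick illustration of the point, assuming a 32-bit machine (Python used just to show the arithmetic):

```python
bits = 0x8000_0000    # a 32-bit pattern with the top bit set

# Address interpretation: a large, perfectly valid address.
unsigned = bits

# "Signed int" interpretation: two's complement flips it negative.
signed = bits - 2**32 if bits >= 2**31 else bits

assert unsigned == 2_147_483_648
assert signed == -2_147_483_648   # thinking of an address this way misleads
```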

"von Neumann estimates 27 binary digits (he did not use the term "bit," which was coined by Claude Shannon in 1948) to suffice (yielding 8 decimal place accuracy) but rounds up to 30-bit numbers with a sign bit and a bit to distinguish numbers from orders. For multiplication and division, place the binary point after sign bit; all numbers are between −1 and +1."

So not quite the 32-bit little-endian two's-complement integers as we understand them. Still pretty cool.
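
To make the quoted scheme concrete, here's a hedged sketch of such a fixed-point format. I'm assuming 1 sign bit plus 29 fraction bits with the binary point right after the sign, which matches the "between −1 and +1" constraint; the exact bit allocation in the original design may differ.

```python
FRACTION_BITS = 29    # assumption: 30-bit word = 1 sign bit + 29 fraction bits

def to_fixed(x: float) -> int:
    """Encode a value in [-1, 1) as a scaled integer."""
    assert -1.0 <= x < 1.0
    return round(x * 2**FRACTION_BITS)

def from_fixed(n: int) -> float:
    return n / 2**FRACTION_BITS

assert from_fixed(to_fixed(0.5)) == 0.5
# Roughly 8 decimal digits of accuracy, as von Neumann estimated:
assert abs(from_fixed(to_fixed(0.123456789)) - 0.123456789) < 1e-8
```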


Merveilles is a community project aimed at the establishment of new ways of speaking, seeing and organizing information — A culture that seeks augmentation through the arts of engineering and design. A warm welcome to any like-minded people who feel these ideals resonate with them.