
Making an instrument from a scan of my own teeth.

My dentist recently mapped my mouth with a 3D scanner. I asked if I could have a copy of the file, promising to create a bespoke instrument from the scan.

The data arrived as an STL file. The scan had a good level of detail, but some problems needed to be addressed before I could turn it into a printable model. It has been many, many years since I last used Blender, so I decided to use this project as a re-introduction. I looked for ‘Blendercam’, the modified version I used to use for creating cutting files for my CNC mills, but there were no valid downloads that would run on my Mac. Vanilla Blender it is then, with Ultimaker Cura as the slicer.

The raw model was inside out – the inner surface of the mesh was facing outwards, preventing me from turning it into a printable object. I used Autodesk’s Meshmixer to clean up the edges of the mesh and ‘flip’ the faces so the outside was properly outside. I am aware that Blender has similar tools, but using Blender is very similar to using Avid’s Pro Tools – it is filled with a seemingly random mix of useful and esoteric functionality that is not navigable until you have spent a few weeks unravelling the interfaces and learning the hotkeys. It is often quicker to use another, simpler tool that is focused on the task at hand. Meshmixer can also turn a surface, like my scan, into a solid object. It does a good job, but I was unhappy with the loss of detail in the final model, so I imported the fixed mesh back into Blender and manually extruded the scan into an object.
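
For what it’s worth, the same clean-up can also be scripted with Blender’s Python API rather than done in Meshmixer. Here is a minimal sketch, assuming the STL has already been imported as an object – the object name and shell thickness are placeholders, not values from my model:

```python
# Minimal sketch: flipping inside-out normals and giving the open scan
# surface some thickness entirely inside Blender, instead of Meshmixer.
# Object name and thickness are placeholders, not values from the post.
import bpy

obj = bpy.data.objects["teeth_scan"]          # hypothetical object name
bpy.context.view_layer.objects.active = obj
obj.select_set(True)

# Recalculate normals so every face points outwards.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.normals_make_consistent(inside=False)
bpy.ops.object.mode_set(mode='OBJECT')

# Give the open surface thickness so it becomes a printable solid,
# roughly what Meshmixer's 'Make Solid' or a manual extrude achieves.
shell = obj.modifiers.new(name="shell", type='SOLIDIFY')
shell.thickness = 2.0                          # millimetres, placeholder
bpy.ops.object.modifier_apply(modifier="shell")
```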

Rough scan of teeth.
The original state of the scan mesh in the raw STL file.
Meshmixer: The pink faces are inside out – they show the ‘outside’ of the object.
Meshmixer’s ‘Make Solid’ command does a good job, but will take away some detail.

In Blender I used the circle select tool to separate the teeth into their own objects. I have a very old 3Dconnexion SpaceNavigator that is still supported, even in Monterey on an M1 Mini. Flying around the object with the left hand and controlling selection with the right makes Blender so much easier to use than a mouse/keypad/keyboard combination alone. After isolating the teeth I used ‘Fill’ to create new faces, closing the open mesh holes in the teeth and in the gums. Blender’s sculpting tools turned the filled faces into a dynamic mesh that I could push and pull into shapes I felt comfortable printing. I added rods to the teeth and subtracted them from the gum object – they will allow me to mount the teeth and run wires into them from underneath the gums.
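
If you prefer to script the subtraction, the rod-socket step looks roughly like this with a Boolean modifier in Blender’s Python API. This is a minimal sketch, assuming the gums and a single rod already exist as separate objects – the names are hypothetical:

```python
# Minimal sketch: cutting a rod-shaped socket out of the gum object with a
# Boolean difference modifier. Object names are hypothetical placeholders.
import bpy

gums = bpy.data.objects["gums"]
rod = bpy.data.objects["tooth_rod"]

bpy.context.view_layer.objects.active = gums
gums.select_set(True)

cut = gums.modifiers.new(name="rod_socket", type='BOOLEAN')
cut.operation = 'DIFFERENCE'
cut.object = rod
bpy.ops.object.modifier_apply(modifier="rod_socket")

rod.hide_set(True)   # keep the rod object as the matching post on the tooth
```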

Removing the teeth from the gums in Blender.
All of the teeth as separate objects
Using Blender’s sculpt tools to fill in holes in the gums after removal of the teeth.
Posts for mounting teeth and routing wires through the gums.
Tooth mounted on a post.
Test print after turning the mouth scan into an object.


DIY 26 Speaker Ambisonic Dome – Part 5

This small Ambisonic dome is, by design, a one-person experience. These works combine the immersion of Ambisonic audio with interactive augmented and virtual reality, creating very personal worlds that cannot be experienced through two-dimensional media. Each work gives the audient different levels of agency, discovery and immersion. The VR headset views and projections are all generated in real-time with MAX. The audio content is a combination of Ambisonic processing in MAX, Cherry Audio’s Voltage Modular synthesiser and Native Instruments Reaktor.
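
The Ambisonic processing itself lives in the MAX patches, but as a rough illustration of what first-order encoding involves, here is a minimal Python sketch of the standard B-format panning equations (ACN channel order, SN3D normalisation) for a mono source at a given azimuth and elevation. It is a generic sketch, not the processing used in the dome, and the names and values are illustrative:

```python
# Minimal illustration of first-order Ambisonic (B-format) encoding for a
# mono source, using ACN channel ordering with SN3D normalisation (AmbiX).
# Generic sketch only, not the MAX processing used in the dome.
import numpy as np

def encode_first_order(mono, azimuth, elevation):
    """Return the four B-format channels (W, Y, Z, X) for a mono signal.

    azimuth   -- anticlockwise angle from straight ahead, in radians
    elevation -- angle above the horizontal plane, in radians
    """
    w = mono                                        # omnidirectional component
    y = mono * np.sin(azimuth) * np.cos(elevation)  # left/right
    z = mono * np.sin(elevation)                    # up/down
    x = mono * np.cos(azimuth) * np.cos(elevation)  # front/back
    return np.stack([w, y, z, x])

# Example: one second of white noise panned 45 degrees to the left, slightly raised.
signal = np.random.randn(48000) * 0.1
bformat = encode_first_order(signal, azimuth=np.radians(45), elevation=np.radians(10))
```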

These pieces were made to be experienced, not watched. Watching an immersive experience from the outside is like eating the menu at a restaurant – but below is a collection of short clips showing the system in action. ‘Coil’ and ‘Living Room’ use the Vive controller as an exploratory tool.

‘Coil’ places you inside a gradually intensifying map of the electromagnetic radiation emitted from consumer devices in a kitchen & lounge room. Discovering the unseen topology of the fields we live with every day is surprisingly visceral.

‘Living Room’ scales Australia down to three metres inside the dome. Dynamic maps of bushfire, rainfall and temperature variation can be selected via a Bluetooth footswitch. Although you are inside the data projection, an FM modular synthesiser controlled by the position of the Vive handset is your only feedback, growing more discordant as time passes and the maps intensify. Areas without change are the only respite, but they grow fewer and smaller as change accelerates.

‘Workspace’ is a subset of the tools needed to build a complete VR mixing environment. The spherical audio emitters can be placed in three-dimensional space, adjusted for volume and given animation paths that they will repeat until reset. Evaluating the efficacy of my DIY Ambisonic dome when combined with immersive headset VR in this manner was the subject of my thesis.

‘Drown’ is simple, largely passive, and surprised me with the nasty intensity of the experience it conveys. This work is entirely dependent on the power of visual and audio VR immersion. Seated in the centre of the dome, over several meditative minutes your mind accepts the reality of the undulating wireframe ocean and drifting sound emitters. Then you realise that the level is rising. The moments when the water surface is just at head height and the waves rise higher than you can crane your neck to see are genuinely disturbing.

A slightly redacted version of the artist statement can be downloaded here.